The problem of training a classifier from a handful of positive examples, without having to supply class-specific negatives, is of great practical importance. The proposed approach builds on the idea of training LDA classifiers using only class-specific foreground images and a large collection of unlabelled images, as described in [11]. While we adopt the LDA training methodology of [11], we depart from HOG features and instead use features extracted from a Convolutional Neural Network (CNN) pre-trained on ImageNet (OverFeat). Combining OverFeat features with this LDA training methodology yields generative classifiers that, when evaluated on a K-way classification problem, perform almost as well as classifiers trained discriminatively on the same features. Unlike the HOG-based approach of [11], our classifiers require no post-hoc calibration step, a step that would demand both positives and negatives. Finally, we show that in an instance retrieval setup, these generative classifiers support a novel query-expansion framework that achieves a significant performance boost using only the top-ranked positive examples from an initial nearest-neighbor list.
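Concretely, the construction reduces to a closed-form decision rule (a minimal sketch in our notation, under the standard LDA model of Gaussian class-conditionals with a shared covariance; the statistics $\mu_0$, $\Sigma$ estimated from the unlabelled collection and the class priors $\pi_c$ are assumptions of this sketch, not quantities stated above):

% \mu_c : mean feature vector of the few positives for class c
% \Sigma : feature covariance, estimated once from the unlabelled images
% \pi_c : class prior (may be taken uniform, dropping the \log\pi_c term)
\[
    w_c = \Sigma^{-1}\mu_c, \qquad
    b_c = -\tfrac{1}{2}\,\mu_c^{\top}\Sigma^{-1}\mu_c + \log \pi_c,
\]
\[
    \hat{c}(x) = \arg\max_{c \in \{1,\dots,K\}} \big( w_c^{\top}x + b_c \big).
\]

Because $\Sigma$ comes from unlabelled data and each $\mu_c$ from positives alone, no class-specific negatives enter the pipeline, and the resulting generative scores are directly comparable across classes, which is why no separate calibration step is needed.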