Abstract:
To address the overfitting that arises when deep models are trained on too few samples, we propose a few-shot learning method that combines knowledge distillation and transfer learning. To improve the feature representation ability of shallow networks on small-sample images, we design a multi-generation distillation network structure. We also introduce a modified transfer learning structure that enhances the generalization ability of the network while adjusting only a few parameters. Multiple classifiers are then combined to fuse the networks obtained through distillation and transfer. Experiments on three standard few-shot datasets show that the proposed method effectively improves classification performance and yields more accurate few-shot predictions.
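To make the multi-generation distillation idea concrete, the following is a minimal sketch, not the paper's actual implementation: it assumes the standard Hinton-style distillation loss, and trains each generation's student against the softened outputs of the previous generation. The architecture, data, and hyperparameters (T, alpha, learning rate) are illustrative placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_student(num_classes=5):
    # Shallow-network stand-in; the paper's actual architecture may differ.
    return nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64),
                         nn.ReLU(), nn.Linear(64, num_classes))

def distill_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Standard KD objective: KL divergence between temperature-softened
    # distributions plus cross-entropy on the ground-truth labels.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

def train_generations(num_generations=3, steps=100):
    x = torch.randn(20, 3, 32, 32)           # toy few-shot batch (placeholder data)
    y = torch.randint(0, 5, (20,))
    teacher = None
    for gen in range(num_generations):
        student = make_student()
        opt = torch.optim.Adam(student.parameters(), lr=1e-3)
        for _ in range(steps):
            logits = student(x)
            if teacher is None:
                # Generation 0 has no teacher: plain cross-entropy.
                loss = F.cross_entropy(logits, y)
            else:
                with torch.no_grad():
                    t_logits = teacher(x)
                loss = distill_loss(logits, t_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
        teacher = student.eval()              # this generation teaches the next
    return teacher

final_model = train_generations()
```

In this sketch each generation is trained from scratch under the previous generation's supervision, which is one common way a multi-generation distillation chain is realized; how the paper combines the resulting networks with the transferred ones via multiple classifiers is described in the body of the paper.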