Few-Shot Learning Based on Knowledge Distillation and Transfer Learning

Abstract: To address the problem that very few training samples easily cause a deep model to overfit the training data, a few-shot learning method that combines knowledge distillation and transfer learning is proposed. A multi-generation distillation network structure is designed to improve the feature representation ability of a shallow network on small-sample images; an improved transfer learning structure further strengthens the generalization ability of the network by adjusting only a small number of parameters; and multiple classifiers are combined to fuse the networks obtained through distillation and transfer. Experimental results on three standard few-shot datasets show that the proposed method effectively improves classification performance on new tasks and makes few-shot predictions more accurate.
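The page gives only this high-level description, so the following is a minimal PyTorch sketch of the three components the abstract names: a multi-generation distillation loop, a transfer step that adjusts only a few parameters, and a simple multi-classifier fusion rule. All concrete names and values here (`make_student`, `loader`, the `fc` head, the temperature and loss weighting) are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target knowledge distillation loss: KL divergence between
    # temperature-softened teacher and student distributions, blended
    # with the hard-label cross-entropy. T and alpha are assumed values.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_one_generation(student, teacher, loader, epochs=10, lr=1e-3):
    # One distillation generation: a frozen teacher supervises the student
    # through its softened logits.
    teacher.eval()
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                t_logits = teacher(x)
            loss = distillation_loss(student(x), t_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

def multi_generation_distill(make_student, teacher, loader, generations=3):
    # Multi-generation distillation: each trained student becomes the
    # teacher of the next generation.
    for _ in range(generations):
        teacher = train_one_generation(make_student(), teacher, loader)
    return teacher

def freeze_backbone(model, head_name="fc"):
    # Transfer step sketched in the abstract: adjust only a small number
    # of parameters by freezing everything except the classifier head
    # (the head's name is an assumption about the architecture).
    for name, p in model.named_parameters():
        p.requires_grad = name.startswith(head_name)
    return model

@torch.no_grad()
def fused_predict(models, x):
    # Fuse the distilled and transferred networks by averaging their
    # softmax outputs, one simple multi-classifier fusion rule.
    probs = torch.stack([F.softmax(m(x), dim=1) for m in models])
    return probs.mean(dim=0).argmax(dim=1)
```

Averaging softmax outputs is only one plausible fusion rule; the paper's actual combination of classifiers is not specified on this page.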

