A Compressed Model-Agnostic Meta-Learning Model Based on Pruning for Disease Diagnosis
Journal of Circuits, Systems and Computers ( IF 0.9 ) Pub Date : 2022-08-20 , DOI: 10.1142/s0218126623500226
Xiangjun Hu 1 , Xiuxiu Ding 2 , Dongpeng Bai 1 , Qingchen Zhang 1
Meta-learning has been widely used in medical image analysis. However, training and using neural networks, especially model-agnostic meta-learning (MAML) models, requires substantial storage and computing resources, which makes such networks difficult to deploy on the embedded systems and low-power devices used in smart healthcare. To address this problem, we explore compressing a MAML model with pruning methods for disease diagnosis. First, for each task, we identify the connections that are unimportant or redundant for that task's classification. Next, we take the intersection of these per-task sets to obtain the connections that are unimportant for most tasks. Finally, we prune these common unimportant connections from the initial network. We evaluate the proposed model against MAML on the Omniglot and MiniImagenet datasets. The results show that our method removes 40% of the parameters of the raw models without incurring accuracy loss, demonstrating the potential of the proposed method for disease diagnosis.
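The three steps described in the abstract can be sketched as follows. The paper does not state its exact importance criterion, so this sketch assumes magnitude-based pruning (the smallest-magnitude weights are treated as unimportant); the `prune_ratio` and `vote_frac` parameters are illustrative, with `vote_frac` approximating "unimportant for most tasks" rather than a strict all-task intersection.

```python
import numpy as np

def unimportant_mask(weights, prune_ratio=0.4):
    """Step 1: mark the lowest-magnitude fraction of a task's weights
    as unimportant (True). Assumes magnitude-based importance."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * prune_ratio)
    if k == 0:
        return np.zeros_like(weights, dtype=bool)
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.abs(weights) <= threshold

def common_unimportant(per_task_weights, prune_ratio=0.4, vote_frac=0.5):
    """Step 2: intersect per-task masks, keeping connections deemed
    unimportant by at least `vote_frac` of the tasks ("most tasks")."""
    masks = [unimportant_mask(w, prune_ratio) for w in per_task_weights]
    votes = np.mean(masks, axis=0)  # fraction of tasks voting "unimportant"
    return votes >= vote_frac

def prune(init_weights, common_mask):
    """Step 3: zero out the common unimportant connections
    of the initial (meta-learned) network."""
    pruned = init_weights.copy()
    pruned[common_mask] = 0.0
    return pruned
```

In a real MAML network the per-task weights would be the task-adapted copies of each layer's weight tensor after the inner-loop updates; here plain arrays stand in for them.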




Updated: 2022-08-20