Efficient Group-n Encoding and Decoding for Facial Age Estimation
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 23.6). Pub Date: 2017-12-04. DOI: 10.1109/tpami.2017.2779808
Zichang Tan, Jun Wan, Zhen Lei, Ruicong Zhi, Guodong Guo, Stan Z. Li

Different ages are closely related, especially adjacent ones, because aging is a slow and highly non-stationary process with much randomness. To explore the relationship between the real age and its adjacent ages, an age group-n encoding (AGEn) method is proposed in this paper. In our model, adjacent ages are grouped together and each age corresponds to n groups. Ages assigned to the same group are regarded as a single class during training. On this basis, the original age estimation problem can be transformed into a series of binary classification sub-problems, and a deep Convolutional Neural Network (CNN) with multiple classifiers is designed to handle these sub-problems. A Local Age Decoding (LAD) strategy is then presented to accelerate the prediction process by locally decoding the estimated age value from the ordinal classifiers. In addition, to alleviate the imbalanced-data learning problem of each classifier, a penalty factor is inserted into the unified objective function to favor the minority class. Compared with state-of-the-art methods on the FG-NET, MORPH II, CACD and ChaLearn LAP 2015 databases, the proposed method achieves the best performance.
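The abstract above outlines the encoding and decoding pipeline only at a high level. The sketch below is a minimal toy illustration of the idea, assuming groups are sliding windows of n consecutive ages (so each age falls into n overlapping groups, each handled by one binary classifier) and a simple weighted-average rule for the local decoding step; the paper's exact group boundaries, network architecture, penalty factor, and LAD rule are not reproduced here.

```python
import numpy as np

# Toy sketch of group-n encoding and local decoding, NOT the paper's exact method.
# Assumptions: ages 0..100, one group per starting age, group g covers ages
# [g, g + N - 1], so every age belongs to (at most) N overlapping groups.

AGE_MIN, AGE_MAX = 0, 100   # hypothetical age range
N = 3                       # the "n" in AGEn, chosen here only for illustration

def encode(age):
    """Return the multi-hot target over all groups for a given age."""
    num_groups = AGE_MAX - AGE_MIN + 1
    target = np.zeros(num_groups, dtype=np.float32)
    lo = max(AGE_MIN, age - N + 1)   # first group containing this age
    hi = min(num_groups - 1, age)    # last group containing this age
    target[lo:hi + 1] = 1.0
    return target

def local_age_decode(group_probs):
    """Stand-in for Local Age Decoding (LAD): decode the age from the
    classifier outputs around the most confident group only, instead of
    scanning all groups. The paper's actual decoding rule may differ."""
    g_star = int(np.argmax(group_probs))  # most confident group
    window = range(max(0, g_star - N + 1),
                   min(len(group_probs), g_star + N))
    # Weighted average of the (approximate) center ages of the local groups.
    ages = np.array([g + N // 2 for g in window], dtype=np.float32)
    weights = np.array([group_probs[g] for g in window], dtype=np.float32)
    return float(np.dot(ages, weights) / (weights.sum() + 1e-8))

# Example: encode age 30, perturb the targets, and decode locally.
t = encode(30)
probs = t + 0.05 * np.random.rand(t.size)
print(local_age_decode(probs))
```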

Updated: 2018-10-03