An accurate prediction method of multiple deterioration forms of tool based on multitask learning with low rank tensor constraint
Journal of Manufacturing Systems ( IF 12.1 ) Pub Date : 2021-01-01 , DOI: 10.1016/j.jmsy.2020.11.018
Changqing Liu , Jincheng Ni , Peng Wan

Abstract Tool deterioration is a common issue in Numerical Control (NC) machining that directly affects part quality, production efficiency, and manufacturing cost. Because of the complexity of machining, multiple tool deterioration forms are involved during the deterioration process, which poses a significant challenge for tool condition prediction owing to the coupling effects among the different forms. To address this issue, an accurate prediction method for multiple tool deterioration forms based on multitask learning with a low-rank tensor constraint is proposed in this paper. A base model for predicting each deterioration form is first constructed using a neural network, and a 3-order tensor is then built by stacking the weight matrices of the hidden layers of the base networks. The hidden relationship among the multiple related tasks is learned mathematically by constraining this 3-order tensor to be low rank, which is realized by jointly training the base prediction models. Experimental results show that, compared with the corresponding single-task models, the proposed method reduces the prediction error by about 13% for wear and 25% for chipping.
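The abstract does not give the paper's exact formulation, but the core idea, stacking per-task hidden-layer weight matrices into a 3-order tensor and penalizing its rank, can be sketched. Below is a minimal illustration using a common convex surrogate for tensor rank (the sum of nuclear norms of the mode-n unfoldings); the task names, weight shapes, and the choice of surrogate are assumptions, not the authors' implementation:

```python
import numpy as np

def mode_unfold(T, mode):
    """Unfold a 3-order tensor along the given mode into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def low_rank_penalty(weight_mats):
    """Surrogate low-rank penalty on the stacked weight tensor:
    sum of nuclear norms of its three mode-n unfoldings."""
    T = np.stack(weight_mats, axis=0)  # shape: (tasks, fan_in, fan_out)
    return sum(np.linalg.norm(mode_unfold(T, m), ord='nuc') for m in range(3))

# Hypothetical hidden-layer weights of two per-task base networks
# (e.g. one predicting wear, one predicting chipping).
rng = np.random.default_rng(0)
W_wear = rng.standard_normal((8, 4))
W_chip = W_wear + 0.01 * rng.standard_normal((8, 4))  # nearly shared structure
penalty = low_rank_penalty([W_wear, W_chip])
```

In joint training, a term like `lambda * penalty` would be added to the combined task losses, encouraging the per-task weight matrices to share a low-rank common structure, which is how the coupling among deterioration forms is captured.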

Updated: 2021-01-01