Multi-task learning for natural language processing in the 2020s: Where are we going?
Pattern Recognition Letters (IF 5.1). Pub Date: 2020-05-28. DOI: 10.1016/j.patrec.2020.05.031
Joseph Worsham, Jugal Kalita

Multi-task learning (MTL) significantly predates the deep learning era, and it has seen a resurgence in the past few years as researchers have applied MTL to deep learning solutions for natural language tasks. While steady MTL research has always been present, there is growing interest driven by the impressive successes published in the related fields of transfer learning and pre-training, such as BERT, and by the release of new challenge problems, such as GLUE and the NLP Decathlon (decaNLP). These efforts focus on how weights are shared across networks, evaluate the re-usability of network components, and identify use cases where MTL can significantly outperform single-task solutions. This paper strives to provide a comprehensive survey of the numerous recent MTL contributions to the field of natural language processing and to provide a forum for focusing efforts on the hardest unsolved problems in the next decade. While novel models that improve performance on NLP benchmarks are continually produced, lasting MTL challenges remain unsolved that could hold the key to better language understanding, knowledge discovery and natural language interfaces.
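The weight sharing the abstract refers to is most commonly realized as hard parameter sharing: one encoder whose weights receive gradients from every task, plus small task-specific output heads. The sketch below is not from the paper; the model name, layer sizes and task dimensions are illustrative assumptions chosen for a minimal runnable PyTorch example.

    # Minimal sketch of hard parameter sharing for MTL (illustrative only).
    import torch
    import torch.nn as nn

    class HardSharingMTL(nn.Module):
        def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256,
                     task_output_dims=(5, 2)):  # e.g. 5-class tagging, binary sentiment
            super().__init__()
            # Shared layers: updated by the losses of all tasks.
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            # Task-specific heads: each updated only by its own task's loss.
            self.heads = nn.ModuleList(nn.Linear(hidden_dim, d) for d in task_output_dims)

        def forward(self, token_ids, task_id):
            x = self.embed(token_ids)
            _, (h, _) = self.encoder(x)        # final hidden state as sentence representation
            return self.heads[task_id](h[-1])  # route through the requested task head

    model = HardSharingMTL()
    batch = torch.randint(0, 10000, (4, 12))   # 4 toy sentences of 12 token ids each
    logits_task0 = model(batch, task_id=0)     # shape (4, 5)
    logits_task1 = model(batch, task_id=1)     # shape (4, 2)

The usual alternative is soft parameter sharing, where each task keeps its own network and sharing is encouraged by regularizing corresponding weights toward each other.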



Updated: 2020-05-28