Transfer fine-tuning of BERT with phrasal paraphrases
Computer Speech & Language (IF 3.1), Pub Date: 2020-10-20, DOI: 10.1016/j.csl.2020.101164
Yuki Arase, Junichi Tsujii

Sentence pair modelling is the task of identifying the semantic interaction between a pair of sentences, i.e., paraphrase identification, textual entailment recognition, and semantic similarity measurement. These constitute a set of crucial tasks for research in natural language understanding. Sentence representation learning is a fundamental technology for sentence pair modelling, and the development of the BERT model was a breakthrough in this area. We recently proposed transfer fine-tuning using phrasal paraphrases, which adapts BERT’s representations for assessing semantic equivalence between sentences while maintaining the model size. Herein, we show that transfer fine-tuning with simplified feature generation produces representations that are broadly effective across different types of sentence pair modelling tasks. Detailed analysis confirms that our transfer fine-tuning helps the BERT model converge more quickly and with a smaller corpus for fine-tuning.
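
To make the procedure concrete, below is a minimal sketch of the intermediate transfer fine-tuning stage, assuming HuggingFace transformers and PyTorch. The toy phrase pairs, the [CLS] pooling, and the binary classification head are illustrative assumptions standing in for the paper's simplified feature generation, not the authors' exact setup.

```python
# Sketch: fine-tune BERT on a phrasal paraphrase classification task
# before downstream fine-tuning. Assumes HuggingFace transformers/PyTorch;
# data and head are toy assumptions, not the authors' exact configuration.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")

# Binary head: does the phrase pair express the same meaning?
head = nn.Linear(encoder.config.hidden_size, 2)
optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(head.parameters()), lr=2e-5
)
loss_fn = nn.CrossEntropyLoss()

# Toy phrasal paraphrase pairs (label 1 = paraphrase, 0 = not).
pairs = [("shut down", "turn off", 1), ("pick up", "drop off", 0)]

encoder.train()
for phrase_a, phrase_b, label in pairs:
    batch = tokenizer(phrase_a, phrase_b, return_tensors="pt")
    # Simplified feature generation: use only the [CLS] vector of the pair.
    cls = encoder(**batch).last_hidden_state[:, 0, :]
    loss = loss_fn(head(cls), torch.tensor([label]))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The fine-tuned encoder keeps BERT's original size and is subsequently
# fine-tuned on the downstream sentence pair task as usual.
```

Because this intermediate stage only updates the existing BERT weights, the resulting encoder can be dropped into any standard sentence pair fine-tuning pipeline without architectural changes.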




Updated: 2020-10-30