Transformer models for text-based emotion detection: a review of BERT-based approaches
Artificial Intelligence Review (IF 10.7) | Pub Date: 2021-02-08 | DOI: 10.1007/s10462-021-09958-2
Francisca Adoma Acheampong, Henry Nunoo-Mensah, Wenyu Chen

We cannot overemphasize the importance of contextual information in most natural language processing (NLP) applications: extracting context yields significant improvements in many NLP tasks, including emotion recognition from text. This paper discusses transformer-based models for NLP tasks and highlights the pros and cons of each. The models covered include the Generative Pre-training (GPT) model and its variants, Transformer-XL, Cross-lingual Language Models (XLM), and Bidirectional Encoder Representations from Transformers (BERT). Given BERT's strength and popularity in text-based emotion detection, the paper then discusses recent works that propose various BERT-based models, presenting for each work its contributions, results, limitations, and the datasets used. We also outline future research directions to encourage further work on text-based emotion detection with these models.
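To make the shared recipe behind these BERT-based approaches concrete, below is a minimal sketch of emotion classification with a BERT encoder, assuming the HuggingFace transformers library; the bert-base-uncased checkpoint and the six-way Ekman label set are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: BERT with a sequence-classification head for
# text-based emotion detection. Checkpoint and label set are
# illustrative assumptions, not specified by the surveyed paper.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(EMOTIONS)
)

# Encode a batch of texts and take the argmax over the emotion logits.
# Note: the classification head is randomly initialized here, so the
# model must first be fine-tuned on a labeled emotion dataset before
# its predictions are meaningful.
inputs = tokenizer(
    ["I can't believe we actually won!"],
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
print(EMOTIONS[logits.argmax(dim=-1).item()])
```

The same pattern generalizes to the BERT variants the survey reviews: swap the checkpoint name and label set, then fine-tune on the target emotion dataset.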



Updated: 2021-02-09