Emotion Analysis on Text Using Multiple Kernel Gaussian Process
Neural Processing Letters (IF 2.6), Pub Date: 2021-02-20, DOI: 10.1007/s11063-021-10436-7
S. Angel Deborah, T. T. Mirnalinee, S. Milton Rajendram

The ability to discern human emotions is critical for making chatbots behave like humans. A Gaussian Process (GP) is a non-parametric Bayesian model that can be used to predict the presence of either a single emotion (single-task GP) or multiple emotions (multi-task GP) in natural language text. Employing multiple kernels in a GP can enhance the performance of emotion analysis tasks. The particular choice of kernel function determines properties of the functions drawn from the GP prior, such as smoothness, length scale, sharpness, and amplitude. Relying on a single kernel can therefore introduce bias, which can be mitigated by combining different kernels. The default kernel used with a GP is the Radial Basis Function (RBF). It is infinitely differentiable, so a GP with this kernel has mean-square derivatives of all orders and is thus very smooth. Sharp variations that occur amid this smoothness can be captured by the exponential kernel. The multi-layer perceptron kernel generalizes better from each training example and is well suited for extrapolation. Our experiments show that, for learning the presence of a single emotion in a natural language sentence (single-task), a multiple-kernel GP with the sum of the RBF and multi-layer perceptron kernels performs better than a single-kernel GP. Likewise, for learning the presence of several different emotions in a sentence (multi-task), a multiple-kernel GP with the sum of the RBF, exponential, and multi-layer perceptron kernels performs better than a single-kernel GP. The multiple-kernel Gaussian Process also outperforms a Convolutional Neural Network (CNN).
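The abstract does not include code, but the kernel-combination idea can be illustrated with a minimal sketch using the GPy library. The placeholder feature matrix X, target vector Y, feature dimension, and use of plain GP regression are illustrative assumptions, not the authors' exact pipeline; the sketch only shows how an RBF and a multi-layer perceptron kernel can be summed into a single GP covariance.

```python
# Minimal sketch (assumptions, not the authors' setup): single-task GP
# regression with a sum of RBF and MLP kernels, using GPy.
import numpy as np
import GPy

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))   # placeholder sentence features: 100 sentences, 50-dim vectors
Y = rng.uniform(size=(100, 1))   # placeholder emotion intensity scores in [0, 1]

# Sum of RBF and multi-layer perceptron kernels (single-task setting in the abstract)
kernel = GPy.kern.RBF(input_dim=50) + GPy.kern.MLP(input_dim=50)

# GP regression with the combined kernel; optimize the marginal likelihood
model = GPy.models.GPRegression(X, Y, kernel)
model.optimize(messages=False)

# Predictive mean and variance for new sentences
X_new = rng.normal(size=(5, 50))
mean, var = model.predict(X_new)
```

For the multi-task (multi-emotion) setting described in the abstract, an exponential kernel (e.g. GPy.kern.Exponential) would be added to the sum, and a multi-output GP model would replace the plain regression; the details of that setup are not given in the abstract.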




Updated: 2021-02-21