-
MAST-GCN: Multi-Scale Adaptive Spatial-Temporal Graph Convolutional Network for EEG-Based Depression Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-24 Haifeng Lu, Zhiyang You, Yi Guo, Xiping Hu
-
CiABL: Completeness-induced Adaptative Broad Learning for Cross-Subject Emotion Recognition with EEG and Eye Movement Signals IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-23 Xinrong Gong, C. L. Philip Chen, Bin Hu, Tong Zhang
-
Modeling Category Semantic and Sentiment Knowledge for Aspect-Level Sentiment Analysis IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-19 Yuan Wang, Peng Huo, Lingyan Tang, Ning Xiong, Mengting Hu, Qi Yu, Jucheng Yang
-
The EmoPain@Home Dataset: Capturing Pain Level and Activity Recognition for People with Chronic Pain in Their Homes IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-18 Temitayo Olugbade, Raffaele Andrea Buono, Kyrill Potapov, Alex Bujorianu, Amanda C de C Williams, Santiago de Ossorno Garcia, Nicolas Gold, Catherine Holloway, Nadia Bianchi-Berthouze
-
Hierarchical Shared Encoder with Task-specific Transformer Layer Selection for Emotion-Cause Pair Extraction IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-17 Xinxin Su, Zhen Huang, Yixin Su, Bayu Distiawan Trisedya, Yong Dou, Yunxiang Zhao
-
Evaluation of virtual agents’ hostility in video games IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-17 Remi Poivet, Alexandra de Lagarde, Catherine Pelachaud, Malika Auvray
-
CFN-ESA: A Cross-Modal Fusion Network With Emotion-Shift Awareness for Dialogue Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-16 Jiang Li, Xiaoping Wang, Yingjian Liu, Zhigang Zeng
-
FBSTCNet: A Spatio-Temporal Convolutional Network Integrating Power and Connectivity Features for EEG-Based Emotion Decoding IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-05 Weichen Huang, Wenlong Wang, Yuanqing Li, Wei Wu
-
Joint Training on Multiple Datasets With Inconsistent Labeling Criteria for Facial Expression Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-04-02 Chengyan Yu, Dong Zhang, Wei Zou, Ming Li
-
VAD: A Video Affective Dataset with Danmu IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-03-28 Shangfei Wang, Xin Li, Feiyi Zheng, Jicai Pan, Xuewei Li, Yanan Chang, Zhou'an Zhu, Qiong Li, Jiahe Wang, Yufei Xiao
-
Fusion and Discrimination: A Multimodal Graph Contrastive Learning Framework for Multimodal Sarcasm Detection IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-03-21 Bin Liang, Lin Gui, Yulan He, Erik Cambria, Ruifeng Xu
-
Emotion-Aware Multimodal Fusion for Meme Emotion Detection IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-03-20 Shivam Sharma, Ramaneswaran S, Md. Shad Akhtar, Tanmoy Chakraborty
-
Contrastive Learning based Modality-Invariant Feature Acquisition for Robust Multimodal Emotion Recognition with Missing Modalities IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-03-18 Rui Liu, Haolin Zuo, Zheng Lian, Bjorn W. Schuller, Haizhou Li
-
A Multi-Stage Visual Perception Approach for Image Emotion Analysis IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-03-08 Jicai Pan, Jinqiao Lu, Shangfei Wang
-
Can Large Language Models Assess Personality from Asynchronous Video Interviews? A Comprehensive Evaluation of Validity, Reliability, Fairness, and Rating Patterns IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-03-08 Tianyi Zhang, Antonis Koutsoumpis, Janneke K. Oostrom, Djurre Holtrop, Sina Ghassemi, Reinout E. de Vries
-
Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-03-01 Luz Martinez-Lucas, Wei-Cheng Lin, Carlos Busso
-
GDDN: Graph Domain Disentanglement Network for Generalizable EEG Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-29 Bianna Chen, C. L. Philip Chen, Tong Zhang
-
Guest Editorial: Ethics in Affective Computing IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-29 Jonathan Gratch, Gretchen Greene, Rosalind Picard, Lachlan Urquhart, Michel Valstar
Stunning advances in machine learning are heralding a new era in sensing, interpreting, simulating and stimulating human emotion. In the human sciences, research is increasingly highlighting the explanatory power of emotions, feelings, and other affective processes to predict how we think and behave. This is beginning to translate into an explosion of applications that can improve human wellbeing including
-
Dep-FER: Facial Expression Recognition in Depressed Patients Based on Voluntary Facial Expression Mimicry IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-27 Jiayu Ye, Yanhong Yu, Yunshao Zheng, Yang Liu, Qingxiang Wang
-
Vesper: A Compact and Effective Pretrained Model for Speech Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-26 Weidong Chen, Xiaofen Xing, Peihao Chen, Xiangmin Xu
-
An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-22 Ritik Vatsal, Shrivatsa Mishra, Rushil Thareja, Mrinmoy Chakrabarty, Ojaswa Sharma, Jainendra Shukla
-
Continuous Emotion Ambiguity Prediction: Modeling with Beta Distributions IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-20 Deboshree Bose, Vidhyasaharan Sethu, Eliathamby Ambikairajah
-
Facial Action Unit Detection and Intensity Estimation from Self-supervised Representation IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-19 Bowen Ma, Rudong An, Wei Zhang, Yu Ding, Zeng Zhao, Rongsheng Zhang, Tangjie Lv, Changjie Fan, Zhipeng Hu
-
Cross-Task Inconsistency Based Active Learning (CTIAL) for Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-16 Yifan Xu, Xue Jiang, Dongrui Wu
-
Bodily Sensation Map vs. Bodily Motion Map: Visualizing and Analyzing Emotional Body Motions IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-14 Myeongul Jung, Youngwug Cho, Jejoong Kim, Hyungsook Kim, Kwanguk Kim
-
Looking into Gait for Perceiving Emotions via Bilateral Posture and Movement Graph Convolutional Networks IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-13 Yingjie Zhai, Guoli Jia, Yu-Kun Lai, Jing Zhang, Jufeng Yang, Dacheng Tao
-
Multi-Modal Hierarchical Empathetic Framework for Social Robots With Affective Body Control IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-12 Yue Gao, Yangqing Fu, Ming Sun, Feng Gao
-
Avatar-Based Feedback in Job Interview Training Impacts Action Identities and Anxiety IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-08 Sarinasadat Hosseini, Jingyu Quan, Xiaoqi Deng, Yoshihiro Miyake, Takayuki Nozawa
-
Novel VR-Based Biofeedback Systems: A Comparison Between Heart Rate Variability- and Electrodermal Activity-Driven Approaches IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-08 Andrea Baldini, Elisabetta Patron, Claudio Gentili, Enzo Pasquale Scilingo, Alberto Greco
-
An Open-source Benchmark of Deep Learning Models for Audio-visual Apparent and Self-reported Personality Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-08 Rongfan Liao, Siyang Song, Hatice Gunes
-
Emotion Recognition in Conversation Based on a Dynamic Complementary Graph Convolutional Network IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-02-01 Zhenyu Yang, Xiaoyang Li, Yuhu Cheng, Tong Zhang, Xuesong Wang
-
Learning With Rater-Expanded Label Space to Improve Speech Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-31 Shreya G. Upadhyay, Woan-Shiuan Chien, Bo-Hao Su, Chi-Chun Lee
-
A Multi-Level Alignment and Cross-Modal Unified Semantic Graph Refinement Network for Conversational Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-31 Xiaoheng Zhang, Weigang Cui, Bin Hu, Yang Li
-
Modeling the Interplay Between Cohesion Dimensions: a Challenge for Group Affective Emergent States IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-30 Lucien Maman, Nale Lehmann-Willenbrock, Mohamed Chetouani, Laurence Likforman-Sulem, Giovanna Varni
-
Exploring Retrospective Annotation in Long-videos for Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-29 Patricia Bota, Pablo Cesar, Ana Fred, Hugo Placido da Silva
-
CFDA-CSF: A Multi-modal Domain Adaptation Method for Cross-subject Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-23 Magdiel Jiménez-Guarneros, Gibran Fuentes-Pineda
-
Show me How You Use Your Mouse and I Tell You How You Feel? Sensing Affect with the Computer Mouse IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-23 Paul Freihaut, Anja S. Göritz
-
How Virtual Reality Therapy Affects Refugees from Ukraine - acute stress reduction pilot study IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-10 Dorota Kamińska, Grzegorz Zwoliński, Dorota Merecz-Kot
-
Anthropomorphism and Affective Perception: Dimensions, Measurements, and Interdependencies in Aerial Robotics IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-04 Viviane Herdel, Anastasia Kuzminykh, Yisrael Parmet, Jessica R. Cauchard
-
Gusa: Graph-based Unsupervised Subdomain Adaptation for Cross-Subject EEG Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-04 Xiaojun Li, C. L. Philip Chen, Bianna Chen, Tong Zhang
-
MASANet: Multi-Aspect Semantic Auxiliary Network for Visual Sentiment Analysis IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2024-01-04 Jinglun Cen, Chunmei Qing, Haochun Ou, Xiangmin Xu, Junpeng Tan
-
Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-11-28 Ioannis T. Pavlidis, Theodora Chaspari, Daniel McDuff
In the formative years of Affective Computing [1], from the late 1990s and into the early 2000s, a significant fraction of research attention was focused on the development of methods for unobtrusive physiological measurement. It quickly became obvious that wiring people with electrodes and strapping cumbersome hardware to their bodies was not only restricting the types of experiments that could be
-
Automatic Deceit Detection Through Multimodal Analysis of High-Stake Court-Trials IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-10-05 Berat Biçer, Hamdi Dibeklioğlu
In this article we propose the use of convolutional self-attention for attention-based representation learning, while replacing traditional vectorization methods with a transformer as the backbone of our speech model for transfer learning within our automatic deceit detection framework. This design performs a multimodal data analysis and applies fusion to merge visual, vocal, and speech (textual) channels;
-
Guest Editorial Neurosymbolic AI for Sentiment Analysis IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-09-18 Frank Xing, Björn Schuller, Iti Chaturvedi, Erik Cambria, Amir Hussain
Neural network-based methods, especially deep learning, have been a burgeoning area in AI research and have been successful in tackling the expanding data volume as we move into a digital age. Today, the neural network-based methods are not only used for low-level cognitive tasks, such as recognizing objects and spotting keywords, but they have also been deployed in various industrial information systems
-
Spatio-Temporal Graph Analytics on Secondary Affect Data for Improving Trustworthy Emotional AI IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-07-20 Md Taufeeq Uddin, Lijun Yin, Shaun Canavan
Ethical affective computing (AC) requires maximizing the benefits to users while minimizing its harm to obtain trust from users. This requires responsible development and deployment to ensure fairness, bias mitigation, privacy preservation, and accountability. To obtain this, we require methodologies that can quantify, visualize, analyze, and mine insights from affect data. Hence, in this article,
-
WiFE: WiFi and Vision Based Unobtrusive Emotion Recognition via Gesture and Facial Expression IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-06-13 Yu Gu, Xiang Zhang, Huan Yan, Jingyang Huang, Zhi Liu, Mianxiong Dong, Fuji Ren
Emotion plays a critical role in making the computer more human-like. As the first and most essential step, emotion recognition emerges recently as a hot but relatively nascent topic, i.e., current research mainly focuses on single modality (e.g., facial expression) while human emotion expressions are multi-modal in nature. To this end, we propose an unobtrusive emotion recognition system leveraging
-
Two Birds With One Stone: Knowledge-Embedded Temporal Convolutional Transformer for Depression Detection and Emotion Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-06-05 Wenbo Zheng, Lan Yan, Fei-Yue Wang
Depression is a critical problem in modern society that affects an estimated 350 million people worldwide, causing feelings of sadness and a lack of interest and pleasure. Emotional disorders are gaining interest and are closely entwined with depression, because one contributes to an understanding of the other. Despite the achievements in the two separate tasks of emotion recognition and depression
-
An (E)Affective Bind: Situated Affectivity and the Prospect of Affect Recognition IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-29 Jason Branford
Several prominent criticisms have recently challenged the possibility of algorithmically determining or recognising human affect. This paper ethically evaluates one underexplored avenue for overcoming such deficiencies in categorical affect recognition technologies (ARTs). Specifically, the emerging literature on “situated affectivity” offers valuable guidance on three fronts. First, it conceptually
-
Opacity, Transparency, and the Ethics of Affective Computing IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-23 Manohar Kumar, Aisha Aijaz, Omkar Chattar, Jainendra Shukla, Raghava Mutharaju
Human opacity is the intrinsic quality of unknowability of human beings with respect to machines. The descriptive relationship between humans and machines, which captures how much information one can gather about the other, can be explicated using an opacity-transparency relationship. This relationship allows us to describe and normatively evaluate a spectrum of opacity where humans and machines may
-
A Quantum Probability Driven Framework for Joint Multi-Modal Sarcasm, Sentiment and Emotion Analysis IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-23 Yaochen Liu, Yazhou Zhang, Dawei Song
Sarcasm, sentiment, and emotion are three typical kinds of spontaneous affective responses of humans to external events and they are tightly intertwined with each other. Such events may be expressed in multiple modalities (e.g., linguistic, visual and acoustic), e.g., multi-modal conversations. Joint analysis of humans’ multi-modal sarcasm, sentiment, and emotion is an important yet challenging topic
-
The Ethics of AI in Games IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-16 David Melhart, Julian Togelius, Benedikte Mikkelsen, Christoffer Holmgård, Georgios N. Yannakakis
Video games are one of the richest and most popular forms of human-computer interaction and, hence, their role is critical for our understanding of human behaviour and affect at a large scale. As artificial intelligence (AI) tools are gradually adopted by the game industry a series of ethical concerns arise. Such concerns, however, have so far not been extensively discussed in a video game context
-
Facial Expression Recognition in Classrooms: Ethical Considerations and Proposed Guidelines for Affect Detection in Educational Settings IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-15 Allison Macey Banzon, Jonathan Beever, Michelle Taub
Recent technological and educational shifts have made it possible to capture students’ facial expressions during learning with the goal of detecting learners’ emotional states. Those interested in affect detection argue these tools will support automated emotions-based learning interventions, providing educational professionals with the opportunity to develop individualized, emotionally responsive
-
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-10 Licai Sun, Zheng Lian, Bin Liu, Jianhua Tao
With the proliferation of user-generated online videos, Multimodal Sentiment Analysis (MSA) has attracted increasing attention recently. Despite significant progress, there are still two major challenges on the way towards robust MSA: 1) inefficiency when modeling cross-modal interactions in unaligned multimodal data; and 2) vulnerability to random modality feature missing which typically occurs in
-
Crowdsourcing Affective Annotations Via fNIRS-BCI IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-08 Tuukka Ruotsalo, Kalle Mäkelä, Michiel Spapé
Affective annotation refers to the process of labeling media content based on the emotions they evoke. Since such experiences are inherently subjective and depend on individual differences, the central challenge is associating digital content with its affective, interindividual experience. Here, we present a first-of-its-kind methodology for affective annotation directly from brain signals by monitoring
-
A Review of Tools and Methods for Detection, Analysis, and Prediction of Allostatic Load Due to Workplace Stress IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-04 Karl Magtibay, Karthikeyan Umapathy
Chronic stress risks an individual's overall well-being. Chronic stress is associated with allostatic load, the body's wear-and-tear due to prolonged heightened physiological and psychological states. Increased allostatic load among workers increases their risk of injuries and the likelihood of diseases and illnesses. An allostatic load model could explain the basis of a stress response. Stress research
-
WavDepressionNet: Automatic Depression Level Prediction via Raw Speech Signals IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-03 Mingyue Niu, Jianhua Tao, Yongwei Li, Yong Qin, Ya Li
Physiological reports have confirmed that there are differences in speech signals between depressed and healthy individuals. Therefore, as an application in the field of affective computing, automatic depression level prediction through speech signals has received the attention of researchers, which often estimate the depression severity of individuals by the Fourier or Mel spectrograms of speech signals
-
Empirical Validation of an Agent-Based Model of Emotion Contagion IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-05-01 Erik Stefan van Haeringen, Emmeke Anna Veltmeijer, Charlotte Gerritsen
In recent years, many agent-based models of human groups have implemented a mechanism of emotion contagion, yet empirical validation is lagging behind. The aim of the present paper is to validate an agent-based model of emotion contagion at the level of group emotion, by comparing simulations against the emotional development of real people in small groups. To study the effect of emotion contagion
-
The Role of Preprocessing for Word Representation Learning in Affective Tasks IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-04-25 Nastaran Babanejad, Heidar Davoudi, Ameeta Agrawal, Aijun An, Manos Papagelis
Affective tasks, including sentiment analysis, emotion classification, and sarcasm detection have drawn a lot of attention in recent years due to a broad range of useful applications in various domains. The main goal of affect detection tasks is to recognize states such as mood, sentiment, and emotions from textual data (e.g., news articles or product reviews). Despite the importance of utilizing preprocessing
-
Pose-Aware Facial Expression Recognition Assisted by Expression Descriptions IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-04-17 Shangfei Wang, Yi Wu, Yanan Chang, Guoming Li, Meng Mao
Although expression descriptions provide additional information about facial behaviors despite of different poses, and pose features are beneficial to adapt to pose variety, neither has been fully leveraged in facial expression recognition. This paper proposes a pose-aware text-assisted facial expression recognition method using cross-modality attention. Specifically, the method contains three components
-
Cross-Day Data Diversity Improves Inter-Individual Emotion Commonality of Spatio-Spectral EEG Signatures Using Independent Component Analysis IEEE Trans. Affect. Comput. (IF 11.2) Pub Date : 2023-04-13 Yi-Wei Shen, Yuan-Pin Lin
Electroencephalogram (EEG) variability poses a great challenge to the affective brain-computer interface (aBCI) for practical applications. Most aBCI frameworks have been demonstrated successfully but deliberated on single-day data, which can be realistically susceptible to psychophysiological changes and further hinder the exploration of inter-individual EEG commonality. This study proposes a multiple-day