Attentive statement fraud detection: Distinguishing multimodal financial data with fine-grained attention
Decision Support Systems (IF 6.7) Pub Date: 2022-12-17, DOI: 10.1016/j.dss.2022.113913
Gang Wang, Jingling Ma, Gang Chen

Financial statement fraud committed by listed companies directly jeopardizes the reliability of the financial reporting process. Leveraging multimodal information for financial statement fraud detection (FSFD) has recently attracted great interest in both academic research and industrial applications. Unfortunately, the predictive ability of multimodal information in FSFD remains largely underexplored, particularly the fusion ambiguity embedded within and across modalities. In this study, we propose a novel attention-based multimodal deep learning method, named RCMA, for accurate FSFD. RCMA synthesizes a fine-grained attention mechanism comprising three innovative attention modules: ratio-aware attention, chapter-aware attention, and modality-aware attention. The first two modules help unlock the predictive power of the financial and textual modalities for FSFD, respectively, while the modality-aware attention module enables better coordination between the two modalities. Furthermore, to ensure effective learning on the attention-based multimodal embedding, we design a novel loss function, the Focal and Consistency Loss (FCL), which considers class imbalance and modality consistency simultaneously to specialize the optimization for FSFD. Experimental results on a real-world dataset show that RCMA outperforms state-of-the-art benchmarks on the FSFD task. Furthermore, an interpretation analysis visualizes the attention weights that RCMA assigns to different ratio groups, chapters, and modalities, and illustrates how these interpretations inform stakeholders' decision processes for FSFD.
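
The abstract does not give implementation details, but its two most concrete components, modality-aware attention and the Focal and Consistency Loss (FCL), can be illustrated with a rough sketch. The PyTorch snippet below assumes that FCL combines a standard focal term (to handle class imbalance) with a penalty on disagreement between the financial-modality and textual-modality predictions, and that modality-aware attention is a learned weighting over the two modality embeddings; the class names, hyperparameters, and exact formulas are hypothetical and may differ from the paper's formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ModalityAwareFusion(nn.Module):
    """Hypothetical modality-aware attention: scores each modality
    embedding and returns their attention-weighted sum."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, fin_emb: torch.Tensor, txt_emb: torch.Tensor) -> torch.Tensor:
        stacked = torch.stack([fin_emb, txt_emb], dim=1)   # (batch, 2, dim)
        weights = F.softmax(self.score(stacked), dim=1)    # (batch, 2, 1)
        return (weights * stacked).sum(dim=1)              # (batch, dim)


class FocalConsistencyLoss(nn.Module):
    """Hypothetical Focal-and-Consistency-style loss: a focal term on the
    fused prediction (addresses class imbalance) plus a term penalizing
    disagreement between the modality-specific fraud probabilities."""

    def __init__(self, gamma: float = 2.0, alpha: float = 0.25, lam: float = 0.5):
        super().__init__()
        self.gamma = gamma  # focusing parameter: down-weights easy samples
        self.alpha = alpha  # class-balance weight for the (rare) fraud class
        self.lam = lam      # weight of the modality-consistency term

    def forward(self, fused_logit, fin_logit, txt_logit, target):
        target = target.float()
        # Focal term on the fused multimodal prediction (binary case).
        p = torch.sigmoid(fused_logit)
        pt = p * target + (1 - p) * (1 - target)                    # prob. of the true class
        alpha_t = self.alpha * target + (1 - self.alpha) * (1 - target)
        focal = -(alpha_t * (1 - pt) ** self.gamma
                  * torch.log(pt.clamp_min(1e-8))).mean()

        # Consistency term: the two unimodal predictions should agree.
        consistency = F.mse_loss(torch.sigmoid(fin_logit), torch.sigmoid(txt_logit))

        return focal + self.lam * consistency
```

In such a setup, the fused embedding from ModalityAwareFusion would feed a classification head producing fused_logit, while auxiliary heads on each modality embedding would produce fin_logit and txt_logit for the consistency term.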



Updated: 2022-12-17