Optimized Dual Fire Attention Network and Medium-Scale Fire Classification Benchmark
IEEE Transactions on Image Processing (IF 10.8) · Pub Date: 2022-09-21 · DOI: 10.1109/tip.2022.3207006
Hikmat Yar, Tanveer Hussain, Mohit Agarwal, Zulfiqar Ahmad Khan, Suneet Kumar Gupta, Sung Wook Baik
Vision-based fire detection systems have been significantly improved by deep models; however, high false-alarm rates and slow inference speeds still hinder their practical applicability in real-world scenarios. For a balanced trade-off between computational cost and accuracy, we introduce the dual fire attention network (DFAN) for effective yet efficient fire detection. The first attention mechanism highlights the most important channels in the features of an existing backbone model, yielding significantly emphasized feature maps. A modified spatial attention mechanism then captures spatial details and enhances the discrimination between fire and non-fire objects. We further optimize the DFAN for real-world applications by discarding a significant number of extra parameters using a meta-heuristic approach, which yields around 50% higher FPS. Finally, we contribute a medium-scale, challenging fire classification dataset featuring extremely diverse, highly similar fire/non-fire images and imbalanced classes, among other complexities. The proposed dataset advances traditional fire detection datasets by considering multiple classes to answer the question: what is on fire? We perform experiments on four widely used fire detection datasets, and the DFAN provides the best results compared with 21 state-of-the-art methods. Consequently, our research provides a baseline for fire detection on edge devices with higher accuracy and better FPS, and the proposed dataset extension provides indoor fire classes and a greater number of outdoor fire classes; these contributions can support significant future research. Our code and dataset will be publicly available at https://github.com/tanveer-hussain/DFAN.
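The abstract does not specify the DFAN's exact layers, so the following is only a generic, minimal NumPy sketch of the channel-then-spatial attention pattern it describes: a squeeze-and-excitation-style channel gate followed by a 2-D spatial mask. All weights here are random placeholders standing in for learned parameters; the function names and the `reduction` ratio are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def channel_attention(feat, reduction=4):
    """Channel attention sketch: global-average-pool each channel, pass the
    descriptor through a small bottleneck MLP, and rescale the channels.
    Weights are random placeholders; a trained model would learn them."""
    c, h, w = feat.shape
    squeezed = feat.mean(axis=(1, 2))                  # (c,) channel descriptor
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c, c // reduction)) * 0.1
    w2 = rng.standard_normal((c // reduction, c)) * 0.1
    hidden = np.maximum(squeezed @ w1, 0)              # ReLU bottleneck
    scale = 1.0 / (1.0 + np.exp(-(hidden @ w2)))       # sigmoid gate in (0, 1)
    return feat * scale[:, None, None]                 # emphasize channels

def spatial_attention(feat):
    """Spatial attention sketch: build a 2-D mask from the channel-wise
    average and max maps, then rescale every spatial location."""
    avg_map = feat.mean(axis=0)                        # (h, w)
    max_map = feat.max(axis=0)                         # (h, w)
    mask = 1.0 / (1.0 + np.exp(-(avg_map + max_map)))  # sigmoid mask
    return feat * mask[None, :, :]

# Dual attention applied to a dummy backbone feature map (channels, h, w).
feat = np.random.default_rng(1).standard_normal((8, 16, 16))
out = spatial_attention(channel_attention(feat))
print(out.shape)  # (8, 16, 16): attention reweights values, not shape
```

In practice both stages would be trained end-to-end on top of a backbone (the paper reports further compressing the network with a meta-heuristic parameter search, which this sketch does not attempt).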

Updated: 2024-08-26