EventDrop: data augmentation for event-based learning
arXiv - CS - Machine Learning. Pub Date: 2021-06-07. DOI: arXiv:2106.05836
Fuqiang Gu, Weicong Sng, Xuke Hu, Fangwen Yu

The advantages of event-sensing over conventional sensors (e.g., higher dynamic range, lower time latency, and lower power consumption) have spurred research into machine learning for event data. Unsurprisingly, deep learning has emerged as a competitive methodology for learning with event sensors; in typical setups, discrete and asynchronous events are first converted into frame-like tensors on which standard deep networks can be applied. However, over-fitting remains a challenge, particularly since event datasets remain small relative to conventional datasets (e.g., ImageNet). In this paper, we introduce EventDrop, a new method for augmenting asynchronous event data to improve the generalization of deep models. By dropping events selected with various strategies, we are able to increase the diversity of training data (e.g., to simulate various levels of occlusion). From a practical perspective, EventDrop is simple to implement and computationally low-cost. Experiments on two event datasets (N-Caltech101 and N-Cars) demonstrate that EventDrop can significantly improve the generalization performance across a variety of deep networks.
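The core idea of dropping events under different selection strategies can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes events are rows of an N×4 array `(x, y, t, p)`, and the three strategies shown (random drop, drop-by-time, drop-by-area) as well as all parameter names (`ratio`, `window`, `area`, `sensor_size`) are illustrative choices.

```python
import numpy as np

def random_drop(events, ratio=0.3, rng=None):
    """Drop a random fraction (`ratio`) of events, uniformly over the stream."""
    rng = rng if rng is not None else np.random.default_rng()
    keep = rng.random(len(events)) >= ratio
    return events[keep]

def drop_by_time(events, window=0.1, rng=None):
    """Drop all events inside a randomly placed time window.

    `window` is the window length as a fraction of the total duration;
    this simulates a temporary sensor dropout.
    """
    rng = rng if rng is not None else np.random.default_rng()
    t = events[:, 2]
    t0, t1 = t.min(), t.max()
    start = t0 + rng.random() * (t1 - t0) * (1.0 - window)
    end = start + window * (t1 - t0)
    keep = (t < start) | (t > end)
    return events[keep]

def drop_by_area(events, area=0.2, sensor_size=(240, 180), rng=None):
    """Drop all events inside a random spatial box (simulating occlusion).

    The box side lengths are `area` times the sensor width and height.
    """
    rng = rng if rng is not None else np.random.default_rng()
    W, H = sensor_size
    bw, bh = int(W * area), int(H * area)
    x0 = int(rng.integers(0, W - bw))
    y0 = int(rng.integers(0, H - bh))
    x, y = events[:, 0], events[:, 1]
    inside = (x >= x0) & (x < x0 + bw) & (y >= y0) & (y < y0 + bh)
    return events[~inside]
```

Each function returns a filtered copy of the event array, so an augmented training sample is produced per call and can then be converted into a frame-like tensor as in a typical pipeline; the low cost comes from the fact that each strategy is a single vectorized masking pass over the events.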

Updated: 2021-06-11