Abstract Neural Networks
arXiv - CS - Programming Languages. Pub Date: 2020-09-11, DOI: arxiv-2009.05660
Matthew Sotoudeh and Aditya V. Thakur

Deep Neural Networks (DNNs) are rapidly being applied to safety-critical domains such as drone and airplane control, motivating techniques for verifying the safety of their behavior. Unfortunately, DNN verification is NP-hard, with current algorithms slowing exponentially with the number of nodes in the DNN. This paper introduces the notion of Abstract Neural Networks (ANNs), which can be used to soundly overapproximate DNNs while using fewer nodes. An ANN is like a DNN except weight matrices are replaced by values in a given abstract domain. We present a framework parameterized by the abstract domain and activation functions used in the DNN that can be used to construct a corresponding ANN. We present necessary and sufficient conditions on the DNN activation functions for the constructed ANN to soundly over-approximate the given DNN. Prior work on DNN abstraction was restricted to the interval domain and ReLU activation function. Our framework can be instantiated with other abstract domains such as octagons and polyhedra, as well as other activation functions such as Leaky ReLU, Sigmoid, and Hyperbolic Tangent.
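To make the idea of over-approximation with fewer nodes concrete, the following is a minimal sketch (not the paper's actual construction) of the interval-domain special case: two ReLU nodes of a layer are merged into one abstract node whose incoming weights are intervals covering the merged rows. All helper names here are illustrative.

```python
# Sketch: interval-weight over-approximation of a merged ReLU layer.
# The abstract node's output interval must contain the concrete sum
# of the merged nodes' outputs, for any fixed input x.

def relu(x):
    return max(0.0, x)

def interval_dot(w_intervals, x):
    """Dot product where each weight is an interval [lo, hi].
    Inputs may be negative, so choose endpoints per the sign of x_i."""
    lo = hi = 0.0
    for (wl, wh), xi in zip(w_intervals, x):
        if xi >= 0:
            lo += wl * xi
            hi += wh * xi
        else:
            lo += wh * xi
            hi += wl * xi
    return lo, hi

# Concrete layer: 2 inputs -> 2 ReLU nodes, whose outputs are summed.
W = [[1.0, -2.0],   # weights into node 0
     [3.0,  0.5]]   # weights into node 1

def concrete(x):
    return sum(relu(sum(w * xi for w, xi in zip(row, x))) for row in W)

# Abstract layer: both nodes merged into one abstract node whose
# weights are intervals covering the merged rows, scaled by group size.
merged = [(min(col), max(col)) for col in zip(*W)]  # [(1.0, 3.0), (-2.0, 0.5)]
group_size = len(W)

def abstract(x):
    lo, hi = interval_dot(merged, x)
    # Each row's pre-activation lies in [lo, hi]; ReLU is monotone,
    # so applying it to both endpoints and scaling by the group size
    # soundly bounds the sum of the merged nodes' outputs.
    return group_size * relu(lo), group_size * relu(hi)

x = [1.0, 2.0]
c = concrete(x)
lo, hi = abstract(x)
assert lo <= c <= hi  # the ANN interval contains the DNN's output
```

The soundness here relies on ReLU being monotone, which is exactly the kind of activation-function condition the paper characterizes; the framework generalizes this beyond intervals to domains such as octagons and polyhedra.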

Updated: 2020-09-15