A Theory on AI Uncertainty Based on Rademacher Complexity and Shannon Entropy
arXiv - CS - Artificial Intelligence · Pub Date: 2020-11-19 · DOI: arXiv-2011.11484
Mingyong Zhou

In this paper, we present a theoretical discussion of uncertainty in AI deep learning neural networks based on the classical Rademacher complexity and Shannon entropy. First, it is shown that the classical Rademacher complexity and Shannon entropy are closely related quantities by their definitions. Secondly, based on Shannon's mathematical theory of communication [3], we derive a criterion that ensures AI correctness and accuracy in classification problems. Last but not least, based on Peter Bartlett's work, we show both a relaxed condition and a stricter condition that guarantee correctness and accuracy in AI classification. By formulating the criterion in terms of Shannon entropy, it becomes easier to explore analogous criteria in terms of other complexity measures, such as Vapnik-Chervonenkis dimension and Gaussian complexity, by taking advantage of the relationships established in other references. A close-to-0.5 criterion on Shannon entropy is derived in this paper for the theoretical investigation of AI accuracy and correctness in classification problems.
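As a rough illustration of the two quantities the abstract relates (this is not the paper's own derivation), the following Python sketch computes the binary Shannon entropy and a Monte-Carlo estimate of the empirical Rademacher complexity of a small finite hypothesis set; the hypothesis predictions and all names here are hypothetical examples.

```python
import math
import random

def shannon_entropy(p):
    """Binary Shannon entropy H(p) in bits; maximal (H = 1) at p = 0.5,
    the point of greatest uncertainty."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def empirical_rademacher(predictions, n_trials=2000, seed=0):
    """Monte-Carlo estimate of the empirical Rademacher complexity of a
    finite set of +/-1 prediction vectors on a fixed sample:
        R_hat = E_sigma [ sup_h (1/n) * sum_i sigma_i * h(x_i) ],
    where the sigma_i are i.i.d. uniform +/-1 (Rademacher) variables."""
    rng = random.Random(seed)
    n = len(predictions[0])
    total = 0.0
    for _ in range(n_trials):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        total += max(sum(s * h_i for s, h_i in zip(sigma, h)) / n
                     for h in predictions)
    return total / n_trials

# Two toy hypotheses evaluated on a sample of size 8 (hypothetical data).
preds = [[1, 1, -1, 1, -1, -1, 1, -1],
         [-1, 1, 1, -1, 1, -1, -1, 1]]
print(shannon_entropy(0.5))          # entropy is maximal at p = 0.5
print(empirical_rademacher(preds))   # small for a small hypothesis set
```

Both quantities grow with the "richness" of the uncertainty involved, which is the qualitative connection the abstract draws; the exact sense in which the paper's 0.5 criterion applies is developed in the full text.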

Updated: 2020-11-25