The P–T Probability Framework for Semantic Communication, Falsification, Confirmation, and Bayesian Reasoning
Philosophies, Pub Date: 2020-10-02, DOI: 10.3390/philosophies5040025
Chenguang Lu

Many researchers want to unify probability and logic by reasonably defining logical probability or probabilistic logic. This paper tries to unify statistics and logic so that we can use statistical probability and logical probability at the same time. For this purpose, it proposes the P–T probability framework, which combines Shannon's statistical probability framework for communication, Kolmogorov's probability axioms for logical probability, and Zadeh's membership functions used as truth functions. The two kinds of probabilities are connected by an extended Bayes' theorem, with which we can convert between a likelihood function and a truth function. Hence, we can train truth functions (in logic) with sampling distributions (in statistics). The framework was developed during the author's long-term studies on semantic information, statistical learning, and color vision. The paper first presents the P–T probability framework and explains the different probabilities in it through applications to semantic information theory. The framework and the semantic information methods are then applied to statistical learning, statistical mechanics, hypothesis evaluation (including falsification), confirmation, and Bayesian reasoning. These theoretical applications illustrate the reasonableness and practicality of the framework, which is also helpful for interpretable AI; interpreting neural networks with it requires further study.
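The conversion the abstract describes can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes an extended Bayes' theorem of the form P(x|θ) = P(x)·T(θ|x) / Σ_x P(x)·T(θ|x), where P(x) is a statistical prior and T(θ|x) is a Zadeh-style membership function used as a truth function. The prior, the Gaussian-shaped membership curve, and its parameters are all hypothetical choices for the sketch.

```python
import numpy as np

# Hypothetical discretized instance space and uniform source distribution P(x).
x = np.linspace(0.0, 10.0, 101)
prior = np.ones_like(x) / len(x)

def truth(x, center=6.0, width=1.5):
    """An illustrative membership function used as a truth function T(theta|x).

    Unlike a probability distribution, it need not sum to 1; its peak value is 1
    where the statement "x is about `center`" is fully true.
    """
    return np.exp(-((x - center) ** 2) / (2.0 * width ** 2))

# Extended Bayes' theorem (as read from the abstract): combine the prior with
# the truth function and normalize to obtain a semantic likelihood P(x|theta).
t = truth(x)
likelihood = prior * t / np.sum(prior * t)

print(round(float(likelihood.sum()), 6))  # normalized: a proper distribution
```

Going the other direction (recovering a truth function from a likelihood and a prior) would, under the same reading, divide the likelihood by the prior and rescale so the maximum is 1, which is how a sample distribution could "train" a truth function.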

Updated: 2020-10-02