Entropy and Monotonicity in Artificial Intelligence
International Journal of Approximate Reasoning (IF 3.9), Pub Date: 2020-09-01, DOI: 10.1016/j.ijar.2020.04.008
Bernadette Bouchon-Meunier, Christophe Marsala

Abstract Entropies and measures of information are extensively used in many domains and applications of Artificial Intelligence. Beyond the original quantities from information theory and probability theory, numerous extensions have been introduced to take into account fuzzy sets, intuitionistic fuzzy sets, and other representation models of uncertainty and imprecision. In this paper, we study the property of monotonicity that such measures share with regard to a refinement of information, showing that the main differences between these quantities come from the diversity of the orders defining such a refinement. Our aim is to clarify the concept of refinement of information and the underlying monotonicity, and to illustrate this paradigm through the use of such measures in Artificial Intelligence.
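The kind of monotonicity with respect to a refinement of information discussed in the abstract can be illustrated in the classical probabilistic case: refining a partition of the event space never decreases its Shannon entropy. The sketch below is not taken from the paper; it is a minimal numerical illustration of this property, and the function names and example distributions are ours.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A coarse partition of the event space into two equiprobable blocks ...
coarse = [0.5, 0.5]
# ... and a refinement obtained by splitting each block into two parts.
fine = [0.3, 0.2, 0.25, 0.25]

h_coarse = shannon_entropy(coarse)
h_fine = shannon_entropy(fine)

print(f"H(coarse) = {h_coarse:.3f} bits")  # 1.000
print(f"H(fine)   = {h_fine:.3f} bits")    # about 1.985

# Splitting blocks can only add uncertainty, so entropy is monotone
# non-decreasing under refinement of the partition.
assert h_fine >= h_coarse
```

For other measures considered in the paper (for fuzzy sets, intuitionistic fuzzy sets, and related models), the analogous property holds with respect to different orders defining what counts as a refinement, which is precisely the diversity the abstract refers to.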

Updated: 2020-09-01