Game theory interpretation of digital soil mapping convolutional neural networks
SOIL (IF 5.8), Pub Date: 2020-08-18, DOI: 10.5194/soil-6-389-2020
José Padarian, Alex B. McBratney, Budiman Minasny

The use of complex models such as deep neural networks has yielded large improvements in predictive tasks in many fields, including digital soil mapping. One of the concerns about using these models is that they are perceived as black boxes with low interpretability. In this paper we introduce the use of game theory, specifically Shapley additive explanations (SHAP) values, to interpret a digital soil mapping model. SHAP values represent the contribution of a covariate to the final model predictions. We applied this method to a multi-task convolutional neural network trained to predict soil organic carbon (SOC) in Chile. The results show the contribution of each covariate to the model predictions in three different contexts: (a) at a local level, showing the contribution of the covariates to a single prediction; (b) at a global level, summarising the covariate contributions across all predictions; and (c) in a spatial context, mapping the covariate contributions across the study area. The last of these constitutes a novel application of SHAP values and the first detailed analysis of a model in a spatial context. The analysis of the SOC model for Chile corroborated that the model captures sensible relationships between SOC and rainfall, temperature, elevation, slope, and topographic wetness index. The results agree with commonly reported relationships and highlight environmental thresholds that coincide with significant areas within the study region. This contribution addresses the limitations of current model interpretation in digital soil mapping, especially in a spatial context. We believe that SHAP values are a valuable tool that should be included within the DSM (digital soil mapping) framework, since they address the important concerns regarding the interpretability of more complex models. Model interpretation is a crucial step that could generate new knowledge and improve our understanding of soils.
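For context, SHAP rests on the Shapley value from cooperative game theory: each individual prediction f(x) is decomposed additively into a baseline plus one contribution per covariate. In the standard formulation (a general statement of SHAP, not an equation reproduced from the paper):

f(x) = \phi_0 + \sum_{i=1}^{M} \phi_i, \qquad
\phi_i = \sum_{S \subseteq F \setminus \{i\}} \frac{|S|!\,(|F| - |S| - 1)!}{|F|!} \left[ f_{S \cup \{i\}}(x) - f_S(x) \right]

where \phi_0 is the expected model output over a background sample, F is the set of covariates, and f_S denotes the model evaluated on the covariate subset S.

To make the mechanics concrete, below is a minimal Python sketch of obtaining SHAP values for a convolutional model with the open-source shap package's GradientExplainer. The network, the 5-covariate input stack, and the window size are illustrative assumptions for this sketch, not the authors' architecture or data.

import numpy as np
import shap
from tensorflow import keras

# Hypothetical covariate stack: 200 locations, a 29x29 pixel window around
# each, and 5 covariates (e.g. rainfall, temperature, elevation, slope, TWI).
rng = np.random.default_rng(0)
X = rng.random((200, 29, 29, 5), dtype=np.float32)
y = rng.random(200, dtype=np.float32)  # illustrative SOC targets

# Stand-in CNN; the paper's multi-task model (SOC at several depths) differs.
model = keras.Sequential([
    keras.Input(shape=(29, 29, 5)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.Flatten(),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, verbose=0)

# GradientExplainer computes approximate SHAP values for differentiable
# models; the background sample defines the baseline expectation phi_0.
background = X[rng.choice(len(X), size=50, replace=False)]
explainer = shap.GradientExplainer(model, background)
sv = explainer.shap_values(X[:10])
sv = np.asarray(sv[0] if isinstance(sv, list) else sv).squeeze()

# Summing over the spatial window leaves one contribution per covariate per
# location (the local view); averaging absolute contributions over many
# locations gives the global view; joining them to coordinates gives maps.
per_covariate = sv.sum(axis=(1, 2))
print(per_covariate.shape)  # (10, 5): 10 locations x 5 covariates

Read this way, the three contexts described in the abstract correspond to three aggregations of the same array: a single row (local), a column-wise summary (global), and the rows mapped back to their coordinates (spatial).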
