A machine learning approach for predicting computational intensity and domain decomposition in parallel geoprocessing
International Journal of Geographical Information Science (IF 4.3) Pub Date: 2020-02-20, DOI: 10.1080/13658816.2020.1730850
Peng Yue, Fan Gao, Boyi Shangguan, Zheren Yan

ABSTRACT High-performance computing is required for fast geoprocessing of geospatial big data. Representing computational intensity (CIT) over spatial domains and decomposing those domains for parallelism are prominent strategies in the design of parallel geoprocessing applications. Traditional domain decomposition offers limited means of evaluating computational intensity, which often results in load imbalance and poor parallel performance. From the data science perspective, machine learning from Artificial Intelligence (AI) shows promise for better CIT evaluation. This paper proposes a machine learning approach that predicts computational intensity and then performs an optimized domain decomposition, dividing the spatial domain into balanced subdivisions based on the predicted CIT to achieve better parallel performance. The approach provides a reference framework for how various machine learning methods, including feature selection and model training, can be used to predict computational intensity and optimize parallel geoprocessing across different cases. Comparative experiments between the approach and traditional methods were performed on two cases: DEM generation from point clouds and spatial intersection of vector data. The results not only demonstrate the advantage of the approach but also provide hints on how traditional GIS computation can be improved by machine learning.
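To make the pipeline described in the abstract concrete, the following Python sketch is a minimal, hypothetical rendering, not the authors' implementation: a regression model (here a random forest, one plausible choice among many) is trained to map per-cell spatial features to observed compute times, and its CIT predictions drive a greedy longest-processing-time assignment of cells to workers so that predicted load is balanced. The feature meanings, synthetic data, and function names are assumptions for illustration.

```python
# Hypothetical sketch of ML-based CIT prediction + balanced domain decomposition.
# Features and the greedy merge are illustrative assumptions, not the paper's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def train_cit_model(features, measured_runtimes):
    """Fit a regressor mapping per-cell spatial features to observed compute time."""
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(features, measured_runtimes)
    return model

def decompose(cell_features, model, n_workers):
    """Greedily assign grid cells to workers so predicted CIT is balanced."""
    predicted = model.predict(cell_features)   # CIT estimate per cell
    order = np.argsort(predicted)[::-1]        # largest predicted cost first
    loads = np.zeros(n_workers)
    assignment = np.empty(len(predicted), dtype=int)
    for cell in order:                         # longest-processing-time heuristic
        worker = int(np.argmin(loads))
        loads[worker] += predicted[cell]
        assignment[cell] = worker
    return assignment, loads

# Example with synthetic data: 200 cells, 3 features each (e.g. point count,
# cell area, boundary complexity), distributed over 8 workers.
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.05, 200)  # synthetic runtimes
model = train_cit_model(X, y)
assignment, loads = decompose(X, model, n_workers=8)
print("per-worker predicted load:", np.round(loads, 2))
```

Under this setup, decomposition quality can be checked by comparing the spread of per-worker predicted loads; a naive equal-area split would instead assign equal numbers of cells regardless of their cost, which is the load-imbalance problem the abstract attributes to traditional domain decomposition.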

Updated: 2020-02-20