Open Access. Published by De Gruyter, November 6, 2020. Licensed under the Creative Commons Attribution (CC BY) 4.0 license.

Slope stability evaluation using backpropagation neural networks and multivariate adaptive regression splines

Zhihao Liao and Zhiwei Liao
From the journal Open Geosciences

Abstract

Slope stability assessment is a critical concern in construction projects. This study explores the use of multivariate adaptive regression splines (MARS) to capture the intrinsic nonlinear and multidimensional relationships among the parameters associated with the evaluation of slope stability. A comparative study of machine learning solutions for slope stability assessment based on the backpropagation neural network (BPNN) and MARS was conducted. One data set of actual slope collapse events was used for model development and to compare the performance of BPNN and MARS. The results suggest that both the BPNN and MARS models can capture the relationship between the safety factor and the slope parameters, and that the MARS model has the additional advantages of computational efficiency and easy interpretation.

1 Introduction

Landslides are a common type of geological disaster and often impose heavy social and economic losses [1,2,3]. These hazards are caused by various uncertainties, including external factors (e.g., rainfall, earthquakes, excavation) and the hydrogeological conditions of the slope [4,5,6]. Analyzing slope stability and preventing or mitigating the resulting damage have therefore become urgent issues for engineers and researchers. By convention, the safety factor is considered an important index for slope stability analysis and for the choice of treatment designs. The limit equilibrium method (LEM) and numerical calculation methods based on the theory of elasticity and plasticity are widely used to calculate the safety factor [7,8,9]. However, the accuracy of the LEM is limited by its assumptions about the slip surface and interslice forces. In addition, numerical methods based on the theory of elasticity and plasticity require a well-fitted constitutive model that accurately reflects the mechanical behavior of the soil and rock in the slope, which is a difficult task for engineers and researchers [10,11,12,13]. Moreover, the interactions among the factors that affect slope stability are complex and multifactorial, which makes it difficult to evaluate the real stability of a slope from the safety factor alone.

In recent years, owing to their intrinsic capability of mining valuable information hidden in records of real slope cases, machine learning algorithms, such as artificial neural network (ANN)-based [14,15,16,17,18,19], discriminant analysis-based [20,21,22], and decision tree-based approaches [23,24,25], have gradually been applied to the evaluation of slope stability. These methods evaluate slope stability from parameters tied to the potential failure mechanism, such as geotechnical properties, slope geometry, and water conditions, and their application can yield reasonable results for the evaluation of slope stability. Furthermore, even a small increase in prediction accuracy can have a significant influence on the evaluation of slope stability [26]. Therefore, exploring high-performance models that predict slope stability accurately is necessary.

This study explores the use of multivariate adaptive regression splines (MARS) [27] to capture the intrinsic nonlinear and multidimensional relationship associated with the evaluation of slope stability. A comparative study of machine learning solutions for slope stability assessment that relied on backpropagation neural network (BPNN) and MARS was conducted. One data set with actual slope collapse events was utilized for model development and to compare the performance of BPNN and MARS.

2 BPNN

ANN is a computational model based on the structure and functions of biological neural networks that can be used to model complex relationships between inputs and outputs or to act as a pattern classifier [28]. The ANN structure consists of one or more layers of interconnected neurons or nodes, and each link between neurons has an associated weight. The network can learn target values (desired outputs) from a set of selected input data under supervised or unsupervised (self-adjusting) learning algorithms [29]. The learning process is achieved by adjusting the weights in the network until a particular input leads to a specific target output. The predictive power of the trained neural network can then be tested by comparing the predicted values with the targets.

Various algorithms for ANN have been proposed in previous literature, among which the BPNN model is the most widely used [30,31]. The network can represent highly nonlinear input–output relationships and is straightforward to apply. BPNNs generally consist of one input layer, one or more hidden layers, and one output layer. The layers are fully connected, and each interconnection is assigned an associated weight (Figure 1). BPNN training consists of two phases of computation: a forward pass and a backward pass. The steps in the BPNN algorithm are as follows: first, the network weights are initialized, and the input data are propagated forward from the input to the output layer, producing the actual output. Second, the output is compared with the target values, and an error is determined. Finally, the errors are propagated backward from the output layer to the previous layers, and the connection weights are updated to minimize the errors. A minimal sketch of these two phases is given below.
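The following MATLAB sketch illustrates the forward pass, error computation, and backward weight update for a small one-hidden-layer network trained by plain gradient descent. It is an illustration only, with made-up data and a sigmoid hidden layer; it is not the toolbox implementation used later in this study.

% Minimal forward/backward pass sketch (illustrative data and settings)
rng(1);
X = rand(6, 20);                       % 20 samples, 6 slope parameters (dummy data)
T = rand(1, 20);                       % target safety factors (dummy data)

nH = 6;                                % hidden neurons
W1 = 0.1*randn(nH, 6);  b1 = zeros(nH, 1);   % initialize weights and biases
W2 = 0.1*randn(1, nH);  b2 = 0;
lr = 0.05;                             % learning rate

for epoch = 1:500
    % forward pass: input layer -> sigmoid hidden layer -> linear output layer
    H = 1 ./ (1 + exp(-(W1*X + b1)));
    Y = W2*H + b2;

    % error between actual output and target
    E = Y - T;

    % backward pass: propagate the error and update the weights
    dW2 = E*H'/size(X, 2);        db2 = mean(E, 2);
    dH  = (W2'*E) .* H .* (1 - H);     % error propagated through the sigmoid
    dW1 = dH*X'/size(X, 2);       db1 = mean(dH, 2);

    W2 = W2 - lr*dW2;  b2 = b2 - lr*db2;
    W1 = W1 - lr*dW1;  b1 = b1 - lr*db1;
end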

Figure 1 BPNN architecture (adapted from ref. [33]).

The number of input neurons equals the number of selected factors, whereas the number of hidden neurons is typically determined through a trial-and-error process. Normally, the smallest number of hidden neurons that yields satisfactory results (judged by the network performance in terms of the coefficient of determination R2 of the testing data set) is selected. In the present study, a MATLAB-based BPNN with the Levenberg–Marquardt (LM) algorithm [32] was adopted for neural network modeling.
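A hedged sketch of this trial-and-error selection with the MATLAB Neural Network Toolbox is shown below. The functions feedforwardnet and train and the 'trainlm' training function are standard toolbox calls; the candidate sizes, placeholder data, and variable names are illustrative assumptions, not values from the paper.

% Trial-and-error selection of the number of hidden neurons (sketch)
Xtrain = rand(6, 122);  Ttrain = rand(1, 122);   % placeholder training data
Xtest  = rand(6, 31);   Ttest  = rand(1, 31);    % placeholder testing data

candidates = 2:10;                     % candidate hidden-layer sizes (assumed range)
bestR2 = -inf;  bestNet = [];

for nH = candidates
    net = feedforwardnet(nH, 'trainlm');   % one hidden layer, LM training
    net.divideFcn = 'dividetrain';         % use all supplied samples for training
    net.trainParam.showWindow = false;     % suppress the training GUI
    net = train(net, Xtrain, Ttrain);

    Ytest = net(Xtest);                    % predictions on the testing set
    R2 = 1 - sum((Ttest - Ytest).^2)/sum((Ttest - mean(Ttest)).^2);

    if R2 > bestR2                         % keep the size with the best test R2
        bestR2 = R2;  bestNet = net;
    end
end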

3 MARS

MARS is a form of regression analysis introduced in ref. [27]. The algorithm generates flexible regression models in which the nonlinear relationship between a group of input variables and their dependent outputs is approximated. MARS is a nonparametric statistical method based on a divide-and-conquer strategy, in which the training data are partitioned into separate piecewise linear segments (splines) of differing gradients. No restrictive assumption is made regarding the functional relationship between the dependent and independent variables. Instead, the MARS model constructs this relation from a set of coefficients and basis functions (BFs) that are entirely driven by the data [34].

The MARS approximation function f̂(X), which is a linear combination of BFs and their interactions, is expressed as follows:

(1) \hat{f}(X) = a_0 + \sum_{m=1}^{M} a_m B_m(X),

where f̂(X) is the MARS predictor, X = (x1, x2, ..., xp) is the vector of input variables, a0 is a constant coefficient, am is the coefficient of the mth term, obtained by the least-squares method, and Bm(X) is the mth BF. A BF can be a single truncated linear (spline) function or an interaction term produced by multiplying an existing BF by a truncated linear function of a new/different variable. Thus, Bm(X) can be a single spline BF (SBF) or the product of two or more SBFs, which allows MARS to approximate highly nonlinear problems.
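As a concrete illustration of equation (1), the short MATLAB sketch below evaluates a MARS-style predictor built from hinge (truncated linear) functions and one interaction term; the knots and coefficients are made up for illustration and are not the fitted values reported later in this study.

% Equation (1) as a weighted sum of hinge basis functions (illustrative values)
hinge = @(x, t) max(0, x - t);             % truncated linear spline

X  = [20.5, 25.0, 30.0, 40.0, 60.0, 0.3];  % one slope: [gamma, c, alpha, beta, H, Ru]
a0 = 1.2;                                  % intercept a_0 (assumed)

B1 = hinge(50, X(5));                      % max(0, 50 - H), single-variable BF
B2 = hinge(X(4), 45);                      % max(0, beta - 45), single-variable BF
B3 = B1 * hinge(0.25, X(6));               % product of two hinges (interaction BF)

fhat = a0 + 0.02*B1 - 0.60*B2 + 0.15*B3;   % f^(X) = a0 + sum_m a_m * B_m(X)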

The MARS model is constructed in two phases of computation: a forward phase and a backward phase. In the forward phase, candidate knots are placed at random positions within the range of each predictor to define pairs of basis functions. At each step, the knot and its corresponding pair of basis functions that yield the maximum reduction in the sum-of-squares residual error are fitted and added to the model. New BFs are added until the maximum complexity or a threshold value is reached. This forward selection of BFs results in an extremely complex and over-fitted model.

MARS modeling is a data-driven process. To construct the model in equation (1), the forward phase is performed on the training data, starting initially with only the intercept a0. At each subsequent step, the basis pair that produces the maximum reduction in the training error is added. Given a current model with M BFs, the next model is obtained by adding two BFs as follows:

(2) \hat{f}(X) = a_0 + \sum_{m=1}^{M} a_m B_m(X) + a_{M+1} B_l(X) \max(0, x_j - t) + a_{M+2} B_l(X) \max(0, t - x_j).
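The MATLAB sketch below illustrates one step of this forward search for a single predictor with the intercept as the parent term: candidate knots are scanned, the intercept plus the hinge pair is fitted by least squares at each knot, and the knot giving the largest reduction in the residual sum of squares is kept. The data, knot grid, and restriction to one predictor are assumptions for illustration; the full algorithm also searches over parent BFs and all predictors.

% One forward-phase step for a single predictor (illustrative sketch)
rng(2);
x = sort(rand(60, 1))*50;                          % one predictor, e.g. slope angle (dummy)
y = 1.5 - 0.02*max(0, x - 30) + 0.05*randn(60, 1); % dummy response

sse0 = sum((y - mean(y)).^2);                      % error of the intercept-only model
cand = linspace(min(x), max(x), 27);
cand = cand(2:end-1);                              % interior candidate knot positions

bestSSE = inf;  bestKnot = NaN;
for t = cand
    A    = [ones(size(x)), max(0, x - t), max(0, t - x)];  % intercept + hinge pair
    coef = A \ y;                                  % least-squares fit of the coefficients
    sse  = sum((y - A*coef).^2);
    if sse < bestSSE
        bestSSE = sse;  bestKnot = t;              % keep the knot with the largest reduction
    end
end
reduction = sse0 - bestSSE;                        % reduction achieved by the selected pair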

Although the resulting function may fit the training data well, it can have poor predictive capacity on new data. To enhance the predictive capacity of the MARS model, the backward phase considers the residual error together with the model complexity and deletes the redundant BFs that contribute the least. Model subsets are compared using the computationally inexpensive method of generalized cross-validation (GCV). The GCV is the mean-squared residual error divided by a penalty that depends on the model complexity. For a training data set with N points, the GCV is calculated as follows:

(3) \mathrm{GCV} = \dfrac{\frac{1}{N}\sum_{i=1}^{N}\left[y_i - \hat{f}(X_i)\right]^2}{\left[1 - \dfrac{M + d\,(M-1)/2}{N}\right]^2},

where M is the number of BFs, yi is the true value at Xi, f̂(Xi) is the predicted value at Xi, and d is a penalty for each BF included in the developed submodel. The GCV therefore penalizes not only the number of BFs but also the number of knots.
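Equation (3) translates directly into a one-line MATLAB function, sketched below; the penalty d = 3 in the example call is a commonly used default and an assumption here, not a value taken from the paper.

% GCV of equation (3): y and yhat are the N observed and predicted values,
% M is the number of basis functions, d the penalty per basis function
gcv = @(y, yhat, M, d) mean((y - yhat).^2) / (1 - (M + d*(M - 1)/2)/numel(y))^2;

% example call with dummy numbers
y    = [1.1; 0.9; 1.4; 1.2];
yhat = [1.0; 1.0; 1.3; 1.2];
GCV  = gcv(y, yhat, 2, 3);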

4 History data of slopes

To explore the application of MARS to the evaluation of slope stability and to compare the results with BPNN, one historical slope data set is employed in this study. The data set contains 153 historical cases collected from previous research [35,36,37,38,39]. All 153 data samples used for the analyses are listed in the Appendix. Following previous studies, six influencing factors are used to characterize each earth slope: unit weight (γ), cohesion (c), internal friction angle (α), slope angle (β), slope height (H), and pore pressure ratio (Ru).

Table 1 gives the statistical description of the data set used in this study. To better recognize the relationships between the safety factor and the other parameters, scatterplots of all influencing factors against the safety factor are shown in Figure 2; no obvious direct relationships between the individual influencing factors and the safety factor are apparent.

Table 1

Detailed statistical description of the data set

Factors                      Max      Min     Average   Std      Median
Bulk weight (γ, kN/m³)       31.30    12.00   21.76     4.13     20.96
Cohesion (c, kPa)            300.00   0.00    34.12     45.82    19.96
Friction angle (α, °)        45.00    0.00    28.73     10.58    30.24
Slope angle (β, °)           59.00    16.00   36.10     10.22    35.00
Slope height (H, m)          511.00   3.60    104.19    132.68   50.00
Pore pressure ratio (Ru)     45.00    0.00    0.48      3.45     0.25
Figure 2 Scatterplots of all influencing factors with safety factors.

5 Experimental results

In this study, 80% of the data is used for training and the remaining 20% for testing the machine learning models. To reduce subjectivity in the selection of training and testing data, the cases in each group are selected randomly.

In the case of BPNN, the sigmoid function is used as the activation function, and the number of neurons in the hidden layer is selected by trial and error. In this study, six hidden neurons gave excellent performance. A total of 122 cases are employed to construct the machine learning model. The network takes the six slope parameters as inputs and produces one output, the safety factor.
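A hedged sketch of this setup is given below: a random 80/20 split of the 153 cases followed by a 6–6–1 network with a sigmoid ('logsig') hidden layer trained with the LM algorithm. The matrix name data and the placeholder values are assumptions; in practice the 153 x 7 matrix [γ c α β H Ru FS] from the Appendix would be used.

% Random 80/20 split and the 6-6-1 BPNN described above (sketch)
data = [rand(153, 6), 0.5 + rand(153, 1)];   % placeholder for the Appendix data set
n    = size(data, 1);
idx  = randperm(n);
nTr  = round(0.8*n);                         % about 122 training cases
tr   = idx(1:nTr);   te = idx(nTr+1:end);

Xtr = data(tr, 1:6)';   Ttr = data(tr, 7)';  % toolbox expects variables in rows
Xte = data(te, 1:6)';   Tte = data(te, 7)';

net = feedforwardnet(6, 'trainlm');          % six hidden neurons, LM training
net.layers{1}.transferFcn = 'logsig';        % sigmoid activation in the hidden layer
net.divideFcn = 'dividetrain';               % train on all supplied samples
net.trainParam.showWindow = false;
net = train(net, Xtr, Ttr);

R2 = 1 - sum((Tte - net(Xte)).^2)/sum((Tte - mean(Tte)).^2);   % test R2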

To construct the MARS model, the data are partitioned in the same way as for the BPNN model. The number of basis functions, as well as the parameters associated with each one (product degree and knot locations), is determined automatically from the data. The MARS model for predicting the safety factor, with second-order interactions, adopted 25 linear spline BFs. An open-source MARS implementation from ref. [40] was adapted to perform the analyses presented in this study.
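A minimal sketch of this step with the ARESLab toolbox [40] is given below. The function names aresparams, aresbuild, and arespredict follow the ARESLab documentation, but the exact parameter list and defaults are assumptions and should be checked against the toolbox version actually used; settings such as the maximum interaction order are configured through the training parameters.

% MARS model building with ARESLab (sketch; continues from the split above)
params = aresparams();                 % default training parameters (interaction order,
                                       % maximum number of BFs, etc. are adjusted here)
model  = aresbuild(Xtr', Ttr', params);% ARESLab expects observations in rows
Yhat   = arespredict(model, Xte');     % predicted safety factors for the test cases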

5.1 Prediction of the safety factor

Figure 3 compares the predicted safety factors and the target values for the slopes based on the BPNN and MARS models. Both models can capture the relationship between the safety factor and the slope parameters. On this data set, the coefficient of determination R2 is 0.8931 for the BPNN model and 0.8629 for the MARS model. However, the training time of the BPNN model is 95.61 s, whereas that of the MARS model is 8.63 s. The BPNN model therefore requires more time to train and to obtain a predicted safety factor, although its precision is slightly higher than that of the MARS model.

Figure 3 Comparison of predicted safety factor and target value in slopes.

5.2 Parameter relative importance

After the MARS model is built, all the involved BFs are grouped based on the analysis of variance (ANOVA) decomposition [27]. This procedure is used to assess the contributions of the six influencing factors of slope stability. Table 2 presents the ANOVA decomposition obtained with the MARS model; the last column gives the input variable(s) associated with each ANOVA function. Based on this information, the relative importance of the input factors for the MARS model is evaluated from the increase in the GCV and the STD values. All the conditioning factors contribute to the model; however, the slope angle is the most important one.
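In ARESLab, an ANOVA decomposition of this kind can reportedly be printed with the aresanova routine; the call below is a hedged sketch, and the exact signature is an assumption based on the toolbox documentation rather than on the paper.

% ANOVA decomposition of the fitted MARS model (sketch; signature assumed)
aresanova(model, Xtr', Ttr');   % reports STD, GCV contribution, and the variables
                                % involved in each ANOVA function (cf. Table 2)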

Table 2

ANOVA decomposition for the MARS model

Func.   STD     GCV     Basis   Params.   Variable(s)
1       0.5     2.208   1       2.5       Slope angle (β)
2       0.329   0.829   1       2.5       Slope height (H)
3       0.065   0.071   1       2.5       Pore pressure ratio (Ru)
4       0.081   0.085   1       2.5       Bulk weight (γ), cohesion (c)
5       0.216   0.319   1       2.5       Bulk weight (γ), slope height (H)
6       0.181   0.262   1       2.5       Cohesion (c), friction angle (α)
7       0.236   0.461   3       7.5       Cohesion (c), slope angle (β)
8       0.191   0.342   5       12.5      Cohesion (c), slope height (H)
9       0.408   0.872   5       12.5      Friction angle (α), pore pressure ratio (Ru)
10      0.187   0.211   2       5         Slope angle (β), slope height (H)
11      0.395   1.427   4       10        Slope angle (β), pore pressure ratio (Ru)

In the MARS model, the GCV and STD values of the slope angle are taken as the reference values. Figure 4 compares the relative importance of the input variables obtained from the BPNN and MARS models. In the BPNN model, the safety factor is most sensitive to the slope height, whereas in the MARS model the slope angle is the most sensitive parameter. The results indicate that the relative importance of the input variables obtained by different machine learning models may vary. Nevertheless, with either model these variables can predict the output value effectively.

Figure 4 Relative importance of the input variables based on the BPNN and MARS models.

5.3 Interpreted MARS model

Table 3 lists the BFs and their corresponding expressions for the developed MARS model. The table shows that interactions occur among the BFs (22 of the 25 BFs are interaction terms). The presence of interactions suggests that the built MARS model is not simply additive and that interactions play a significant role in building an accurate model for safety factor prediction. Without making any specific assumption about the underlying functional relationship between the input variables and the dependent response, MARS can capture the nonlinear and complex relationships between the safety factor and a multitude of geotechnical parameters that interact with one another. The equation of the MARS safety factor model is given by

y = 1.24 + 0.017 × BF1 + 0.0545 × BF2 − 0.62 × BF3 − 0.00017 × BF4 + 5.18 × BF5 + 0.0015 × BF6 − 0.16 × BF7 + 0.23 × BF8 − 0.33 × BF9 − 6.87e−05 × BF10 − 6.45e−05 × BF11 − 0.00043 × BF12 − 0.30 × BF13 + 0.23 × BF14 − 0.0010 × BF15 + 3.83 × BF16 + 0.0021 × BF17 + 0.23 × BF18 + 0.0011 × BF19 − 0.00024 × BF20 + 0.00088 × BF21 − 0.107 × BF22 − 0.0027 × BF23 − 0.0012 × BF24 − 0.0032 × BF25

In this section, the results of slope stability assessment based on BPNN and MARS are compared. Compared with BPNN, the MARS model is easier to interpret. Although the precision of the MARS model is slightly lower than that of the BPNN model, it requires less time to train and to obtain a predicted safety factor. In the monitoring and early warning of slope stability, computation time is an important practical consideration. In addition, the MARS model can capture the intrinsic nonlinear and multidimensional relationship between the safety factor and the slope parameters.
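Because the fitted model is an explicit equation, it can be evaluated directly for a new slope by computing the BFs of Table 3 from the input vector x = [γ, c, α, β, H, Ru] (so x(1)–x(6) correspond to x1–x6) and substituting them into the equation above. The MATLAB sketch below writes out only the first few BFs and a partial sum; the slope values are hypothetical, and the remaining terms from Table 3 must be added before the result can be read as a predicted safety factor.

% Evaluating the interpreted MARS model for a hypothetical slope (partial sketch)
x = [20.0, 25.0, 30.0, 40.0, 60.0, 0.30];        % [gamma, c, alpha, beta, H, Ru]

BF1 = max(0, 50   - x(5));                       % BFs taken from Table 3
BF2 = max(0, 45.3 - x(4));
BF3 = max(0, x(4) - 45.3) * max(0, 0.25 - x(6));
BF4 = max(0, x(5) - 50)   * max(0, x(4) - 40.02);
BF5 = max(0, x(6) - 0.45);
% ... BF6 to BF25 are built in exactly the same way from Table 3 ...

% partial sum of the fitted equation (remaining terms omitted here)
y_partial = 1.24 + 0.017*BF1 + 0.0545*BF2 - 0.62*BF3 - 0.00017*BF4 + 5.18*BF5;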

Table 3

Expressions of BFs for the developed MARS model

BF      Equation
BF1     max(0, 50 − x5)
BF2     max(0, 45.3 − x4)
BF3     max(0, x4 − 45.3) * max(0, 0.25 − x6)
BF4     max(0, x5 − 50) * max(0, x4 − 40.02)
BF5     max(0, x6 − 0.45)
BF6     max(0, 45.02 − x2) * max(0, x1 − 20.41)
BF7     max(0, 0.45 − x6) * max(0, 50 − x4)
BF8     max(0, 0.45 − x6) * max(0, x3 − 22.01)
BF9     max(0, 0.45 − x6) * max(0, 22.01 − x3)
BF10    max(0, x2 − 45.02) * max(0, x5 − 120)
BF11    max(0, x2 − 45.02) * max(0, 120 − x5)
BF12    BF2 * max(0, x5 − 115)
BF13    BF2 * max(0, x6 − 0.25)
BF14    BF2 * max(0, 0.25 − x6)
BF15    max(0, 45.02 − x2) * max(0, x5 − 37)
BF16    BF5 * max(0, x3 − 36)
BF17    max(0, 45.02 − x2) * max(0, x4 − 40)
BF18    max(0, 0.45 − x6) * max(0, 26.6 − x3)
BF19    max(0, 45.02 − x2) * max(0, x5 − 50)
BF20    max(0, 45.02 − x2) * max(0, 50 − x5)
BF21    max(0, x3 − 9.99) * max(0, 24.9 − x2)
BF22    max(0, x3 − 9.99) * max(0, 0.25 − x6)
BF23    BF1 * max(0, x1 − 16.47)
BF24    max(0, 45.02 − x2) * max(0, x4 − 22)
BF25    max(0, 45.02 − x2) * max(0, 22 − x4)

6 Conclusions

This study explores the use of MARS to capture the intrinsic nonlinear and multidimensional relationship among the parameters associated with the evaluation of slope stability. A comparative study of machine learning solutions for slope stability assessment based on BPNN and MARS was conducted, using one data set of actual slope collapse events for model development and performance comparison. The results suggest that both the BPNN and MARS models can model the relationship between the safety factor and the slope parameters, and that the MARS model has the advantages of computational efficiency and easy interpretation. The relative importance of the input variables obtained by different machine learning models may vary; nevertheless, with either model these variables can predict the output value effectively.

Acknowledgments

The authors thank Managing Editor Dr Jan Barabach and four reviewers for constructive comments and suggestions, which helped to improve the article. This study was supported by the Science and Technology Research Program of Chongqing Municipal Education Commission (Grant No. KJQN201800115).

Appendix: dataset used in this study

Table A1

Dataset used in this study (from refs. [35–39,41])

No.Unit weight/γ (kN/m3)Cohesion/c (kPa)Friction angle/α (°)Slope angle/β (°)Slope height/H (m)Pore pressure ratio/RuSafety factor /FS
118.6826.3415358.2301.11
216.511.490303.6601
318.8414.36252030.501.875
418.8457.46202030.502.045
528.4429.42353510001.78
628.4439.23383510001.99
720.616.2826.5304001.25
814.8017205001.13
91411.9726308801.02
1025120455312001.3
1126150.05455020001.2
1218.525030601.09
1318.512030600.78
1422.41035301002
1521.41030.34302001.7
16222036455001.02
1722036455000.89
181203035401.46
191203045800.8
201203035401.44
211203045801.86
2223.470323721401.08
231670204011501.1
2420.4124.9132210.670.351.4
2519.6311.97202212.190.411.35
2621.828.62322812.80.491.03
2720.4133.52111645.720.21.28
2818.8415.32302510.670.381.63
2918.84020207.620.451.05
3021.4302020610.51.03
3119.0611.712835210.111.09
3218.8414.36252030.50.451.11
3321.516.94303176.810.381.01
341411.972630880.450.625
35182430.1545200.121.12
3623020201000.31.2
3722.41004545150.251.8
3822.4103545100.40.9
3920203645500.250.96
4020203645500.50.83
412003645500.250.79
422003645500.50.67
43220403380.351.45
44240403380.31.58
4520024.52080.351.37
46185302080.32.05
4726.491503345730.151.23
4826.715033501300.251.8
4926.8915033521200.251.8
5026.5730038.745.3800.151.18
5126.7830038.7541550.251.2
5226.8120035581380.251.2
5326.435026.64092.20.151.25
5426.75026.6501700.251.25
5526.86028.8591080.251.25
5622.4103545100.40.9
5720203645500.50.83
582003645500.250.79
592003645500.50.67
60220403380.351.45
61240403380.31.58
6220024.52080.351.37
63185302080.32.05
64274035434200.251.15
65275040424070.251.44
66273535423590.251.27
672737.53537.83200.251.24
6827323342.63010.251.16
6927323342.42890.251.3
7027.31431411100.251.249
7127.331.529.7411350.251.245
7227.316.8285090.50.251.252
7327.3263150920.251.246
7427.31039415110.251.434
7527.31039404700.251.418
76254635474430.251.28
77254635444350.251.37
78254635464320.251.23
792615045302000.251.2
8018.52503060.251.09
8118.51203060.250.78
8222.4103530100.252
8321.41030.3430200.251.7
8422203645500.251.02
852203645500.250.89
86120304540.251.46
87120304580.250.8
88120304540.251.44
8931.3683749200.50.251.2
9020203645500.250.96
9127403547.12920.251.15
92254635502840.251.34
9331.36837463660.251.2
9425463644.52990.251.55
9527.31039404800.251.45
96254635463930.251.31
97254840493300.251.49
9831.368.637473050.251.2
9925553645.52990.251.52
10031.36837472130.251.2
10128.4150.0545532000.52.31
10218.6626.4114.9934.988.201.11
10328.429.4135.0134.9810001.78
10425.96150.054549.9820001.2
10518.4625.06030601.09
10621.3610.0530.33302001.7
10715.9970.0719.9840.0211501.11
10820.3924.9113.012210.60.351.4
10919.61219.982212.20.411.35
11021.788.553227.9812.80.491.03
11120.3933.4610.9816.0145.80.21.28
11219.0311.727.9934.98210.111.09
11317.984.9530.0219.9880.32.05
11420.9619.9640.0140.021201.84
11520.9634.9627.9940.02120.51.43
11619.9710.0528.9834.0360.31.34
11718.7730.019.9925.02500.11.4
11818.7730.0119.9830500.11.46
11918.7725.0619.9830500.21.21
12020.5616.2126.51304001.25
12116.4711.550303.601
12218.814.425.0219.9830.601.88
12318.857.4719.9819.9830.602.04
12428.439.1637.9834.9810001.99
12513.971226.01308801.02
12624.96120.04455312001.3
12718.4612030600.78
12822.3810.0535.01301002
12921.9819.9636455001.02
13018.815.3130.0225.0210.60.381.63
13118.814.425.0219.9830.60.451.11
13221.476.930.0231.0176.80.381.01
13313.971226.0130880.450.63
13417.9824.0130.1545200.121.12
13522.3899.934545150.251.8
13622.3810.0535.0145100.40.9
13719.9719.963645500.250.96
13819.9719.963645500.50.83
13920.9645.0225.0249.03120.31.53
14020.9630.0135.0140.02120.41.49
14119.9740.0630.0230150.31.84
14217.9845.0225.0225.02140.32.09
14318.9730.0135.0134.98110.22
14419.9740.0640.0140.02100.22.31
14518.8324.7621.2929.2370.51.07
14618.8310.3521.2934.03370.31.29
14718.7725.069.9925.02500.21.18
14818.7719.969.9925.02500.30.97
14919.0810.059.9925.02500.40.65
15018.7719.9619.9830500.31
15119.0810.0519.9830500.40.65
15221.9819.9622.0119.9818001.12
15321.9819.9622.0119.981800.10.99

References

[1] Peruccacci S, Brunetti MT, Gariano SL, Melillo M, Rossi M, Guzzetti F. Rainfall thresholds for possible landslide occurrence in Italy. Geomorphology. 2017;290:39–57. 10.1016/j.geomorph.2017.03.031.

[2] Wu DA, Bai JB, Wang XY, Yan S, Wu SX. Numerical study of failure mechanisms and control techniques for a gob-side yield pillar in the Sijiazhuang Coal Mine, China. Rock Mech Rock Eng. 2019a;52(4):1231–45. 10.1007/s00603-018-1654-3.

[3] Wu DA, Bai JB, Wang XY, Zhu ZJ, Yan S. Field investigation of fractures evolution in overlying strata caused by extraction of the Jurassic and Carboniferous coal seams and its application: case study. Int J Coal Geol. 2019b;208:12–23. 10.1016/j.coal.2019.04.002.

[4] Massey C, Pasqua FD, Holden C, Kaiser A, Richards L, Wartman J, et al. Rock slope response to strong earthquake shaking. Landslides. 2016;14(1):1–20. 10.1007/s10346-016-0684-8.

[5] Sun Y, Jiang Q, Yin T, Zhou C. A back-analysis method using an intelligent multi-objective optimization for predicting slope deformation induced by excavation. Eng Geol. 2018;239:214–28. 10.1016/j.enggeo.2018.03.019.

[6] Li Q, Wang YM, Zhang KB, Yu H, Tao ZY. Field investigation and numerical study of a siltstone slope instability induced by excavation and rainfall. Landslides. 2020;17(6):1485–99. 10.1007/s10346-020-01396-5.

[7] Zienkiewicz OC, Humpheson C, Lewis RW. Associated and non-associated visco-plasticity and plasticity in soil mechanics. Geotechnique. 1975;25(4):671–89. 10.1680/geot.1975.25.4.671.

[8] Griffiths DV, Lane PA. Slope stability analysis by finite elements. Geotechnique. 1999;49(3):387–403. 10.1680/geot.1999.49.3.387.

[9] Xu W, Wang S, Bilal M. LEM-DEM coupling for slope stability analysis. Sci China Technol Sci. 2020;63(2):329–40. 10.1007/s11431-018-9387-2.

[10] Tschuchnigg F, Schweiger HF, Sloan SW, Lyamin AV, Raissakis I. Comparison of finite-element limit analysis and strength reduction techniques. Geotechnique. 2015;65(4):249–57. 10.1680/geot.14.P.022.

[11] Reale C, Xue J, Gavin K. System reliability of slopes using multimodal optimisation. Geotechnique. 2016;66(5):413–23. 10.1680/jgeot.15.P.142.

[12] Song DQ, Chen Z, Chao H, Ke YT, Nie W. Numerical study on seismic response of a rock slope with discontinuities based on the time-frequency joint analysis method. Soil Dyn Earthq Eng. 2020;133:106112. 10.1016/j.soildyn.2020.106112.

[13] Jamalinia E, Vardon PJ, Steele-Dunne SC. The impact of evaporation induced cracks and precipitation on temporal slope stability. Comput Geotech. 2020;122:103506. 10.1016/j.compgeo.2020.103506.

[14] Lee S, Ryu JH, Won JS, Park HJ. Determination and application of the weights for landslide susceptibility mapping using an artificial neural network. Eng Geol. 2004;71(3–4):289–302. 10.1016/S0013-7952(03)00142-X.

[15] Wang HB, Xu WY, Xu RC. Slope stability evaluation using back propagation neural networks. Eng Geol. 2005;80(3–4):302–15. 10.1016/j.enggeo.2005.06.005.

[16] Choobbasti AJ, Farrokhzad F, Barari A. Prediction of slope stability using artificial neural network (case study: Noabad, Mazandaran, Iran). Arab J Geosci. 2009;2(4):311–9. 10.1007/s12517-009-0035-3.

[17] Conforti M, Pascale S, Robustelli G, Sdao F. Evaluation of prediction capability of the artificial neural networks for mapping landslide susceptibility in the Turbolo River catchment (northern Calabria, Italy). Catena. 2014;113:236–50. 10.1016/j.catena.2013.08.006.

[18] Gordan B, Armaghani DJ, Hajihassani M, Monjezi M. Prediction of seismic slope stability through combination of particle swarm optimization and neural network. Eng Comput. 2016;32(1):85–97. 10.1007/s00366-015-0400-7.

[19] Huang FM, Cao ZS, Guo JF, Jiang SH, Li S, Guo ZZ. Comparisons of heuristic, general statistical and machine learning models for landslide susceptibility prediction and mapping. Catena. 2020;191:104580. 10.1016/j.catena.2020.104580.

[20] He S, Pan P, Dai L, Wang H, Liu J. Application of kernel-based Fisher discriminant analysis to map landslide susceptibility in the Qinggan River delta, Three Gorges, China. Geomorphology. 2012;171:30–41. 10.1016/j.geomorph.2012.04.024.

[21] Ramos-Cañón AM, Prada-Sarmiento LF, Trujillo-Vela MG, Macías JP, Santos-r AC. Linear discriminant analysis to describe the relationship between rainfall and landslides in Bogotá, Colombia. Landslides. 2016;13(4):671–81. 10.1007/s10346-015-0593-2.

[22] Pham BT, Prakash I. Evaluation and comparison of LogitBoost Ensemble, Fisher's Linear Discriminant Analysis, logistic regression and support vector machines methods for landslide susceptibility mapping. Geocarto Int. 2019;34(3):316–33. 10.1080/10106049.2017.1404141.

[23] Yeon YK, Han JG, Ryu KH. Landslide susceptibility mapping in Injae, Korea, using a decision tree. Eng Geol. 2010;116(3–4):274–83. 10.1016/j.enggeo.2010.09.009.

[24] Bui DT, Pradhan B, Lofman O, Revhaug I. Landslide susceptibility assessment in Vietnam using support vector machines, decision tree, and Naive Bayes Models. Math Probl Eng. 2012;974638. 10.1155/2012/974638.

[25] Pradhan B. A comparative study on the predictive ability of the decision tree, support vector machine and neuro-fuzzy models in landslide susceptibility mapping using GIS. Comput Geosci. 2013;51:350–65. 10.1016/j.cageo.2012.08.023.

[26] Pourghasemi H, Pradhan B, Gokceoglu C, Moezzi KD. A comparative assessment of prediction capabilities of Dempster–Shafer and weights-of-evidence models in landslide susceptibility mapping using GIS. Geomat Nat Haz Risk. 2013;4(2):93–118. 10.1080/19475705.2012.662915.

[27] Friedman JH. Multivariate adaptive regression splines. Ann Stat. 1991;19(1):1–67. 10.1214/aos/1176347963.

[28] Kostić S, Vasović N, Todorović K, Samčović A. Application of artificial neural networks for slope stability analysis in geotechnical practice. In: 13th Symposium on Neural Networks and Applications (NEUREL), November 2016. IEEE; 2016. p. 1–6. 10.1109/NEUREL.2016.7800125.

[29] Wasserman PD. Neural computing: theory and practice. Van Nostrand Reinhold Co; 1989.

[30] Yu W, Cai JL, An FP. Slope displacement prediction model based on LMD and BP neural network. AMM. 2013;246:370–6. 10.4028/www.scientific.net/amm.246-247.370.

[31] Xu K, Guo Q, Li Z, Xiao J, Qin Y, Chen D, et al. Landslide susceptibility evaluation based on BPNN and GIS: A case of Guojiaba in the Three Gorges Reservoir Area. Int J Geogr Inf Sci. 2015;29(7):1111–24. 10.1080/13658816.2014.992436.

[32] Demuth H, Beale M. Matlab neural network toolbox user's guide version 6. The MathWorks Inc.; 2009.

[33] Chen HQ, Zeng ZG. Deformation prediction of landslide based on improved back-propagation neural network. Cogn Comput. 2013;5(1):56–62. 10.1007/s12559-012-9148-1.

[34] Erdik T, Pektas AO. Rock slope damage level prediction by using multivariate adaptive regression splines (MARS). Neural Comput Applic. 2019;31:2269–78. 10.1007/s00521-017-3186-2.

[35] Sah NK, Sheorey PR, Upadhyaya LN. Maximum likelihood estimation of slope stability. Int J Rock Mech Min Abstr. 1994;31(1):47–53. 10.1016/0148-9062(94)92314-0.

[36] Lu P, Rosenbaum MS. Artificial neural networks and grey systems for the prediction of slope stability. Nat Hazards. 2003;30(3):383–98. 10.1023/B:NHAZ.0000007168.00673.27.

[37] Sakellariou MG, Ferentinou MD. A study of slope stability prediction using neural networks. Geotech Geol Eng. 2005;23(4):419. 10.1007/s10706-004-8680-5.

[38] Li J, Wang F. Study on the forecasting models of slope stability under data mining. In: Earth and Space 2010: Engineering, Science, Construction, and Operations in Challenging Environments; 2010. p. 765–76. 10.1061/41096(366)77.

[39] Manouchehrian A, Gholamnejad J, Sharifzadeh M. Development of a model for analysis of slope stability for circular mode failure using genetic algorithm. Environ Earth Sci. 2014;71(3):1267–77. 10.1007/s12665-013-2576-8.

[40] Jekabsons G. ARESLab: Adaptive Regression Splines toolbox for Matlab/Octave. Ver. 1.13.0. Riga Technical University; 2016.

[41] Hoang ND, Pham AD. Hybrid artificial intelligence approach based on metaheuristic and machine learning for slope stability assessment: a multinational data analysis. Expert Syst Appl. 2016;46:60–8. 10.1016/j.eswa.2015.10.020.

Received: 2019-10-02
Revised: 2020-09-24
Accepted: 2020-09-30
Published Online: 2020-11-06

© 2020 Zhihao Liao and Zhiwei Liao, published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
