A tuned feed-forward deep neural network algorithm for effort estimation
Journal of Experimental & Theoretical Artificial Intelligence (IF 2.2) Pub Date: 2021-02-05, DOI: 10.1080/0952813x.2021.1871664
Muhammed Maruf Öztürk

ABSTRACT

Software effort estimation (SEE) is a software engineering problem that requires robust predictive models. Building such models means searching for the most suitable hyperparameter configuration of the underlying regression methods. Although a few works on hyperparameter optimisation (HO) for SEE exist, no comprehensive study has yet covered deep learning models. In this study, a feed-forward deep neural network algorithm (FFDNN) is proposed for software effort estimation. The algorithm relies on a binary-search-based method for finding hyperparameters. In an experiment using two performance parameters, FFDNN outperforms five comparison algorithms. The results of the study suggest that: 1) employing traditional methods such as grid and random search increases tuning time remarkably; instead, parameter search methods tailored to the structure of the regression method should be developed; 2) SEE performance improves when the hyperparameter search method is devised according to the essentials of the chosen deep learning approach; 3) deep learning models achieve competitive CPU time compared with tree-based regression methods such as CART_DE8.
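The abstract names the binary-search-based hyperparameter search but does not detail it. The snippet below is a minimal illustrative sketch of the idea, not the paper's implementation: it assumes the validation error is roughly unimodal in a single hyperparameter (the hidden-layer width) and narrows the search interval by repeated halving, with scikit-learn's MLPRegressor standing in for the feed-forward network. The synthetic dataset and the helpers validation_error and binary_search_width are hypothetical.

```python
# Illustrative sketch only: assumes a roughly unimodal validation error
# over one integer hyperparameter (hidden-layer width) and shrinks the
# interval by comparing two interior probes, in the spirit of a
# binary-search-style scan. Not the paper's actual procedure.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

def validation_error(width, X_tr, y_tr, X_val, y_val):
    """Train a feed-forward net with the given hidden width; return MAE."""
    model = MLPRegressor(hidden_layer_sizes=(width,), max_iter=500,
                         random_state=0)
    model.fit(X_tr, y_tr)
    return mean_absolute_error(y_val, model.predict(X_val))

def binary_search_width(lo, hi, X_tr, y_tr, X_val, y_val):
    """Halve the interval [lo, hi] toward the width with lowest
    validation MAE, assuming the error curve is roughly unimodal."""
    while hi - lo > 2:
        m1 = lo + (hi - lo) // 3   # two interior probe points
        m2 = hi - (hi - lo) // 3
        e1 = validation_error(m1, X_tr, y_tr, X_val, y_val)
        e2 = validation_error(m2, X_tr, y_tr, X_val, y_val)
        if e1 < e2:
            hi = m2                # minimum lies in [lo, m2]
        else:
            lo = m1                # minimum lies in [m1, hi]
    # Exhaustively check the few remaining candidates.
    return min(range(lo, hi + 1),
               key=lambda w: validation_error(w, X_tr, y_tr, X_val, y_val))

# Synthetic stand-in for an effort-estimation dataset (features -> effort).
X, y = make_regression(n_samples=200, n_features=8, noise=0.1, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3,
                                            random_state=0)
best = binary_search_width(4, 64, X_tr, y_tr, X_val, y_val)
print("selected hidden width:", best)
```

Relative to grid or random search, an interval-halving scheme of this kind evaluates O(log n) candidate configurations rather than O(n), which is consistent with the abstract's claim that traditional search methods increase tuning time remarkably.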




Updated: 2021-02-05