SE3M: A Model for Software Effort Estimation Using Pre-trained Embedding Models
arXiv - CS - Software Engineering Pub Date : 2020-06-30 , DOI: arxiv-2006.16831
Eliane M. De Bortoli Fávero, Dalcimar Casanova, and Andrey Ricardo Pimentel

Estimating effort from requirement texts presents many challenges, especially in obtaining viable features from which to infer effort. Aiming to explore a more effective technique for representing textual requirements in order to infer effort estimates by analogy, this paper evaluates the effectiveness of pre-trained embedding models. Two embedding approaches are used: context-less and contextualized models. Generic pre-trained models for both approaches went through a fine-tuning process. The generated models were then used as input to a deep learning architecture with a linear output. The results are very promising, showing that pre-trained embedding models can be used to estimate software effort based only on requirement texts. We highlight the results obtained by applying the pre-trained BERT model with fine-tuning on a single project repository: a Mean Absolute Error (MAE) of 4.25 with a standard deviation of only 0.17, a very positive result compared to similar works. The main advantages of the proposed estimation method are its reliability, potential for generalization, the speed and low computational cost afforded by the fine-tuning process, and the ability to infer effort for both new and existing requirements.
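The pipeline the abstract describes — requirement texts encoded by a pre-trained embedding model, then fed to a network with a linear output trained to predict effort — can be sketched as follows. This is a minimal illustration, not the authors' code: the 768-dimensional vectors stand in for fine-tuned BERT sentence embeddings, the effort targets are synthetic, and the regression head is reduced to a single linear layer trained by gradient descent on mean squared error, with MAE reported as in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for requirement-text embeddings: in the paper these would come
# from a fine-tuned embedding model (e.g. BERT, 768-dim vectors per text).
n_reqs, dim = 200, 768
X = rng.normal(size=(n_reqs, dim))

# Synthetic effort values (hypothetical; real targets would be recorded effort)
true_w = rng.normal(size=dim) / np.sqrt(dim)
y = X @ true_w + rng.normal(scale=0.1, size=n_reqs)

# Linear-output regression head trained with gradient descent on MSE
w = np.zeros(dim)
lr = 0.01
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / n_reqs
    w -= lr * grad

# Evaluate with Mean Absolute Error, the metric used in the paper
mae = float(np.abs(X @ w - y).mean())
print(f"training MAE: {mae:.4f}")
```

In the actual study the head sits on top of embeddings produced by fine-tuned context-less and contextualized models; only the linear-output regression idea is reproduced here.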

Updated: 2020-07-01