Tangent-cut optimizer on gradient descent: an approach towards Hybrid Heuristics
Artificial Intelligence Review (IF 12.0) · Pub Date: 2021-03-17 · DOI: 10.1007/s10462-021-09984-0
Saptarshi Biswas, Subhrapratim Nath, Sumagna Dey, Utsha Majumdar

Artificial Intelligence systems have been in widespread use for a long time, and most problems are now shifting from purely logical solutions into statistical domains. This shift calls for machine learning algorithms that mine useful information from statistical datasets, which in turn demands high-end computing. Machine learning algorithms generally use Gradient Descent as a tool to find optimal solutions to computationally expensive problems. This gave rise to optimization algorithms such as Momentum, RMSProp and Adam, which speed up convergence to the global optimum while also improving learning accuracy. However, supervised machine learning models have become increasingly data intensive, raising their computational cost and calling the efficiency of these algorithms into question. In this context, a new optimization algorithm, the Tangent-Cut Optimizer (TC-Opt), is proposed, which converges faster than the traditional optimization algorithms for supervised machine learning models. Furthermore, the proposed work brings forward a phenomenon that intertwines the statistical and logical decision-making models into a single unit, while shedding light on a new heuristic approach named "Hybrid Heuristics". The proposed algorithm has been implemented on the standard Boston House Pricing dataset for linear regression and on the MNIST dataset of handwritten digits (0 to 9) for logistic regression, and its performance has been compared with existing algorithms. Finally, the robustness and high accuracy of the proposed optimization algorithm are demonstrated in the presented results.
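Since the abstract names plain gradient descent and adaptive optimizers such as Adam as the baselines TC-Opt is compared against, the minimal Python sketch below illustrates those baseline update rules on a toy linear-regression loss. The TC-Opt update itself is not described in the abstract and is not reproduced here; the synthetic data, function names, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Baseline optimizers referenced in the abstract: plain gradient descent and Adam,
# applied to a toy linear-regression (MSE) objective. Illustrative sketch only;
# TC-Opt is not specified in the abstract and is therefore not shown.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 5))                 # synthetic features (assumption)
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=256)   # noisy targets

def mse_grad(w):
    """Gradient of the mean-squared-error loss for linear regression."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def gradient_descent(w, lr=0.1, steps=500):
    """Vanilla gradient descent: step against the gradient at a fixed rate."""
    for _ in range(steps):
        w = w - lr * mse_grad(w)
    return w

def adam(w, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    """Adam: per-parameter steps scaled by bias-corrected moment estimates."""
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = mse_grad(w)
        m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w0 = np.zeros(5)
print("GD error:  ", np.linalg.norm(gradient_descent(w0) - w_true))
print("Adam error:", np.linalg.norm(adam(w0) - w_true))
```

Running the sketch shows both baselines recovering the toy weights; in the paper, these are the kinds of optimizers against which TC-Opt's convergence speed is benchmarked on the Boston House Pricing and MNIST tasks.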




Updated: 2021-03-18