Prior Distribution Selection for a Mixture of Experts
Computational Mathematics and Mathematical Physics (IF 0.7), Pub Date: 2021-08-22, DOI: 10.1134/s0965542521070071
A. V. Grabovoy, V. V. Strijov

Abstract

The paper investigates the mixture of experts model. A mixture of experts is a combination of experts, i.e. local approximation models, and a gate function that weights these experts and forms their ensemble. In this work, each expert is a linear model, and the gate function is a neural network with a softmax on the last layer. The paper analyzes various prior distributions for each expert and proposes a method that takes into account the relationships between the prior distributions of different experts. An EM algorithm optimizes both the parameters of the local models and the parameters of the gate function. As an application, the paper solves a shape recognition problem on images: each expert fits one circle in an image and recovers its parameters, namely the coordinates of the center and the radius. The computational experiment tests the proposed method on synthetic and real data; the real data are human eye images from the iris detection problem.
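
As a rough illustration of the architecture summarized above (not the authors' implementation), the following minimal sketch assembles a mixture of linear experts with a softmax gate and fits it to synthetic 1-D data with an EM-style loop; the single-layer gate, the ridge term, the fixed noise variance, and all variable names are illustrative assumptions.

import numpy as np

# Minimal sketch of a mixture of linear experts with a softmax gate,
# trained by an EM-style loop on synthetic 1-D data.  The single-layer
# gate, the ridge term, and the fixed noise variance are simplifying
# assumptions for illustration only.

rng = np.random.default_rng(0)

# Synthetic data drawn from two linear regimes.
x = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.where(x[:, 0] < 0.0, 2.0 * x[:, 0] + 1.0, -3.0 * x[:, 0])
y += 0.05 * rng.standard_normal(200)

K = 2                                        # number of experts
X = np.hstack([x, np.ones((len(x), 1))])     # design matrix with bias column
W = rng.standard_normal((K, X.shape[1]))     # expert weights (linear models)
V = rng.standard_normal((K, X.shape[1]))     # gate weights (softmax over experts)
sigma2 = 0.1                                 # fixed noise variance

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for _ in range(100):
    # E-step: responsibility of each expert for each data point.
    gate = softmax(X @ V.T)                               # (n, K)
    pred = X @ W.T                                        # (n, K)
    lik = np.exp(-0.5 * (y[:, None] - pred) ** 2 / sigma2)
    resp = gate * lik
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step for the experts: weighted (ridge-regularized) least squares.
    for k in range(K):
        XR = X * resp[:, k:k + 1]
        W[k] = np.linalg.solve(XR.T @ X + 1e-6 * np.eye(X.shape[1]), XR.T @ y)

    # M-step for the gate: one gradient step toward the responsibilities.
    V += 0.5 * (resp - gate).T @ X / len(x)

print("expert weights (slope, intercept):")
print(W)

In the paper itself the gate is a neural network with a softmax output layer and the experts recover circle parameters (center coordinates and radius) rather than 1-D regression coefficients; the sketch only mirrors the overall EM structure and the softmax weighting of experts.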



Updated: 2021-08-23