A Learning-Aided Flexible Gradient Descent Approach to MISO Beamforming
IEEE Wireless Communications Letters (IF 4.6), Pub Date: 2022-06-24, DOI: 10.1109/lwc.2022.3186160
Zhixiong Yang, Jing-Yuan Xia, Junshan Luo, Shuanghui Zhang, Deniz Gunduz

This letter proposes a learning-aided gradient descent (LAGD) algorithm to solve the weighted sum rate (WSR) maximization problem for multiple-input single-output (MISO) beamforming. The proposed LAGD algorithm directly optimizes the transmit precoder through implicit gradient-descent-based iterations, at each of which the optimization strategy is determined by a neural network and is thus dynamic and adaptive. For each instance of the problem, this network is initialized randomly and updated throughout the iterative solution process. Therefore, the LAGD algorithm can be applied at any signal-to-noise ratio (SNR) and for arbitrary antenna/user numbers, and does not require labelled data or training prior to deployment. Numerical results show that the LAGD algorithm outperforms the well-known WMMSE algorithm as well as other learning-based solutions, with modest computational complexity. Our code is available at https://github.com/XiaGroup/LAGD .
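To make the optimization problem concrete, the following is a minimal NumPy sketch of the WSR objective and the classical projected gradient ascent baseline that LAGD builds on. All parameter names and values here are illustrative assumptions, not the authors' implementation; LAGD replaces the fixed update rule below with one produced by a per-instance neural network trained online (see the linked repository for the actual method).

```python
import numpy as np

def wsr(W, H, sigma2, alpha):
    # Weighted sum rate for a MISO downlink: user k receives signal power
    # |h_k^H w_k|^2 and interference sum_{j != k} |h_k^H w_j|^2.
    K = H.shape[0]
    total = 0.0
    for k in range(K):
        gains = np.abs(H[k].conj() @ W) ** 2          # |h_k^H w_j|^2 for all j
        sinr = gains[k] / (sigma2 + gains.sum() - gains[k])
        total += alpha[k] * np.log2(1.0 + sinr)
    return total

def project_power(W, P):
    # Scale the precoder back onto the total power constraint ||W||_F^2 <= P.
    norm2 = np.linalg.norm(W) ** 2
    return W * np.sqrt(P / norm2) if norm2 > P else W

def pga_beamforming(H, P=1.0, sigma2=1.0, alpha=None, lr=0.05, iters=200):
    # Plain projected gradient ascent on the WSR objective, using a
    # finite-difference gradient over the real and imaginary parts.
    # LAGD would replace this fixed ascent step with a learned update.
    K, N = H.shape
    alpha = np.ones(K) if alpha is None else alpha
    rng = np.random.default_rng(0)
    W = project_power(rng.standard_normal((N, K))
                      + 1j * rng.standard_normal((N, K)), P)
    eps = 1e-6
    for _ in range(iters):
        f0 = wsr(W, H, sigma2, alpha)
        G = np.zeros_like(W)
        for idx in np.ndindex(W.shape):
            for step in (eps, 1j * eps):              # perturb Re and Im parts
                Wp = W.copy()
                Wp[idx] += step
                G[idx] += (wsr(Wp, H, sigma2, alpha) - f0) / eps * (step / eps)
        W = project_power(W + lr * G, P)
    return W

# Toy instance (assumed sizes): 4 transmit antennas, 2 single-antenna users.
rng = np.random.default_rng(1)
H = (rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))) / np.sqrt(2)
W = pga_beamforming(H)
```

This baseline applies the same hand-crafted step at every iteration; the letter's point is that letting a randomly initialized network choose the step per iteration, and updating it during the solve, adapts the strategy to the specific channel instance without any offline training.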

Updated: 2024-08-26