A distributed stochastic optimization algorithm with gradient-tracking and distributed heavy-ball acceleration
【Abstract】 Distributed optimization has developed rapidly in recent years due to its wide applications in machine learning and signal processing. In this paper, we investigate distributed optimization for minimizing a global objective, defined as the sum of smooth and strongly convex local cost functions distributed over an undirected network of n nodes. In contrast to existing works, we apply a distributed heavy-ball term to improve the convergence performance of the proposed algorithm. To accelerate the convergence of existing distributed stochastic first-order gradient methods, the momentum term is combined with a gradient-tracking technique. Simulation results show that the proposed algorithm converges faster than GT-SAGA without increasing the complexity. Numerical experiments on real-world datasets verify the effectiveness and correctness of the proposed algorithm.
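The abstract describes combining a heavy-ball (momentum) term with gradient tracking over an undirected network. The paper's exact algorithm, and its SAGA-style stochastic gradients, are not reproduced in this record; the following is a minimal sketch of a generic gradient-tracking update with a heavy-ball term on a toy least-squares problem, with full local gradients and Metropolis-style mixing weights as assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 3                      # 4 nodes, 3-dimensional decision variable

# Local costs f_i(x) = 0.5 * ||A_i x - b_i||^2 (smooth and strongly convex
# for these A_i); the global objective is the sum over all nodes.
A = [rng.standard_normal((5, d)) + np.eye(5, d) for _ in range(n)]
b = [rng.standard_normal(5) for _ in range(n)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a ring network (assumed topology)
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i + 1) % n] = W[i, (i - 1) % n] = 1 / 3

alpha, beta = 0.01, 0.3          # step size and heavy-ball momentum (tuned by hand)
x = np.zeros((n, d))
x_prev = x.copy()
y = np.array([grad(i, x[i]) for i in range(n)])   # gradient trackers
g_prev = y.copy()

for _ in range(5000):
    # Consensus step + tracked-gradient descent + heavy-ball momentum
    x_new = W @ x - alpha * y + beta * (x - x_prev)
    g_new = np.array([grad(i, x_new[i]) for i in range(n)])
    # Gradient-tracking update: y_i asymptotically tracks the average gradient
    y = W @ y + g_new - g_prev
    x_prev, x, g_prev = x, x_new, g_new

# Compare against the centralized minimizer of the summed objective
A_all, b_all = np.vstack(A), np.concatenate(b)
x_star = np.linalg.lstsq(A_all, b_all, rcond=None)[0]
err = max(np.linalg.norm(x[i] - x_star) for i in range(n))
print(err)
```

With a sufficiently small step size, every node's iterate reaches consensus on the global minimizer; the momentum term is what the abstract credits for the acceleration over plain gradient tracking.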
【Key words】 Distributed optimization; High-performance algorithm; Multi-agent system; Machine-learning problem; Stochastic gradient
- 【Source】 Frontiers of Information Technology & Electronic Engineering, 2021, No. 11
- 【CLC number】 TP18
- 【Downloads】 66