Academy of Mathematics and Systems Science, CAS Colloquia & Seminars

Speaker: 雷金龙, Research Fellow, Tongji University
Title: Distributed Variable Sample-size Stochastic Optimization with Fixed Step-sizes
Time & Venue: 2021.12.16 15:00-15:30, Tencent Meeting: 620335749

Abstract: This talk introduces distributed stochastic optimization, in which agents collaboratively minimize the average of their local expectation-valued convex cost functions. Because of gradient noise and the distributed nature of the local functions, fast-convergent distributed algorithms with fixed step-sizes have not yet been achieved. This work incorporates a variance-reduction scheme into the distributed gradient tracking algorithm, where each local gradient is estimated by averaging a variable-size batch of sampled gradients. For convex cost functions, we provide a sufficient condition under which all agents' iterates converge almost surely to a common optimal solution with fixed step-sizes. When the global cost function is strongly convex and the sampling batch size grows at a geometric rate, we prove a geometric convergence rate and establish the iteration, oracle, and communication complexity of obtaining an optimal solution. The algorithm's performance, including its convergence rate and complexity, is further analyzed for a constant batch size and for a polynomially increasing batch size, respectively. The results reveal a trade-off in distributed stochastic optimization: the network communication burden can be relieved by exploiting local computation resources.
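The update the abstract describes, gradient tracking combined with a growing sampling batch, can be illustrated with a short numerical sketch. Below is a minimal NumPy version assuming quadratic local costs with additive Gaussian gradient noise, a ring mixing matrix, and illustrative values for the fixed step-size `alpha` and the geometric batch growth rate `q`; none of these specifics are taken from the talk.

```python
# A minimal sketch of variance-reduced distributed gradient tracking,
# assuming local costs f_i(x) = 0.5*||A_i x - b_i||^2 whose sampled
# gradients are the exact gradient plus zero-mean noise. The problem
# data, mixing matrix, step-size, and batch growth rate are all
# illustrative choices, not values from the talk.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim = 4, 3

# Local data for agent i: f_i(x) = 0.5*||A_i x - b_i||^2.
A = [rng.standard_normal((5, dim)) for _ in range(n_agents)]
b = [rng.standard_normal(5) for _ in range(n_agents)]

def sampled_grad(i, x, batch):
    """Average a `batch` of noisy gradient samples of f_i at x;
    the noise standard deviation shrinks like 1/sqrt(batch)."""
    g = A[i].T @ (A[i] @ x - b[i])
    noise = rng.standard_normal((batch, dim)).mean(axis=0)
    return g + noise

# Doubly stochastic mixing matrix for a ring of 4 agents.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

alpha, q = 0.05, 1.1           # fixed step-size; geometric batch growth
x = np.zeros((n_agents, dim))  # agents' iterates, one row per agent
g_old = np.array([sampled_grad(i, x[i], 1) for i in range(n_agents)])
y = g_old.copy()               # gradient trackers, initialized at g_0

for k in range(100):
    batch = int(np.ceil(q ** k))             # variable sample size N_k
    x = W @ x - alpha * y                    # consensus + descent step
    g_new = np.array([sampled_grad(i, x[i], batch)
                      for i in range(n_agents)])
    y = W @ y + g_new - g_old                # track the average gradient
    g_old = g_new

# All agents should (approximately) agree on the minimizer of the sum.
x_star = np.linalg.solve(sum(Ai.T @ Ai for Ai in A),
                         sum(Ai.T @ bi for Ai, bi in zip(A, b)))
print("consensus error: ", np.max(np.abs(x - x.mean(axis=0))))
print("optimality error:", np.linalg.norm(x.mean(axis=0) - x_star))
```

In this sketch the geometric batch growth is what lets the fixed step-size work: the variance of each agent's gradient estimate decays with the batch size, so the tracker `y` approaches the exact average gradient without any step-size decay, mirroring the trade-off noted in the abstract between communication rounds and local sampling effort.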