An Accelerated Stochastic Trust Region Method for Stochastic Optimization
DOI:
https://doi.org/10.4208/jcm.2504-m2023-0228
Keywords:
Stochastic optimization, Stochastic variance reduced gradient, Trust region, Gradient descent method, Machine learning.
Abstract
In this paper, we propose an accelerated stochastic variance reduced gradient method within a trust-region-like framework, referred to as the NMSVRG-TR method. Building on NMSVRG, we incorporate a Katyusha-like acceleration step into the stochastic trust region scheme, which improves the convergence rate over standard SVRG methods. Under appropriate assumptions, we establish linear convergence of the algorithm for strongly convex objective functions. Numerical experiments show that our algorithm generally outperforms several existing stochastic gradient methods.
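For intuition, the sketch below illustrates the kind of update the abstract describes: an SVRG-style variance-reduced gradient estimate combined with a Katyusha-like momentum coupling. This is an illustrative assumption, not the paper's NMSVRG-TR algorithm; the trust-region acceptance test is omitted, and the function name and parameters (`grad_i`, `eta`, `tau`, `m`) are hypothetical.

```python
import numpy as np

def svrg_katyusha_sketch(grad_i, w0, n, eta=0.1, tau=0.5, epochs=10, m=None):
    """Hypothetical sketch of SVRG with a Katyusha-like coupling.

    Not the paper's NMSVRG-TR method: the trust-region safeguard on
    each step is omitted here for brevity.

    grad_i(w, i): stochastic gradient of the i-th component at w.
    """
    m = m or n
    w_snap = w0.copy()   # snapshot point used for variance reduction
    w = w0.copy()        # main iterate
    z = w0.copy()        # auxiliary momentum iterate
    for _ in range(epochs):
        # Full gradient at the snapshot (the variance-reduction anchor).
        mu = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(m):
            # Katyusha-like coupling: query point mixes momentum iterate
            # and snapshot, providing "negative momentum" toward w_snap.
            y = tau * z + (1 - tau) * w_snap
            i = np.random.randint(n)
            # Variance-reduced stochastic gradient at the query point.
            g = grad_i(y, i) - grad_i(w_snap, i) + mu
            z = z - (eta / tau) * g   # momentum iterate step
            w = y - eta * g           # gradient-style step
        w_snap = w.copy()             # refresh snapshot each epoch
    return w
```

In the method the abstract describes, each candidate step would additionally be accepted or rejected through a trust-region-style test rather than taken unconditionally as above.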
Published
2025-09-28