Article ID: 715495
Journal: IFAC Proceedings Volumes
Published Year: 2014
Pages: 6
File Type: PDF
Abstract

We introduce Secant-Tangents AveRaged (STAR) Stochastic Approximation (SA), a new SA algorithm that estimates the gradient with a hybrid estimator: a convex combination of a symmetric finite difference estimator and an average of two direct gradient estimators. For the deterministic weight sequence that minimizes the variance of the STAR gradient estimator, we prove that for quadratic functions, the mean squared error (MSE) of the STAR-SA algorithm using this weight sequence is strictly less than that of the classical SA methods of Robbins-Monro (RM) and Kiefer-Wolfowitz (KW). We also prove convergence of the STAR-SA algorithm for general concave functions. Furthermore, we illustrate its effectiveness through numerical experiments, comparing the MSE of the STAR-SA algorithm against RM and KW on simple quadratic functions with varying steepness and noise levels.
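The abstract's hybrid estimator can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`star_gradient`, `f`, `grad`) and the form of the combination (a weight w on the symmetric finite-difference "secant" term and 1 - w on the averaged direct-gradient "tangents" term) are assumptions based on the description above.

```python
def star_gradient(f, grad, x, c, w):
    """Hybrid STAR-style gradient estimate at x (assumed form).

    f, grad : (possibly noisy) function-value and direct-gradient oracles
    c       : finite-difference perturbation size
    w       : convex-combination weight in [0, 1]
    """
    secant = (f(x + c) - f(x - c)) / (2.0 * c)    # KW-style symmetric finite difference
    tangents = (grad(x + c) + grad(x - c)) / 2.0  # average of two RM-style direct estimates
    return w * secant + (1.0 - w) * tangents

if __name__ == "__main__":
    # Noiseless concave quadratic, maximizer at x = 1. For a quadratic both
    # component estimators are exact, so the combined estimate matches the
    # true gradient for any weight w.
    f = lambda x: -(x - 1.0) ** 2
    grad = lambda x: -2.0 * (x - 1.0)
    print(star_gradient(f, grad, 0.0, c=0.1, w=0.3))  # 2.0
```

In the noiseless quadratic case the weight is immaterial; the paper's contribution is choosing the weight sequence to minimize the estimator's variance when both oracles are noisy.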

Related Topics
Physical Sciences and Engineering › Engineering › Computational Mechanics