Article ID: 479260
Journal: European Journal of Operational Research
Published Year: 2016
Pages: 11 Pages
File Type: PDF
Abstract

• We consider a nonsmooth convex optimization problem with fixed point constraints.
• We propose the Halpern-type incremental proximal algorithm for solving the problem.
• We propose the Mann-type incremental proximal algorithm for solving the problem.
• We present their convergence analyses for a diminishing step size.
• We give numerical examples to support the convergence analyses.

The problem of minimizing the sum of nonsmooth, convex objective functions defined on a real Hilbert space over the intersection of the fixed point sets of nonexpansive mappings, onto which projections cannot be computed efficiently, is considered. To solve it, proximal point algorithms that combine the proximity operators of the objective functions with incremental optimization techniques are proposed. With the focus on fixed point approximation techniques, two algorithms are devised. One blends an incremental subgradient method, a useful algorithm for nonsmooth convex optimization, with a Halpern-type fixed point iteration algorithm. The other is based on an incremental subgradient method and the Krasnosel’skiĭ–Mann fixed point algorithm. It is shown that any weak sequential cluster point of the sequence generated by the Halpern-type algorithm belongs to the solution set of the problem, and that there exists a weak sequential cluster point of the sequence generated by the Krasnosel’skiĭ–Mann-type algorithm that also belongs to the solution set. Numerical comparisons of the two proposed algorithms with existing subgradient methods on concrete nonsmooth convex optimization problems show that the proposed algorithms achieve faster convergence.
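To make the Halpern-type scheme described in the abstract more concrete, the following Python sketch illustrates one plausible form of the iteration: an incremental pass over the proximity operators of the summands, application of the nonexpansive mappings that define the fixed point constraints, and a Halpern step anchored at the initial point with a diminishing weight. The update rule, the step sizes 1/k and 1/(k+1), and the operators prox_l1 and project_ball are illustrative assumptions, not the exact algorithm or parameters of the paper.

import numpy as np

def prox_l1(v, lam):
    """Proximity operator of lam * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def project_ball(v, center, radius):
    """Metric projection onto a closed ball; such projections are nonexpansive."""
    d = v - center
    n = np.linalg.norm(d)
    return v if n <= radius else center + radius * d / n

def halpern_incremental_proximal(x0, proxes, mappings, steps=2000):
    """Hypothetical sketch of a Halpern-type incremental proximal iteration:
    sweep incrementally through the proximity operators of the summands,
    apply the nonexpansive mappings, then take a Halpern step toward the
    anchor x0 with a diminishing weight alpha_k."""
    x = x0.copy()
    for k in range(1, steps + 1):
        lam = 1.0 / k           # diminishing step size (assumed choice)
        alpha = 1.0 / (k + 1)   # Halpern anchoring weight (assumed choice)
        y = x
        for prox in proxes:     # incremental pass over the objective terms
            y = prox(y, lam)
        for T in mappings:      # enforce the fixed point constraints
            y = T(y)
        x = alpha * x0 + (1.0 - alpha) * y   # Halpern step toward the anchor
    return x

if __name__ == "__main__":
    # Toy instance: minimize ||x||_1 over the unit ball centered at the origin.
    x0 = np.full(5, 3.0)
    proxes = [prox_l1]
    mappings = [lambda v: project_ball(v, np.zeros(5), 1.0)]
    print(halpern_incremental_proximal(x0, proxes, mappings))

A Krasnosel’skiĭ–Mann-type variant would, under the same assumptions, replace the anchored step with a convex combination of the current iterate and the mapped point, x = (1 - beta) * x + beta * y.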
