Article ID Journal Published Year Pages File Type
408854 Neurocomputing 2016 11 Pages PDF
Abstract

•A novel, simple, and efficient algorithm, BCDDL, is proposed to solve SC–DL problems.
•BCDDL is the fastest algorithm to date for solving SC–DL.
•BCDDL shows state-of-the-art performance when learning bases from small samples.
•BCDDL shows state-of-the-art performance when pursuing considerably sparser codes.
•BCDDL achieves superior performance in image classification tasks.

Sparse-representation-based dictionary learning, usually viewed as a method for rearranging the structure of the original data so that its energy becomes compact over a non-orthogonal, over-complete dictionary, is widely used in signal processing, pattern recognition, machine learning, statistics, and neuroscience. The current sparse representation framework decouples the optimization into two subproblems, alternating between sparse coding and dictionary learning with different optimizers, thereby treating the elements of the dictionary and of the codes separately. In this paper, we treat the elements of both the dictionary and the codes homogeneously. The original optimization is decoupled directly into several blockwise alternating subproblems rather than the two above, so the sparse coding and dictionary learning optimizations are unified. More precisely, the variables of the optimization problem are partitioned into suitable blocks that preserve convexity, making an exact blockwise coordinate descent possible. For each separable subproblem, a closed-form solution is obtained from the convexity and monotonicity of the parabolic function. The algorithm is therefore simple, efficient, and effective. Experimental results show that our algorithm significantly accelerates the learning process, and an application to image classification further demonstrates the efficiency of the proposed optimization strategy.
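The blockwise scheme described above can be sketched as follows: cycle over atom/code-row pairs and minimize each block in closed form, where the code update is the soft-thresholding solution of a parabola-plus-ℓ1 objective and the atom update is a normalized correlation with the residual. This is a hedged illustration of blockwise coordinate descent for the ℓ1-penalized SC–DL objective min over D, S of ||X − DS||²_F + λ||S||₁ with unit-norm atoms; it is not the paper's exact BCDDL, and the function name and parameters are our own.

```python
import numpy as np

def soft_threshold(a, t):
    """Closed-form minimizer of (x - a)^2 + 2*t*|x| (parabola + l1 term)."""
    return np.sign(a) * np.maximum(np.abs(a) - t, 0.0)

def bcd_dictionary_learning(X, n_atoms, lam=0.1, n_iter=50, seed=0):
    """Blockwise coordinate descent sketch (hypothetical, not the paper's BCDDL) for
        min_{D,S} ||X - D S||_F^2 + lam * ||S||_1,  s.t. ||d_j||_2 = 1,
    updating one dictionary atom and one code row at a time in closed form."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    D = rng.standard_normal((m, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm atoms
    S = np.zeros((n_atoms, n))
    R = X - D @ S  # residual, maintained incrementally
    for _ in range(n_iter):
        for j in range(n_atoms):
            # add atom j's current contribution back into the residual
            R += np.outer(D[:, j], S[j])
            # exact code-row update: soft thresholding (d_j has unit norm)
            S[j] = soft_threshold(D[:, j] @ R, lam / 2.0)
            # exact atom update: correlation with residual, renormalized
            u = R @ S[j]
            nrm = np.linalg.norm(u)
            if nrm > 1e-12:
                D[:, j] = u / nrm
            # subtract the updated contribution
            R -= np.outer(D[:, j], S[j])
    return D, S
```

Because every block update is an exact minimizer of the joint objective with the other blocks held fixed, the objective is monotonically non-increasing, which is the property that makes this style of scheme both simple and fast.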
