Article ID: 529888
Journal: Pattern Recognition
Published Year: 2015
Pages: 11
File Type: PDF
Abstract

• We prove that group-wise ℓp-regularization has algorithmic stability and thus generalizes.
• We derive ADMM and FISTA algorithms for solving group-wise ℓp-regularization.
• We show that for p = 5/4, 4/3, 3/2 and 2 the update step has an analytical solution.
• We demonstrate that ℓp-regularization achieves flexibility in denseness/sparseness modeling.
• We show that group-wise ℓp-regularization has state-of-the-art performance on splice detection.

Following advances in compressed sensing and high-dimensional statistics, many pattern recognition methods have been developed with ℓ1 regularization, which promotes sparse solutions. In this work, we instead advocate the use of ℓp (2 ≥ p > 1) regularization in a group setting, which provides a better trade-off between sparsity and algorithmic stability. We focus on the simplest case with squared loss, known as group bridge regression. On the theoretical side, we prove that group bridge regression is uniformly stable and thus generalizes, an important property of a learning method. On the computational side, we make group bridge regression more practically attractive by deriving provably convergent and computationally efficient optimization algorithms. We show that there are several values of p in (1, 2) at which the iterative update is analytical, making the method suitable even for large-scale settings. We demonstrate the clear advantage of group bridge regression with the proposed algorithms over competitive alternatives on several datasets. As ℓp-regularization allows one to achieve flexibility in the sparseness/denseness of the solution, we hope that the algorithms will be useful for future applications of this regularization.
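To make the analytical update concrete, below is a minimal sketch, not the authors' implementation, of a proximal-gradient (ISTA-style) solver for an objective of the form (1/2)‖y − Xβ‖² + λ Σ_g ‖β_g‖₂^p with p = 3/2, one of the values the highlights list as admitting a closed-form update. The exact group norm and the FISTA/ADMM details in the paper may differ; the function names (prox_group_l32, group_bridge_ista) and the synthetic data are illustrative assumptions.

import numpy as np

def prox_group_l32(v, lam):
    # Proximal operator of lam * ||v||_2^(3/2), in closed form.
    # The minimizer of 0.5*||p - v||^2 + lam*||p||_2^(3/2) keeps the
    # direction of v; its magnitude m solves m + (3*lam/2)*sqrt(m) = ||v||_2,
    # a quadratic in u = sqrt(m).
    norm_v = np.linalg.norm(v)
    if norm_v == 0.0:
        return np.zeros_like(v)
    u = (-1.5 * lam + np.sqrt(2.25 * lam**2 + 4.0 * norm_v)) / 2.0
    return (u * u / norm_v) * v

def group_bridge_ista(X, y, groups, lam, n_iter=500):
    # Proximal gradient for 0.5*||y - X b||^2 + lam * sum_g ||b_g||_2^(3/2).
    # `groups` is a list of index arrays assumed to partition the columns of X.
    b = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = b - step * X.T @ (X @ b - y)    # gradient step on the squared loss
        b = z.copy()
        for g in groups:                    # the prox is block-separable over groups
            b[g] = prox_group_l32(z[g], step * lam)
    return b

# Illustrative usage on synthetic data (assumed, not from the paper)
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
b_true = np.array([1.0, -2.0, 0.0, 0.0, 0.5, 0.0])
y = X @ b_true + 0.1 * rng.standard_normal(200)
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
b_hat = group_bridge_ista(X, y, groups, lam=0.5)

Because p > 1, the proximal step shrinks group norms smoothly rather than thresholding them to exactly zero, which is the sparseness/denseness trade-off the abstract refers to.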

Related Topics
Physical Sciences and Engineering > Computer Science > Computer Vision and Pattern Recognition
Authors