Article ID: 6869656
Journal: Computational Statistics & Data Analysis
Published Year: 2015
Pages: 14
File Type: PDF
Abstract
The lasso and its variants have attracted much attention recently because of their ability to perform simultaneous estimation and variable selection. When prior knowledge is available in an application, estimation and variable selection can be further improved by incorporating that knowledge as constraints on the parameters. In this article, we consider the linearly constrained generalized lasso, where the constraints are linear inequalities, linear equalities, or both. The dual of the problem is derived and is much simpler than the original; as a by-product, the dual can be solved with a coordinate descent algorithm. A formula for the number of degrees of freedom is derived, and a method for selecting the tuning parameter is also discussed.
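For concreteness, a generic formulation of the problem the abstract describes is sketched below; the symbols y, X, D, lambda, A, a, B, and b are illustrative notation assumed here, not necessarily the paper's own:

\begin{aligned}
\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^p}\;
  & \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 \;+\; \lambda \lVert D\beta \rVert_1 \\
\text{subject to}\quad
  & A\beta \le a, \qquad B\beta = b,
\end{aligned}

where \(\lVert D\beta \rVert_1\) is the generalized lasso penalty (taking \(D\) to be the identity recovers the ordinary lasso), and the linear inequality and equality constraints \(A\beta \le a\), \(B\beta = b\) encode the prior knowledge mentioned above.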
Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics