Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
4600427 | Linear Algebra and its Applications | 2013 | 14 Pages |
Linear systems where both the matrix and the right-hand side are affected by noise or uncertainties are typically solved via a total least squares (TLS) approach. We study the sparse TLS problem, whose solutions have s nonzero elements. We present a set of conditions under which the sparse least squares (LS) and TLS problems have the same support. The conditions are expressed in terms of the restricted isometry constants and the minimum nonzero principal angle between subspaces spanned by s columns of the matrix. We also study several greedy algorithms for solving the sparse TLS problem. Inspired by our theoretical results, we infer that an effective approach is to compute the support of the LS solution, then solve the standard TLS problem restricted to that support. Indeed, doing so with orthogonal matching pursuit (OMP) gives consistently good results. Another greedy strategy is to extend the support with the column that makes the smallest angle with the subspace (SAS) spanned by the already chosen columns, which is a combined LS-TLS heuristic. We show via simulations that SAS gives the best results for small problem sizes and medium to high signal-to-noise ratios (SNR), while OMP is best or nearly best in almost all cases.
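The OMP-based strategy from the abstract can be sketched in a few lines: run a standard OMP pass to select a support of size s, then solve the TLS problem restricted to those columns via the classical SVD closed form (the right singular vector of [A_S b] with smallest singular value). This is an illustrative sketch only, not the paper's exact algorithm; the function names and the test problem below are our own.

```python
import numpy as np

def omp_support(A, b, s):
    """Plain OMP: greedily pick s columns of A to explain b."""
    support = []
    r = b.copy()
    for _ in range(s):
        # column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ r)))
        support.append(j)
        # LS fit on the current support, then update the residual
        x_s, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        r = b - A[:, support] @ x_s
    return support

def tls_on_support(A, b, support):
    """Closed-form TLS restricted to the chosen columns: the solution
    comes from the right singular vector of [A_S  b] associated with
    the smallest singular value."""
    C = np.column_stack([A[:, support], b])
    _, _, Vt = np.linalg.svd(C, full_matrices=False)
    v = Vt[-1]                      # smallest singular direction
    x_s = -v[:-1] / v[-1]
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

# Synthetic TLS setting: noise on BOTH the matrix and the right-hand side.
rng = np.random.default_rng(0)
m, n, s = 40, 20, 3
A0 = rng.standard_normal((m, n))
x0 = np.zeros(n)
x0[[2, 7, 11]] = [1.5, -2.0, 1.0]   # true s-sparse solution
b0 = A0 @ x0
A = A0 + 0.01 * rng.standard_normal((m, n))
b = b0 + 0.01 * rng.standard_normal(m)

supp = omp_support(A, b, s)
x_hat = tls_on_support(A, b, supp)
print(sorted(supp))  # at this high SNR, OMP should recover the true support
```

At high SNR, as the abstract notes, this two-stage LS-support/TLS-solve combination tends to behave well; the SAS variant would instead grow the support by the column forming the smallest principal angle with the subspace of the columns chosen so far.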