Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
8960138 | Neurocomputing | 2018 | 13 Pages |
Abstract
One-bit quantization is implemented by a single comparator that operates at low power and a high rate. Hence one-bit compressive sensing (1bit-CS) has become attractive in signal processing. When measurements are corrupted by noise during signal acquisition and transmission, 1bit-CS is usually modeled as minimizing a loss function subject to a sparsity constraint. The one-sided ℓ1 loss and the linear loss are two popular loss functions for 1bit-CS. To improve the decoding performance on noisy data, we consider the pinball loss, which provides a bridge between the one-sided ℓ1 loss and the linear loss. Using the pinball loss, two convex models, an elastic-net pinball model and its modification with the ℓ1-norm constraint, are proposed. To solve them efficiently, the corresponding dual coordinate ascent algorithms are designed and their convergence is proved. Numerical experiments confirm the effectiveness of the proposed algorithms and the performance of pinball loss minimization for 1bit-CS.
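The bridging property mentioned in the abstract can be illustrated with a minimal sketch. Assuming the standard pinball loss definition p_τ(u) = u for u ≥ 0 and p_τ(u) = -τu for u < 0 (the exact parameterization in the paper may differ), the parameter τ interpolates between the two losses named above: τ = 0 recovers the one-sided ℓ1 loss max(u, 0), and τ = -1 recovers the linear loss u.

```python
import numpy as np

def pinball_loss(u, tau):
    """Pinball loss: p_tau(u) = u for u >= 0, -tau * u for u < 0.

    tau = 0  recovers the one-sided l1 loss max(u, 0);
    tau = -1 recovers the linear loss u.
    (Sign conventions here are an illustrative assumption.)
    """
    u = np.asarray(u, dtype=float)
    return np.where(u >= 0, u, -tau * u)

# tau = 0: negative residuals contribute max(u, 0) = 0
print(pinball_loss([-2.0, 1.0], 0.0))   # -> [0. 1.]
# tau = -1: the loss is linear in u everywhere
print(pinball_loss([-2.0, 1.0], -1.0))  # -> [-2.  1.]
```

Intermediate values τ ∈ (-1, 0) penalize negative residuals with a reduced slope, which is what gives the loss its robustness to sign flips caused by noise.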
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Xiaolin Huang, Lei Shi, Ming Yan, Johan A.K. Suykens