Article ID: 6940108
Journal: Pattern Recognition Letters
Published Year: 2018
Pages: 10
File Type: PDF
Abstract
This paper introduces the kernel signal-to-noise ratio (kSNR) for machine learning and signal processing applications. The kSNR explicitly maximizes the signal variance while minimizing the estimated noise variance in a reproducing kernel Hilbert space (RKHS). The kSNR makes it possible to capture complex signal-to-noise relations beyond additive noise models, and can serve as a useful regularizer for feature extraction and dimensionality reduction. We show that the kSNR generalizes kernel PCA (and other spectral dimensionality reduction methods), least squares SVM, and kernel ridge regression to cases where signal and noise cannot be assumed independent. We provide computationally efficient alternatives based on reduced-rank Nyström and random Fourier feature approximations, and analyze performance bounds and stability. We illustrate the method on several problems, including nonlinear regression, nonlinear classification for channel equalization, nonlinear feature extraction from high-dimensional spectral satellite images, and bivariate causal inference. Experimental results show that the proposed kSNR yields more accurate solutions and extracts less noisy features than standard approaches.
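The core idea described in the abstract, maximizing signal variance while minimizing estimated noise variance in a feature space, can be sketched as a generalized Rayleigh-quotient problem over a random Fourier feature (RFF) approximation of an RBF kernel. The sketch below is illustrative only, not the paper's implementation; the data, the bandwidth `sigma`, the feature count `D`, and the ridge term are all assumptions.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Hypothetical toy data: observed signal samples and an estimated noise sample set
n, d, D = 200, 3, 100                 # samples, input dim, number of Fourier features
X = rng.normal(size=(n, d))           # signal samples (illustrative)
N = 0.3 * rng.normal(size=(n, d))     # estimated noise samples (illustrative)

sigma = 1.0
W = rng.normal(scale=1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(Z):
    """Random Fourier feature map approximating an RBF kernel."""
    return np.sqrt(2.0 / D) * np.cos(Z @ W + b)

Phi_x = rff(X) - rff(X).mean(axis=0)  # centered signal features
Phi_n = rff(N) - rff(N).mean(axis=0)  # centered noise features

Cx = Phi_x.T @ Phi_x / n                       # signal covariance in feature space
Cn = Phi_n.T @ Phi_n / n + 1e-6 * np.eye(D)    # noise covariance + ridge for stability

# Generalized eigenproblem: directions maximizing signal variance over noise variance
vals, vecs = eigh(Cx, Cn)
u = vecs[:, -1]        # direction with the highest signal-to-noise ratio
proj = Phi_x @ u       # kSNR-style one-dimensional projection of the data
```

Working in the explicit RFF feature space keeps the eigenproblem at size D x D rather than n x n, which mirrors the abstract's motivation for the reduced-rank approximations.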
Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition