Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
8960222 | Pattern Recognition | 2019 | 40 Pages |
Abstract
We study sequential nonlinear regression and introduce an online algorithm that mitigates, via an adaptively incremental hierarchical structure, the convergence and undertraining issues of conventional nonlinear regression methods. In particular, we present a piecewise linear (or nonlinear) regression algorithm that partitions the regressor space, learns a linear model in each region, and adaptively combines these models. Unlike conventional approaches, our algorithm effectively learns the optimal partition of the regressor space with the desired complexity in a completely sequential and data-driven manner. The algorithm sequentially and asymptotically achieves the performance of the optimal twice-differentiable regression function for any data sequence, without any statistical assumptions. It can be efficiently implemented with a computational complexity that is only logarithmic in the data length. In our experiments, we demonstrate significant performance gains over state-of-the-art techniques on well-known benchmark real data sets.
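To illustrate the core idea of piecewise linear regression described in the abstract, the sketch below implements a much-simplified variant: a fixed partition of the regressor space with one linear model per region, each updated by online gradient descent. This is an assumption-laden illustration, not the authors' algorithm, which additionally learns the partition itself via an adaptive hierarchical structure; the class name, partition scheme, and learning rate here are all hypothetical choices.

```python
import numpy as np

# Simplified illustration (NOT the paper's algorithm): online piecewise
# linear regression over a fixed partition of the regressor space.
# Each region holds its own linear model [w; b], trained by online
# gradient descent on the squared error of the samples falling in it.

class PiecewiseLinearRegressor:
    def __init__(self, boundaries, dim, lr=0.1):
        self.boundaries = boundaries  # sorted split points on the first coordinate
        # One weight vector (plus bias) per region.
        self.weights = np.zeros((len(boundaries) + 1, dim + 1))
        self.lr = lr

    def _region(self, x):
        # Index of the region containing x (regions split on the first coordinate).
        return int(np.searchsorted(self.boundaries, x[0]))

    def predict(self, x):
        w = self.weights[self._region(x)]
        return float(w[:-1] @ x + w[-1])

    def update(self, x, y):
        # One online gradient-descent step on (prediction - y)^2 in x's region.
        r = self._region(x)
        err = self.predict(x) - y
        self.weights[r, :-1] -= self.lr * err * x
        self.weights[r, -1] -= self.lr * err

# Learn the piecewise linear target y = |x| with two regions split at 0.
rng = np.random.default_rng(0)
model = PiecewiseLinearRegressor(boundaries=[0.0], dim=1, lr=0.1)
for _ in range(5000):
    x = rng.uniform(-1.0, 1.0, size=1)
    model.update(x, abs(x[0]))

# After training, predictions approach |x|: about 0.5 at x = ±0.5.
```

Two linear models suffice here because the target is exactly linear on each side of the split; the paper's contribution is learning such a partition sequentially rather than fixing it in advance.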
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Vision and Pattern Recognition
Authors
N. Denizcan Vanli, Muhammed O. Sayin, Mohammadreza Mohaghegh N., Huseyin Ozkan, Suleyman S. Kozat