Article ID: 4952519
Journal: Theoretical Computer Science
Published Year: 2016
Pages: 19
File Type: PDF
Abstract
An important property of digital curve length estimators is their convergence toward the continuous length, together with the associated convergence speed, as the grid spacing tends to zero. On the one hand, DSS-based estimators have been proved to converge, but only under convexity and smoothness assumptions or polygonal assumptions. On the other hand, sparse estimators, introduced in a previous paper by the authors, were proved to converge for Lipschitz functions without any convexity assumption. Here, a wider class of estimators, the non-local estimators, is defined, intended to encompass both sparse estimators and DSS-based estimators. Their convergence is proved and an upper bound on the error is given for a large class of functions.
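The convergence behaviour described above can be illustrated with a toy experiment. The sketch below is not the paper's actual estimator: it digitizes y = sin(x) on a grid of spacing h and computes a polygonal length over every k-th digitized point, with a sampling step k growing like h^(-1/2) (an assumption echoing the sparse-estimator idea of widening the sampling window so local quantization noise averages out). All function names and the exponent 1/2 are illustrative choices.

```python
import math

def true_length(f_prime, a, b, n=100_000):
    # arc length integral of sqrt(1 + f'(x)^2), midpoint rule
    h = (b - a) / n
    return sum(math.sqrt(1.0 + f_prime(a + (i + 0.5) * h) ** 2) * h
               for i in range(n))

def digitize(f, a, b, h):
    # Gauss-style digitization: snap curve samples to a grid of spacing h
    n = max(1, round((b - a) / h))
    xs = [a + i * (b - a) / n for i in range(n + 1)]
    return [(x, round(f(x) / h) * h) for x in xs]

def sparse_polygonal_length(points, step):
    # polygonal length over every `step`-th digitized point; a widening
    # sampling step lets the per-pixel quantization noise average out
    pts = points[::step]
    if pts[-1] != points[-1]:
        pts.append(points[-1])
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

f, a, b = math.sin, 0.0, math.pi
L = true_length(math.cos, a, b)        # about 3.8202 for sin on [0, pi]
errors = []
for h in (0.1, 0.01, 0.001):
    pts = digitize(f, a, b, h)
    step = max(1, round(h ** -0.5))    # sampling step ~ h^(-1/2)
    errors.append(abs(sparse_polygonal_length(pts, step) - L))
    print(f"h = {h:g}  |estimate - length| = {errors[-1]:.5f}")
```

As h shrinks, the printed error decreases toward zero, whereas a naive pixel-by-pixel polygonal length (step = 1) would not converge to the true length.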
Related Topics
Physical Sciences and Engineering › Computer Science › Computational Theory and Mathematics