Article ID | Journal | Published Year | Pages
---|---|---|---
404124 | Neural Networks | 2013 | 5
Abstract
The problem of assessing the performance of a classifier in the finite-sample setting was addressed by Vapnik in his seminal work using data-independent measures of complexity. Recently, several authors have addressed the same problem by proposing data-dependent measures, which tighten previous results by taking into account the actual data distribution. In this framework, we derive some data-dependent bounds on the generalization ability of a classifier by exploiting the Rademacher Complexity and recent concentration results: in addition to being appealing for practical purposes, as they exploit empirical quantities only, these bounds improve previously known results.
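The empirical Rademacher complexity referenced in the abstract measures how well a hypothesis class can correlate with random sign noise on the observed sample. A minimal Monte Carlo sketch of that quantity is given below, under the simplifying assumption (not from the paper) that the hypothesis class is finite and represented by a matrix of its predictions on the sample; the function name and interface are illustrative only.

```python
import numpy as np

def empirical_rademacher(preds, n_trials=1000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat = E_sigma[ sup_f (1/n) * sum_i sigma_i * f(x_i) ].

    preds: (m, n) array; row j holds the outputs of hypothesis j on
    the n sample points (assumes a finite class given by its
    predictions -- a simplification for illustration).
    """
    rng = np.random.default_rng(seed)
    m, n = preds.shape
    total = 0.0
    for _ in range(n_trials):
        # Draw Rademacher signs: each sigma_i is +1 or -1 with prob 1/2.
        sigma = rng.choice([-1.0, 1.0], size=n)
        # Supremum over the (finite) class of the sign-weighted average.
        total += np.max(preds @ sigma) / n
    return total / n_trials
```

A class containing only the constant-zero hypothesis has complexity exactly zero, while a class that can match arbitrary sign patterns (e.g. both the all-ones and all-minus-ones predictors) attains a strictly positive value, which is the intuition behind data-dependent bounds being tighter on "simple" classes.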
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Luca Oneto, Alessandro Ghio, Davide Anguita, Sandro Ridella