Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
1146548 | Journal of Multivariate Analysis | 2010 | 20 Pages | |
Abstract
Stochastic modeling for large-scale datasets usually involves a varying-dimensional model space. This paper investigates the asymptotic properties, when the number of parameters grows with the available sample size, of minimum-BD estimators and classifiers under a broad and important class of Bregman divergence (BD), which encompasses nearly all of the loss functions commonly used in regression analysis, classification procedures, and the machine learning literature. Unlike maximum likelihood estimators, which require the joint likelihood of the observations, minimum-BD estimators are useful for a range of models in which the joint likelihood is unavailable or incomplete. Statistical inference tools developed for the class of large-dimensional minimum-BD estimators and related classifiers are evaluated via simulation studies and are illustrated by the analysis of a real dataset.
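As a minimal sketch of the Bregman divergence class mentioned in the abstract: for a convex generator q, the divergence between scalars u and v can be written as D_q(u, v) = q(u) - q(v) - q'(v)(u - v), and different choices of q recover familiar loss functions. The function names and sign convention below are illustrative assumptions, not taken from the paper.

```python
import math

def bregman_divergence(q, q_prime, u, v):
    """Illustrative scalar Bregman divergence for a convex generator q
    with derivative q_prime (a common textbook convention, not
    necessarily the exact convention used in the paper)."""
    return q(u) - q(v) - q_prime(v) * (u - v)

# q(t) = t^2 recovers the squared-error loss: D(u, v) = (u - v)^2.
sq = bregman_divergence(lambda t: t * t, lambda t: 2 * t, 3.0, 1.0)
# sq == 4.0, i.e. (3 - 1)^2

# q(t) = t*log(t) + (1-t)*log(1-t) on (0, 1) (negative Bernoulli
# entropy) recovers a KL/deviance-type loss used in binary
# classification.
def q_bern(t):
    return t * math.log(t) + (1 - t) * math.log(1 - t)

def q_bern_prime(t):
    return math.log(t / (1 - t))

kl = bregman_divergence(q_bern, q_bern_prime, 0.9, 0.5)
# kl equals KL(Bernoulli(0.9) || Bernoulli(0.5)) ≈ 0.368
```

This is why the BD class covers both regression-type losses (squared error) and classification-type losses (deviance) in one framework.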
Related Topics
Physical Sciences and Engineering
Mathematics
Numerical Analysis
Authors
Chunming Zhang