Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6863125 | Neural Networks | 2018 | 8 | 
Abstract
The largest family of density-ratio-based estimators for unnormalized statistical models is obtained under the assumption of properness. These estimators do not require normalization of the probability density function (PDF) because they are based on the ratio of the same PDF at different points, so the multiplicative normalization constant cancels out. In contrast to most existing work, which gives only sufficient conditions for proper estimation criteria, a single necessary and sufficient condition is given here. This condition implies that the extended Bregman divergence framework with data-dependent noise (Gutmann & Hirayama, 2011) yields the largest family of proper criteria in the present setting. Properness in turn guarantees consistent estimation as long as some mild conditions are satisfied. The present study shows that the above-mentioned framework gives an “upper bound” for attempts to extend Hyvärinen's score matching and therefore provides a perspective for further studies in this direction.
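The cancellation of the normalization constant mentioned above can be illustrated with a minimal sketch (a hypothetical toy model, not the paper's estimator): for an unnormalized model p(x) = p̃(x)/Z, any ratio p(x)/p(y) equals p̃(x)/p̃(y), so the possibly intractable constant Z never needs to be computed.

```python
import math

def p_tilde(x, theta=0.5):
    # Unnormalized Gaussian-shaped model (toy example); its true
    # normalizer would be sqrt(2 * pi), but we never need it.
    return math.exp(-0.5 * (x - theta) ** 2)

Z = math.sqrt(2 * math.pi)  # pretend this is unknown / intractable

x, y = 1.3, -0.7
ratio_normalized = (p_tilde(x) / Z) / (p_tilde(y) / Z)   # uses Z
ratio_unnormalized = p_tilde(x) / p_tilde(y)             # Z cancels

# The two ratios agree to floating-point precision.
assert abs(ratio_normalized - ratio_unnormalized) < 1e-12
```

Because only such ratios of the same PDF at different points enter the criterion, estimation can proceed directly on the unnormalized model.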
Related Topics: Physical Sciences and Engineering; Computer Science; Artificial Intelligence
Authors
Kazuyuki Hiraoka, Toshihiko Hamada, Gen Hori