| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 1154999 | Statistics & Probability Letters | 2010 | 7 | |
Abstract
Logistic regression is the closest model, given its sufficient statistics, to the model of constant success probability in terms of Kullback–Leibler information. A generalized binary model has this property for the more general ϕ-divergence. These results generalize to multinomial and other discrete data.
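For readers unfamiliar with the divergences named in the abstract, a minimal sketch of the standard definitions follows; the symbols $p$, $q$, and $\phi$ are generic notation introduced here for illustration, not taken from the paper.

```latex
% Standard definitions of the divergences named in the abstract.
% p and q denote discrete probability distributions; phi is a convex
% function with phi(1) = 0 (notation assumed here, not from the paper).
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Kullback--Leibler information between discrete distributions $p$ and $q$:
\[
  D_{\mathrm{KL}}(p \,\|\, q) \;=\; \sum_{i} p_i \log \frac{p_i}{q_i}.
\]

The $\phi$-divergence generalization, which recovers the
Kullback--Leibler case for $\phi(x) = x \log x$:
\[
  D_{\phi}(p \,\|\, q) \;=\; \sum_{i} q_i \,\phi\!\left(\frac{p_i}{q_i}\right).
\]

\end{document}
```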
Related Topics
Physical Sciences and Engineering › Mathematics › Statistics and Probability
Authors
Maria Kateri, Alan Agresti