Article ID: 552409
Journal: Decision Support Systems
Published Year: 2008
Pages: 11
File Type: PDF
Abstract

Feature selection is critical to knowledge-based authentication. In this paper, we adopt a wrapper method in which the learning machine is a generative probabilistic model, and the objective is to maximize the Kullback–Leibler divergence between the true empirical distribution defined by the legitimate knowledge and the approximating distribution representing an attacking strategy, both in the same feature space. The closed-form solutions to this optimization problem lead to three adaptive algorithms, unified under the principle of maximum entropy. Our experimental results show that the proposed adaptive methods are superior to the commonly used random selection method.
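The idea of scoring features by how far the legitimate user's distribution sits from an attacker's approximating distribution can be sketched as a simple per-feature KL ranking. This is only an illustrative sketch under the assumption of discrete (histogram) feature distributions; the function names and the greedy top-k selection are hypothetical simplifications, not the paper's closed-form or maximum-entropy algorithms.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions over the same support.

    p: empirical distribution from the legitimate user's knowledge.
    q: approximating distribution modeling an attacking strategy.
    """
    p = np.asarray(p, dtype=float) + eps  # smooth to avoid log(0)
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def select_features(legit_dists, attack_dists, k):
    """Keep the k features whose legitimate distribution diverges most
    from the attacker's distribution (higher KL = harder to guess)."""
    scores = [kl_divergence(p, q) for p, q in zip(legit_dists, attack_dists)]
    order = np.argsort(scores)[::-1]  # rank features by descending KL
    return list(order[:k]), scores

# Toy example: feature 0 separates the user from a uniform guesser,
# feature 1 carries no information.
legit = [[9, 1], [5, 5]]
attack = [[5, 5], [5, 5]]
selected, scores = select_features(legit, attack, k=1)
print(selected)  # feature 0 is chosen
```

In this toy run the uninformative feature scores near zero KL and is dropped, mirroring the intuition that features an attacker can match closely add little authentication value.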
