Article ID: 9653420 | Journal: Neurocomputing | Published Year: 2005 | Pages: 25 | File Type: PDF
Abstract
The simplicity and efficiency of linear transformations make them a popular tool for extracting features and reducing dimension before or during statistical analysis of large datasets. Examples of their applications include image compression and reconstruction, discriminant analysis, pattern classification, and image or text retrieval. Linear transformations with natural orthogonality constraints can be represented as elements of Stiefel and Grassmann manifolds. We advocate that there is no single standard choice of transformation for dimension reduction; the appropriate choice is dictated by the application and the data set, and can be formulated as an optimization problem on these manifolds. We demonstrate this idea by deriving dimension-reducing transformations in several applications, including image-based recognition of objects and content-based retrieval of images.
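As a minimal illustration of the idea in the abstract (not the authors' algorithm), the sketch below optimizes over orthonormal matrices on the Stiefel manifold St(n, k) = {X : XᵀX = I}. It maximizes the captured variance trace(XᵀAX) for a covariance matrix A by taking Euclidean gradient steps followed by a QR retraction back onto the manifold; with this objective the iteration reduces to subspace iteration and converges to the dominant k-dimensional subspace. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def stiefel_ascent(A, k, steps=200, lr=0.1, seed=0):
    """Illustrative gradient ascent of trace(X^T A X) over the Stiefel
    manifold St(n, k), using a QR retraction to restore orthonormality
    after each Euclidean gradient step (a sketch, not a reference method)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Random orthonormal starting point on St(n, k)
    X, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(steps):
        G = 2.0 * A @ X                      # Euclidean gradient of trace(X^T A X)
        X, _ = np.linalg.qr(X + lr * G)      # retraction: re-orthonormalize via QR
    return X

# Toy data: anisotropic Gaussian cloud; X should span the top-k eigenspace of A.
rng = np.random.default_rng(1)
Y = rng.standard_normal((500, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
A = Y.T @ Y / len(Y)                         # sample covariance
X = stiefel_ascent(A, k=2)
print(np.allclose(X.T @ X, np.eye(2), atol=1e-8))  # prints: True (orthonormality kept)
```

Because X + lr·2AX = (I + 2·lr·A)X, each QR-retracted step is one step of orthogonal (subspace) iteration with the matrix I + 2·lr·A, which is why convergence to the dominant subspace is guaranteed for any lr > 0; a Grassmann formulation would instead treat X only up to right-rotation, since trace(XᵀAX) depends only on the subspace spanned by X.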
Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence