Article ID: 409422
Journal: Neurocomputing
Published Year: 2006
Pages: 16
File Type: PDF
Abstract

Capturing dependencies in images in an unsupervised manner is important for many image-processing applications and for understanding the structure of natural image signals. Generative linear models such as principal component analysis (PCA) and independent component analysis (ICA) have been shown to capture low-level features such as oriented edges in images. However, these models capture only linear dependencies, so their modeling capability is limited. We propose a new method for capturing nonlinear dependencies in images of natural scenes. The method extends linear ICA and builds on a hierarchical representation. The model uses a lower-level linear ICA representation and a subsequent mixture of Laplacian distributions to learn the nonlinear dependencies in an image. The model parameters are learned via the expectation-maximization (EM) algorithm, and the model can accurately capture variance correlations and other higher-order structures in a simple and consistent manner. We visualize the learned variance correlation structure and demonstrate applications to automatic image segmentation and image denoising.
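The abstract describes the pipeline only at a high level. The sketch below illustrates one plausible reading of it, not the authors' implementation: image patches are assumed to be reduced to ICA coefficients first (e.g., via scikit-learn's FastICA, standing in for the paper's unspecified ICA stage), and a mixture of zero-mean Laplacian distributions is then fitted to those coefficients with a plain EM loop. The function name `fit_laplacian_mixture`, the component count `K`, and the synthetic demo data are all illustrative choices.

```python
import numpy as np
from scipy.special import logsumexp

def fit_laplacian_mixture(S, K=4, n_iter=50, seed=0):
    """EM for a K-component mixture of factorized zero-mean Laplacians.

    S : (n_samples, n_dims) array of ICA coefficients of image patches.
    Returns mixing weights pi (K,), scales b (K, n_dims),
    and responsibilities r (n_samples, K).
    """
    rng = np.random.default_rng(seed)
    n, d = S.shape
    A = np.abs(S)
    pi = np.full(K, 1.0 / K)
    # Initialize component scales around the global mean absolute coefficient.
    b = A.mean(axis=0) * rng.uniform(0.5, 1.5, size=(K, d))
    for _ in range(n_iter):
        # E-step: log p(s | k) = sum_d [ -log(2 b_kd) - |s_d| / b_kd ]
        log_lik = -(A[:, None, :] / b).sum(axis=-1) - np.log(2.0 * b).sum(axis=-1)
        log_r = np.log(pi) + log_lik
        log_r -= logsumexp(log_r, axis=1, keepdims=True)
        r = np.exp(log_r)  # responsibilities, shape (n, K)
        # M-step: the weighted MLE of a zero-mean Laplacian scale is the
        # responsibility-weighted mean absolute value in each dimension.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        b = (r.T @ A) / nk[:, None]
    return pi, b, r

if __name__ == "__main__":
    # Synthetic stand-in for ICA coefficients: two variance regimes,
    # mimicking the variance correlations the model is meant to capture.
    rng = np.random.default_rng(1)
    S = np.vstack([rng.laplace(scale=0.3, size=(500, 16)),
                   rng.laplace(scale=2.0, size=(500, 16))])
    pi, b, r = fit_laplacian_mixture(S, K=2)
    print("mixing weights:", np.round(pi, 3))
    # On real data one would first compute S from whitened patches, e.g.:
    #   from sklearn.decomposition import FastICA
    #   S = FastICA(n_components=64, random_state=0).fit_transform(X)
    # np.argmax(r, axis=1) then gives a per-patch component label, usable
    # as a crude cue for the segmentation application mentioned above.
```

Because each mixture component carries its own per-dimension scale, patches assigned to different components correspond to different joint variance patterns over the ICA coefficients, which is how a model of this form can express the variance correlations the abstract refers to.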

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors