Article ID: 438403
Journal: Theoretical Computer Science
Published Year: 2014
Pages: 15 Pages
File Type: PDF
Abstract

Structured output prediction is an important machine learning problem both in theory and practice, and the max-margin Markov network (M3N) is an effective approach. All state-of-the-art algorithms for optimizing M3N objectives take at least O(1/ϵ) iterations to find an ϵ-accurate solution. Nesterov [1] broke this barrier by proposing an excessive gap reduction technique (EGR) which converges in O(1/√ϵ) iterations. However, EGR is restricted to Euclidean projections, which require an intractable amount of computation per iteration when applied to M3N. In this paper, we show that by extending EGR to Bregman projections this faster rate of convergence can be retained and, more importantly, that the updates can be performed efficiently by exploiting graphical model factorization. Further, we design a kernelized procedure which allows all computations per iteration to be performed at the same cost as in the state-of-the-art approaches.
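To make the contrast between Euclidean and Bregman projections concrete, the following minimal sketch (not the authors' algorithm, and using hypothetical names and problem sizes) compares a Bregman prox step with the entropy (KL) prox-function, which has a closed-form multiplicative update on the probability simplex, against the sorting-based Euclidean projection onto the same simplex.

```python
# Illustrative sketch only: a KL (entropic) Bregman projection vs. a Euclidean
# projection onto the probability simplex. All names and sizes are hypothetical.
import numpy as np

def kl_prox_step(mu, grad, eta):
    """Entropic prox step: argmin_p <grad, p> + (1/eta) * KL(p || mu)
    over the simplex. Closed form: exponentiated-gradient update."""
    w = mu * np.exp(-eta * grad)
    return w / w.sum()

def euclidean_projection(v):
    """Euclidean projection of v onto the probability simplex,
    via the standard sorting/thresholding algorithm, for comparison."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mu = np.full(5, 0.2)          # current iterate on the simplex
    grad = rng.normal(size=5)     # a gradient direction
    print(kl_prox_step(mu, grad, eta=1.0))
    print(euclidean_projection(mu - 1.0 * grad))
```

In the structured-prediction setting the simplex is over exponentially many joint labelings, which is why the paper's point about exploiting graphical model factorization to carry out such updates efficiently matters; this toy example only illustrates the two projection types on a small dense vector.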

Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics
Authors