| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 4600675 | Linear Algebra and its Applications | 2013 | 16 Pages | |
Abstract
Reduced-rank approximations to symmetric tensors find use in data compaction and in multi-user blind source separation. We derive iterative algorithms that feature monotonic convergence to a minimum of a Frobenius-norm approximation criterion, for a rank-r Tucker-product version of the approximation problem. The approach exploits the gradient inequality for convex functions to establish monotonic convergence, while sparing the cumbersome step-size analysis required by a manifold gradient approach. It likewise overcomes some limitations of symmetric versions of alternating least squares. The computational load per iteration amounts to computing an unfolded matrix and a QR decomposition.
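The abstract does not spell out the iteration, but the per-iteration cost it quotes (one unfolded-matrix product and one QR decomposition) matches a subspace-iteration scheme for the symmetric rank-r Tucker problem. The following NumPy sketch illustrates that flavor of algorithm for an order-3 symmetric tensor; the function names, the HOSVD-style initialization, and the specific QR-closed update are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def mode1_unfold(T):
    """Unfold an order-3 tensor along mode 1: (n, n, n) -> (n, n*n)."""
    return T.reshape(T.shape[0], -1)

def core(T, U):
    """Tucker core G = T x1 U^T x2 U^T x3 U^T for a single symmetric factor U."""
    return np.einsum('ijk,ia,jb,kc->abc', T, U, U, U)

def symmetric_tucker(T, r, iters=50):
    """Rank-r symmetric Tucker approximation of a symmetric order-3 tensor.

    Sketch only: each step forms the unfolded matrix M = T_(1)(U kron U)
    and re-orthonormalizes the factor with a QR step (one step of
    orthogonal/subspace iteration on M M^T), mirroring the per-iteration
    cost quoted in the abstract. The paper's monotone algorithm differs.
    """
    # HOSVD-style initialization: top-r left singular vectors of the unfolding.
    U, _, _ = np.linalg.svd(mode1_unfold(T), full_matrices=False)
    U = U[:, :r]
    for _ in range(iters):
        M = mode1_unfold(T) @ np.kron(U, U)   # n x r^2 unfolded matrix
        U, _ = np.linalg.qr(M @ (M.T @ U))    # QR closes the subspace step
    return U, core(T, U)
```

Minimizing ||T - G x1 U x2 U x3 U||_F over orthonormal U is equivalent to maximizing the core norm ||G||_F, so the quality of the fit can be read off as `np.linalg.norm(G) / np.linalg.norm(T)`.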
Related Topics: Physical Sciences and Engineering > Mathematics > Algebra and Number Theory