Journal of Machine Learning Research 13 (2012) 429-458    Submitted 6/11; Revised 11/11; Published 2/12

Online Learning in the Embedded Manifold of Low-rank Matrices

Uri Shalit    URI.SHALIT@MAIL.HUJI.AC.IL
Computer Science Department and ICNC/ELSC
The Hebrew University of Jerusalem
91904 Jerusalem, Israel

Daphna Weinshall    DAPHNA@CS.HUJI.AC.IL
Computer Science Department
The Hebrew University of Jerusalem
91904 Jerusalem, Israel

Gal Chechik    GAL@GOOGLE.COM
The Gonda Brain Research Center
Bar Ilan University
52900 Ramat-Gan, Israel

Editor: Léon Bottou

Abstract

When learning models that are represented in matrix forms, enforcing a low-rank constraint can dramatically improve the memory and run time complexity, while providing a natural regularization of the model. However, naive approaches to minimizing functions over the set of low-rank matrices are either prohibitively time consuming (repeated singular value decomposition of the matrix) or numerically unstable (optimizing a factored representation of the low-rank matrix). We build on recent advances in optimization over manifolds, and describe an iterative online learning procedure, consisting of a gradient step, followed by a second-order retraction back to the manifold. While the ideal retraction is costly to compute, and so is the projection operator that approximates it, we describe another retraction that can be computed efficiently. It has run time and memory complexity of O((n+m)k) for a rank-k matrix of dimension m × n, when using an online procedure with rank-one gradients.
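To make the update scheme concrete, the following is a minimal NumPy sketch of the generic iteration the abstract describes: a Euclidean gradient step followed by a retraction back to the manifold of rank-k matrices. Note that the retraction used here is the SVD-based projection, which is exactly the costly operator the abstract contrasts against; it is not the paper's efficient second-order retraction. The bilinear squared-loss model and all names (svd_retraction, online_step, eta) are illustrative assumptions, chosen only because such losses yield the rank-one gradients the abstract refers to.

```python
import numpy as np

def svd_retraction(W, k):
    # Retract W onto the rank-k manifold by keeping its k leading
    # singular triplets. This full-SVD projection is the expensive
    # baseline; the paper's contribution is a second-order retraction
    # costing only O((n+m)k) for rank-one gradients.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

def online_step(W, x, y, target, eta, k):
    # Assumed bilinear model with squared loss: prediction = x^T W y,
    # loss = 0.5 * (prediction - target)^2. Its Euclidean gradient
    # (prediction - target) * x y^T is rank one, the online setting
    # in which the paper's complexity bound applies.
    pred = x @ W @ y
    grad = (pred - target) * np.outer(x, y)   # rank-one gradient
    return svd_retraction(W - eta * grad, k)  # gradient step + retraction

# Toy usage: one online update on a random rank-k model.
rng = np.random.default_rng(0)
m, n, k = 8, 6, 2
W = svd_retraction(rng.standard_normal((m, n)), k)  # rank-k initial point
x, y = rng.standard_normal(m), rng.standard_normal(n)
W = online_step(W, x, y, target=1.0, eta=0.1, k=k)
assert np.linalg.matrix_rank(W) <= k  # iterate stays on the manifold
```

The design point the abstract makes is that each iterate remains exactly rank k, avoiding both the instability of optimizing a factored form and, with the paper's cheaper retraction in place of the SVD above, the cost of repeated decompositions.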