function [y, grad] = costgradcomputefixedrankembedded(problem, x)
% Computes the Riemannian gradient and the cost at x via AD in one call for
% fixed-rank matrices with an embedded geometry.
%
% [y, grad] = costgradcomputefixedrankembedded(problem, x)
%
% The first-order method follows the paper:
% "Automatic differentiation for Riemannian optimization on low-rank matrix
% and tensor-train manifolds", A. Novikov, M. Rakhuba, I. Oseledets, 2021.
%
% Paper link: https://arxiv.org/pdf/2103.14974.pdf
%
% Please cite the Manopt paper as well as the research paper:
% @Misc{novikov2021automatic,
%   Title = {Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds},
%   Author = {Alexander Novikov and Maxim Rakhuba and Ivan Oseledets},
%   Year = {2021},
%   Eprint = {2103.14974},
%   ArchivePrefix = {arXiv},
%   PrimaryClass = {math.OC}
% }
%
% See also: autograd

% This file is part of Manopt: www.manopt.org.
% Original author: Xiaowen Jiang, Aug. 31, 2021.
% Contributors:
% Change log:

    % Check that the problem structure and the point x are suitable.
    assert(isfield(problem, 'autogradfunc'), ['The problem structure must ' ...
        'contain the field autogradfunc, see autograd.']);
    assert(sum(isfield(x, {'U', 'S', 'V'})) == 3 && ...
        contains(problem.M.name(), 'rank', 'IgnoreCase', true) && ...
        ~startsWith(problem.M.name(), 'Product manifold'), ...
        'The manifold must be fixed-rank matrices with an embedded geometry.');

    % Convert A = U*S and B = V*S into dlarrays to prepare for AD.
    A = mat2dl(x.U*x.S);
    B = mat2dl(x.V*x.S);

    % Compute the cost and the Euclidean gradient factors via autogradfunc.
    [g1, egrad] = dlfeval(problem.autogradfunc, x, A, B);
    y = dl2mat(g1);

    % Project the Euclidean gradient onto the tangent space at x, expressed
    % in the (M, Up, Vp) format of fixedrankembeddedfactory.
    Udelta = dl2mat(egrad.A);
    Vdelta = dl2mat(egrad.B);
    grad.M = x.U'*Udelta;
    grad.Up = Udelta - x.U*(x.U'*Udelta);
    grad.Vp = Vdelta - x.V*(x.V'*Vdelta);

end
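% The projection step above can be sanity-checked numerically. The NumPy
% sketch below (illustrative only, not part of this file) assumes the AD step
% returns Udelta = Z*V and Vdelta = Z'*U for a Euclidean gradient matrix Z,
% which is what the chain rule gives for a cost written in terms of A = U*S
% and B = V*S; it then verifies that (M, Up, Vp) reproduces the orthogonal
% projection of Z onto the tangent space and that Up, Vp are orthogonal to
% U and V.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 5, 4, 2

# Orthonormal factors of a rank-r point, as in fixedrankembeddedfactory.
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))

# Z plays the role of the Euclidean gradient at X = U*S*V'.
Z = rng.standard_normal((m, n))
Udelta, Vdelta = Z @ V, Z.T @ U  # assumed form of egrad.A, egrad.B

# Same formulas as grad.M, grad.Up, grad.Vp in the MATLAB code.
M = U.T @ Udelta
Up = Udelta - U @ (U.T @ Udelta)
Vp = Vdelta - V @ (V.T @ Vdelta)

# Tangent-space conditions: Up and Vp are orthogonal to U and V.
assert np.allclose(U.T @ Up, 0)
assert np.allclose(V.T @ Vp, 0)

# U*M*V' + Up*V' + U*Vp' equals the orthogonal projection of Z onto the
# tangent space: P_U Z P_V + (I - P_U) Z P_V + P_U Z (I - P_V).
G = U @ M @ V.T + Up @ V.T + U @ Vp.T
PU, PV = U @ U.T, V @ V.T
Zproj = PU @ Z @ PV + (Z - PU @ Z) @ PV + PU @ Z @ (np.eye(n) - PV)
assert np.allclose(G, Zproj)
```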