
gradcomputefixedrankembedded

PURPOSE

Computes the Riemannian gradient of the cost function at x via AD for fixed-rank matrices with an embedded geometry.

SYNOPSIS

function grad = gradcomputefixedrankembedded(problem,x)

DESCRIPTION

 Computes the Riemannian gradient of the cost function at x via AD for
 fixed-rank matrices with an embedded geometry.

 grad = gradcomputefixedrankembedded(problem,x)

 The first-order method follows the paper: 
 "Automatic differentiation for Riemannian optimization on low-rank matrix 
 and tensor-train manifolds", A. Novikov, M. Rakhuba, I. Oseledets, 2021

 Paper link: https://arxiv.org/pdf/2103.14974.pdf

 Please cite the Manopt paper as well as the research paper:
    @Misc{novikov2021automatic,
      Title         = {Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds}, 
      Author        = {Alexander Novikov and Maxim Rakhuba and Ivan Oseledets},
      Year          = {2021},
      Eprint        = {2103.14974},
      ArchivePrefix = {arXiv},
      PrimaryClass  = {math.OC}
    }

 See also: autograd
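
 For illustration, a minimal usage sketch (hypothetical, not part of this
 file). It assumes a recent Manopt with the Deep Learning Toolbox, and that
 the problem has been prepared for AD by a wrapper such as manoptAD, which
 is what sets problem.autogradfunc; users normally do not call this function
 directly. For fixed-rank matrices, the cost must be written in terms of
 the factors X.U, X.S, X.V:

     % Approximate a data matrix C by a rank-r matrix (hypothetical setup).
     m = 100; n = 80; r = 5;
     C = randn(m, n);
     problem.M = fixedrankembeddedfactory(m, n, r);
     problem.cost = @(X) .5*sum(sum((X.U*X.S*X.V' - C).^2));
     problem = manoptAD(problem);   % assumed wrapper; sets autogradfunc
     x = problem.M.rand();
     grad = gradcomputefixedrankembedded(problem, x);
     % grad is a tangent vector structure with fields M, Up and Vp.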

CROSS-REFERENCE INFORMATION

This function calls:
This function is called by:

SOURCE CODE

function grad = gradcomputefixedrankembedded(problem,x)
% Computes the Riemannian gradient of the cost function at x via AD for
% fixed-rank matrices with an embedded geometry
%
% grad = gradcomputefixedrankembedded(problem,x)
%
% The first-order method follows the paper:
% "Automatic differentiation for Riemannian optimization on low-rank matrix
% and tensor-train manifolds", A. Novikov, M. Rakhuba, I. Oseledets, 2021
%
% Paper link: https://arxiv.org/pdf/2103.14974.pdf
%
% Please cite the Manopt paper as well as the research paper:
%    @Misc{novikov2021automatic,
%      Title         = {Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds},
%      Author        = {Alexander Novikov and Maxim Rakhuba and Ivan Oseledets},
%      Year          = {2021},
%      Eprint        = {2103.14974},
%      ArchivePrefix = {arXiv},
%      PrimaryClass  = {math.OC}
%    }
%
% See also: autograd

% This file is part of Manopt: www.manopt.org.
% Original author: Xiaowen Jiang, Aug. 31, 2021.
% Contributors:
% Change log:

    % Check that the problem has been prepared for AD and that x lives on
    % a fixed-rank manifold with an embedded geometry.
    assert(isfield(problem,'autogradfunc'), ['the problem structure must' ...
        ' contain the field autogradfunc, see autograd.']);
    assert(sum(isfield(x,{'U','S','V'})) == 3 && ...
        contains(problem.M.name(),'rank','IgnoreCase',true) && ...
        ~startsWith(problem.M.name(),'Product manifold'), ['The manifold' ...
        ' must be fixed-rank matrices with an embedded geometry']);

    % Reparametrize x = U*S*V' through A = U*S and B = V*S, and convert
    % A, B into dlarrays to prepare for AD.
    A = mat2dl(x.U*x.S); B = mat2dl(x.V*x.S);

    % Evaluate the cost and its derivatives with respect to A and B via
    % AD, according to autogradfunc.
    [~,egrad] = dlfeval(problem.autogradfunc,x,A,B);

    % Assemble the Riemannian gradient in the tangent-vector format of
    % fixedrankembeddedfactory, projecting Udelta and Vdelta onto the
    % orthogonal complements of U and V.
    Udelta = dl2mat(egrad.A); Vdelta = dl2mat(egrad.B);
    grad.M = x.U'*Udelta;
    grad.Up = Udelta - x.U*((x.U)'*Udelta);
    grad.Vp = Vdelta - x.V*((x.V)'*Vdelta);

end
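
The final three assignments implement the orthogonal projection onto the
tangent space used by fixedrankembeddedfactory: at x = U*S*V', tangent
vectors are represented as Z = U*M*V' + Up*V' + U*Vp' with U'*Up = 0 and
V'*Vp = 0. If G denotes the Euclidean gradient at x, the reparametrization
through A and B is designed to yield Udelta = G*V and Vdelta = G'*U, so that
grad.M = U'*G*V, grad.Up = (I - U*U')*G*V and grad.Vp = (I - V*V')*G'*U,
which is exactly the projection of G. A quick sanity check along these lines
(hypothetical, continuing the usage sketch above):

    % Up and Vp must be orthogonal to U and V, up to round-off.
    disp(norm(x.U'*grad.Up));   % expected to be close to 0
    disp(norm(x.V'*grad.Vp));   % expected to be close to 0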
