
complextest_AD3

PURPOSE

Test AD for a complex optimization problem on a manifold stored in a data structure built recursively from a struct, an array and a cell.

SYNOPSIS

function complextest_AD3()

DESCRIPTION

 Test AD for a complex optimization problem on a manifold stored in a data
 structure built recursively from a struct, an array and a cell.
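
 Concretely, a point on this product manifold is a struct whose field x is a
 complex unit vector and whose field y is a 1x1 cell holding another complex
 unit vector. A minimal sketch of how that nested structure arises (assuming
 Manopt is on the path; the variable names elems, M and X0 are illustrative):

   n = 100;
   S = spherecomplexfactory(n);     % complex unit sphere in C^n
   P = powermanifold(S, 1);         % points of P are 1x1 cells of sphere points
   elems.x = S;
   elems.y = P;
   M  = productmanifold(elems);     % points of M are structs with fields x and y
   X0 = M.rand();                   % X0.x    : n-by-1 complex unit vector
                                    % X0.y{1} : n-by-1 complex unit vector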

CROSS-REFERENCE INFORMATION

This function calls: (none)
This function is called by: (none)

SOURCE CODE

function complextest_AD3()
% Test AD for a complex optimization problem on a manifold stored in a
% data structure built recursively from a struct, an array and a cell.

    % Verify that Manopt was indeed added to the Matlab path.
    if isempty(which('spherecomplexfactory'))
        error(['You should first add Manopt to the Matlab path.\n' ...
               'Please run importmanopt.']);
    end
    
    % Verify that the Deep Learning Toolbox is installed.
    assert(exist('dlarray', 'file') == 2, ...
           ['The Deep Learning Toolbox is needed for automatic ' ...
            'differentiation.\nPlease install the latest version of the ' ...
            'Deep Learning Toolbox and\nupgrade to Matlab R2021b if possible.'])
    
    % Generate the problem data.
    n = 100;
    A = randn(n) + 1i*randn(n);
    A = .5*(A+A');
    
    % Create the manifold
    S = spherecomplexfactory(n);
    P = powermanifold(S, 1);        % points of P are 1x1 cells
    X.x = S;
    X.y = P;
    problem.M = productmanifold(X); % points of problem.M are structs
    
    % For Matlab R2021b or later, define the problem cost function as usual:
    % problem.cost  = @(X) -real(X.x'*A*X.y{1});
    
    % For Matlab R2021a or earlier, translate the cost function into a
    % particular format with the basic functions in /functions_AD.
    problem.cost  = @(X) -creal(cprod(cprod(ctransp(X.x), A), X.y{1}));
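    % (creal, cprod and ctransp are the AD-compatible counterparts of
    % real, * and ' provided among the basic functions in /functions_AD.)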

    % Define the gradient and the Hessian via automatic differentiation.
    problem = manoptAD(problem);

    % Numerically check gradient and Hessian consistency.
    figure;
    checkgradient(problem);
    figure;
    checkhessian(problem);
    
    % Solve.
    [x, xcost, info] = trustregions(problem);          %#ok<ASGLU>
    
    % Test: the maximum of real(x'*A*y) over unit-norm complex vectors x
    % and y is the largest singular value of A.
    ground_truth = svd(A);
    distance = abs(ground_truth(1) - (-problem.cost(x)));
    fprintf('The distance between the ground truth and the solution is %e \n', distance);
    
end
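
As the version comments in the listing note, on Matlab R2021b or later the
translated creal/cprod/ctransp cost is unnecessary: the ordinary complex
expression can be handed to manoptAD directly. A minimal sketch of that
variant, reusing A and the struct of factories X built in the listing:

   % Matlab R2021b or later only (see the version comments above).
   problem.M    = productmanifold(X);         % same struct of factories as in the listing
   problem.cost = @(X) -real(X.x'*A*X.y{1});  % ordinary complex expression
   problem      = manoptAD(problem);          % AD supplies the gradient and Hessian
   [x, xcost, info] = trustregions(problem);  %#ok<ASGLU>

On Matlab R2021a or earlier, the listing instead uses the creal/cprod/ctransp
formulation, as its comments explain.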

Generated on Fri 30-Sep-2022 13:18:25 by m2html © 2005