
basic_example_AD

PURPOSE

A basic example that shows how to apply automatic differentiation to compute the gradient and the Hessian.

SYNOPSIS

function basic_example_AD()

DESCRIPTION

 A basic example that shows how to apply automatic differentiation to
 compute the gradient and the Hessian.

 Note: computation of the Hessian via AD is not available for Matlab
 R2020b or earlier. To fully exploit the convenience of AD, please
 update to R2021b or later if possible.
 

 See also: manoptAD, manoptADhelp
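
In short, the pattern demonstrated in the source code below is: build a
Manopt problem with a manifold and a cost, then let manoptAD fill in the
derivatives. A minimal sketch of those calls (all names as in the source
below):

    problem.M = spherefactory(n);
    problem.cost = @(x) -x'*(A*x);
    problem = manoptAD(problem);  % populates gradient and Hessian via AD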

CROSS-REFERENCE INFORMATION

This function calls: (none)
This function is called by: (none)

SOURCE CODE

function basic_example_AD()
% A basic example that shows how to apply automatic differentiation to
% compute the gradient and the Hessian.
%
% Note: computation of the Hessian via AD is not available for Matlab
% R2020b or earlier. To fully exploit the convenience of AD, please
% update to R2021b or later if possible.
%
% See also: manoptAD, manoptADhelp

% This file is part of Manopt and is copyrighted. See the license file.
%
% Main author: Xiaowen Jiang, August 31, 2021
% Contributors: Nicolas Boumal
% Change log:
%

    % Verify that Manopt was indeed added to the Matlab path.
    if isempty(which('spherefactory'))
        error(['You should first add Manopt to the Matlab path.\n' ...
               'Please run importmanopt.']);
    end

    % Verify that the Deep Learning Toolbox is installed.
    assert(exist('dlarray', 'file') == 2, ...
           ['The Deep Learning Toolbox is needed for automatic ' ...
            'differentiation.\nPlease install the latest version of the ' ...
            'Deep Learning Toolbox and\nupgrade to Matlab R2021b if ' ...
            'possible.'])

    % Generate the problem data.
    n = 100;
    A = randn(n);
    A = .5*(A+A');
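    % (Symmetrizing A makes the cost below a symmetric eigenvalue
    % problem: minimizing -x'*A*x over the unit sphere computes a
    % dominant eigenvector of A.)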

    % Create the problem structure.
    manifold = spherefactory(n);
    problem.M = manifold;

    % Define the problem cost function.
    problem.cost  = @(x) -x'*(A*x);

    % Provide the gradient and the Hessian via automatic differentiation.
    problem = manoptAD(problem);

    % If the egrad has already been provided, the ehess is computed by
    % differentiating the egrad, which may be faster depending on the
    % expression of the egrad.
    % problem.egrad = @(x) -2*(A*x);
    % problem = manoptAD(problem);
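    % For reference, the exact Euclidean Hessian of this cost is also
    % available in closed form (a sketch, valid since A is symmetric;
    % not needed for this example):
    % problem.ehess = @(x, u) -2*(A*u);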

    % If only the gradient or only the Hessian is wanted, set the second
    % argument of manoptAD to 'egrad' or 'ehess'.

    % e.g., provide the gradient only and use an FD approximation of the
    % Hessian (which is often faster than computing the exact Hessian):
    % problem = manoptAD(problem, 'egrad');

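    % Likewise, to obtain only the Hessian via AD:
    % problem = manoptAD(problem, 'ehess');
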
    % Numerically check gradient and Hessian consistency.
    figure;
    checkgradient(problem);
    figure;
    checkhessian(problem);

    % Solve.
    [x, xcost, info] = trustregions(problem);          %#ok<ASGLU>
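    % Sanity check (commented out): at a global optimum, x is a dominant
    % eigenvector of A and xcost equals -max(eig(A)).
    % fprintf('Optimality gap: %g\n', xcost + max(eig(A)));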

    % Display some statistics.
    figure;
    semilogy([info.iter], [info.gradnorm], '.-');
    xlabel('Iteration #');
    ylabel('Gradient norm');
    title('Convergence of the trust-regions algorithm on the sphere');

end

Generated on Fri 30-Sep-2022 13:18:25 by m2html © 2005