Checks the consistency of the cost function and the gradient.

function checkgradient(problem)
function checkgradient(problem, x)
function checkgradient(problem, x, d)

checkgradient performs a numerical test to check that the gradient defined in the problem structure agrees up to first order with the cost function at some point x, along some direction d. The test is based on a truncated Taylor series (see online Manopt documentation).

It is also tested that the gradient is indeed a tangent vector.

Both x and d are optional and will be sampled at random if omitted.

See also: checkdiff checkhessian
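
For orientation, here is a minimal usage sketch. It is not part of this file; it assumes Manopt is on the MATLAB path and uses spherefactory, a manifold factory that ships with Manopt:

    % Gradient check for a Rayleigh quotient cost on the unit sphere.
    n = 5;
    A = randn(n);  A = .5*(A + A');    % random symmetric matrix
    problem.M = spherefactory(n);      % unit sphere in R^n
    problem.cost  = @(x) -x'*(A*x);    % cost function
    problem.egrad = @(x) -2*(A*x);     % Euclidean gradient; Manopt converts it
    checkgradient(problem);            % x and d are sampled at random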

This function calls:
- StoreDB
- canGetCost Checks whether the cost function can be computed for a problem structure.
- canGetGradient Checks whether the gradient can be computed for a problem structure.
- canGetPartialGradient Checks whether the partial gradient can be computed for a given problem.
- getGradient Computes the gradient of the cost function at x (calling pattern sketched after this list).
- checkdiff Checks the consistency of the cost function and directional derivatives.
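
These caching helpers are used together. The calling pattern, as it appears in the tangency check of the source listing below, is:

    storedb = StoreDB();                           % cache shared between calls
    key = storedb.getNewKey();                     % key identifying the point x
    grad = getGradient(problem, x, storedb, key);  % gradient at x, with caching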

This function is called by:
- doubly_stochastic_denoising Find a doubly stochastic matrix closest to a given matrix, in Frobenius norm.
- basic_example_AD A basic example that shows how to apply automatic differentiation to ...
- complex_example_AD A basic example that shows how to define the cost function for ...
- complextest_AD1 Test AD for a complex optimization problem on a product manifold (struct)
- complextest_AD2 Test AD for a complex optimization problem on a power manifold (cell)
- complextest_AD3 Test AD for a complex optimization problem on a manifold which is stored ...
- realtest_AD1 Test AD for a real optimization problem on a product manifold (struct)
- realtest_AD2 Test AD for a real optimization problem on a power manifold (cell)
- realtest_AD3 Test AD for a real optimization problem on a manifold which is stored in ...
- linearsystem_compare Example code for the algorithms described in ...

Source code:
function checkgradient(problem, x, d)
% Checks the consistency of the cost function and the gradient.
%
% function checkgradient(problem)
% function checkgradient(problem, x)
% function checkgradient(problem, x, d)
%
% checkgradient performs a numerical test to check that the gradient
% defined in the problem structure agrees up to first order with the cost
% function at some point x, along some direction d. The test is based on a
% truncated Taylor series (see online Manopt documentation).
%
% It is also tested that the gradient is indeed a tangent vector.
%
% Both x and d are optional and will be sampled at random if omitted.
%
% See also: checkdiff checkhessian

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, Dec. 30, 2012.
% Contributors:
% Change log:
%
%   April 3, 2015 (NB):
%       Works with the new StoreDB class system.
%
%   Nov. 1, 2016 (NB):
%       Now calls checkdiff with force_gradient = true, instead of doing
%       an rmfield of problem.diff. This became necessary after
%       getGradient was updated to know how to compute the gradient from
%       directional derivatives.


    % Verify that the problem description is sufficient.
    if ~canGetCost(problem)
        % The call to canGetPartialGradient will readily issue a warning
        % if problem.ncostterms is not defined even though it is expected.
        if ~canGetPartialGradient(problem)
            error('getCost:checkgradient', 'It seems no cost was provided.');
        else
            error('getCost:stochastic', ['It seems no cost was provided.\n' ...
                  'If you intend to use a stochastic solver, you still\n' ...
                  'need to define problem.cost to use checkgradient.']);
        end
    end
    if ~canGetGradient(problem)
        warning('manopt:checkgradient:nograd', ...
                'It seems no gradient was provided.');
    end

    x_isprovided = exist('x', 'var') && ~isempty(x);
    d_isprovided = exist('d', 'var') && ~isempty(d);

    if ~x_isprovided && d_isprovided
        error('If d is provided, x must be too, since d is tangent at x.');
    end

    % If x and / or d are not specified, pick them at random.
    if ~x_isprovided
        x = problem.M.rand();
    end
    if ~d_isprovided
        d = problem.M.randvec(x);
    end

    %% Check that the gradient yields a first-order model of the cost.

    % Call checkdiff with force_gradient set to true, to force that
    % function to make a gradient call.
    checkdiff(problem, x, d, true);
    title(sprintf(['Gradient check.\nThe slope of the continuous line ' ...
                   'should match that of the dashed\n(reference) line ' ...
                   'over at least a few orders of magnitude for h.']));
    xlabel('h');
    ylabel('Approximation error');

    %% Try to check that the gradient is a tangent vector.
    if isfield(problem.M, 'tangent')
        storedb = StoreDB();
        key = storedb.getNewKey();
        grad = getGradient(problem, x, storedb, key);
        pgrad = problem.M.tangent(x, grad);
        residual = problem.M.lincomb(x, 1, grad, -1, pgrad);
        err = problem.M.norm(x, residual);
        fprintf('The residual should be 0, or very close. Residual: %g.\n', err);
        fprintf(['If it is far from 0, then the gradient is not in the ' ...
                 'tangent space.\n']);
        fprintf(['In certain cases (e.g., hyperbolicfactory), the ' ...
                 'tangency test is inconclusive.\n']);
    else
        fprintf(['Unfortunately, Manopt was unable to verify that the ' ...
                 'gradient is indeed a tangent vector.\nPlease verify ' ...
                 'this manually or implement the ''tangent'' function ' ...
                 'in your manifold structure.']);
    end

end
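
The first-order test that checkdiff carries out can also be written by hand: if the gradient is correct, the Taylor remainder f(Exp_x(h*d)) - f(x) - h*<grad f(x), d> decays like h^2, that is, with slope 2 on a log-log plot. Below is a hedged sketch of that check; it assumes problem.cost is a plain function handle and that getGradient accepts the short two-argument call (checkdiff itself handles caching and plotting more carefully):

    x = problem.M.rand();                      % random point on the manifold
    d = problem.M.randvec(x);                  % random tangent vector at x
    f0  = problem.cost(x);
    df0 = problem.M.inner(x, getGradient(problem, x), d);
    h   = logspace(-8, 0, 51);
    err = zeros(size(h));
    for k = 1 : numel(h)
        xt = problem.M.exp(x, d, h(k));        % step of size h(k) along d
        err(k) = abs(problem.cost(xt) - f0 - h(k)*df0);
    end
    loglog(h, err);                            % correct gradient: slope close to 2

The tangency check at the end of the listing is complementary: it projects the gradient onto the tangent space with M.tangent and reports the norm of the difference, so a residual far from zero means the gradient was returned in the wrong representation.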
