Checks the consistency of the cost function and the gradient.
function checkgradient(problem, x, d)
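Based on the source below, checkgradient plots the first-order Taylor approximation error against the step size h (via checkdiff) and, when the manifold provides a 'tangent' function, also verifies that the gradient lies in the tangent space. Typical usage, shown here as a minimal sketch on the Rayleigh quotient over the sphere (the matrix A and the choice of spherefactory are illustrative, not part of checkgradient):

    n = 100;
    A = randn(n); A = .5*(A + A');   % Symmetric data matrix.
    problem.M = spherefactory(n);    % Optimize over the unit sphere.
    problem.cost  = @(x) -x'*A*x;    % Cost function.
    problem.egrad = @(x) -2*A*x;     % Euclidean gradient; Manopt converts it
                                     % to the Riemannian gradient.
    checkgradient(problem);          % x and d are picked at random if omitted.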
This function calls:
- canGetCost Checks whether the cost function can be computed for a problem structure.
- canGetGradient Checks whether the gradient can be computed for a problem structure.
- canGetPartialGradient Checks whether the partial gradient can be computed for a given problem.
- getGradient Computes the gradient of the cost function at x.
- checkdiff Checks the consistency of the cost function and directional derivatives.
This function is called by:
- doubly_stochastic_denoising Find a doubly stochastic matrix closest to a given matrix, in Frobenius norm.
- basic_example_AD A basic example that shows how to apply automatic differentiation to
- complex_example_AD A basic example that shows how to define the cost function for
- complextest_AD1 Test AD for a complex optimization problem on a product manifold (struct)
- complextest_AD2 Test AD for a complex optimization problem on a power manifold (cell)
- complextest_AD3 Test AD for a complex optimization problem on a manifold which is stored
- realtest_AD1 Test AD for a real optimization problem on a product manifold (struct)
- realtest_AD2 Test AD for a real optimization problem on a power manifold (cell)
- realtest_AD3 Test AD for a real optimization problem on a manifold which is stored in
- linearsystem_compare Example code for the algorithms described in
function checkgradient(problem, x, d)
% Checks the consistency of the cost function and the gradient.

    % Verify that the problem description is sufficient.
    if ~canGetCost(problem)
        if ~canGetPartialGradient(problem)
            error('getCost:checkgradient', 'It seems no cost was provided.');
        else
            error('getCost:stochastic', ['It seems no cost was provided.\n' ...
                  'If you intend to use a stochastic solver, you still\n' ...
                  'need to define problem.cost to use checkgradient.']);
        end
    end
    if ~canGetGradient(problem)
        warning('manopt:checkgradient:nograd', ...
                'It seems no gradient was provided.');
    end

    x_isprovided = exist('x', 'var') && ~isempty(x);
    d_isprovided = exist('d', 'var') && ~isempty(d);

    if ~x_isprovided && d_isprovided
        error('If d is provided, x must be too, since d is tangent at x.');
    end

    % If x and/or d are not specified, pick them at random.
    if ~x_isprovided
        x = problem.M.rand();
    end
    if ~d_isprovided
        d = problem.M.randvec(x);
    end

    % Check that the directional derivative and the gradient agree.
    checkdiff(problem, x, d, true);
    title(sprintf(['Gradient check.\nThe slope of the continuous line ' ...
                   'should match that of the dashed\n(reference) line ' ...
                   'over at least a few orders of magnitude for h.']));
    ylabel('Approximation error');

    % Try to check that the gradient is a tangent vector.
    if isfield(problem.M, 'tangent')
        storedb = StoreDB();
        key = storedb.getNewKey();
        grad = getGradient(problem, x, storedb, key);
        pgrad = problem.M.tangent(x, grad);
        residual = problem.M.lincomb(x, 1, grad, -1, pgrad);
        err = problem.M.norm(x, residual);
        fprintf('The residual should be 0, or very close. Residual: %g.\n', err);
        fprintf('If it is far from 0, then the gradient is not in the tangent space.\n');
        fprintf('In certain cases (e.g., hyperbolicfactory), the tangency test is inconclusive.\n');
    else
        fprintf(['Unfortunately, Manopt was unable to verify that the '...
                 'gradient is indeed a tangent vector.\nPlease verify ' ...
                 'this manually or implement the ''tangent'' function ' ...
                 'in your manifold structure.']);
    end

end
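For reference, the idea behind the plot produced above can be reproduced by hand. The following is a minimal sketch, not part of checkgradient, assuming problem, x and d are defined as in the usage example: since retractions agree with geodesics to first order, the Taylor error |f(R_x(h*d)) - f(x) - h*<grad f(x), d>_x| should decrease like h^2, that is, with slope 2 on a log-log plot.

    % Hand-rolled version of the slope check (illustrative sketch).
    h   = logspace(-8, 0, 51);
    f0  = getCost(problem, x);
    df0 = problem.M.inner(x, getGradient(problem, x), d);
    err = zeros(size(h));
    for k = 1 : numel(h)
        % Move away from x along d by a step h(k) using the retraction,
        % then measure the first-order Taylor approximation error.
        err(k) = abs(getCost(problem, problem.M.retr(x, d, h(k))) - f0 - df0*h(k));
    end
    loglog(h, err);  % Expect slope 2 over a few orders of magnitude of h.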