linesearch

PURPOSE

Standard line-search algorithm (step size selection) for descent methods.

SYNOPSIS

function [stepsize, newx, newkey, lsstats] = linesearch(problem, x, d, f0, df0, options, storedb, key)

DESCRIPTION

 Standard line-search algorithm (step size selection) for descent methods.

 function [stepsize, newx, newkey, lsstats] = 
                 linesearch(problem, x, d, f0, df0, options, storedb, key)

 Base line-search algorithm for descent methods, built on simple
 backtracking. The search direction provided must be a descent
 direction, as indicated by a negative df0, the directional derivative
 of f at x along d.

 The algorithm is invariant under positive scaling and offsetting of the
 cost function: if f is replaced by 8*f + 3, for example, the returned
 step size is the same. Furthermore, the returned step size is
 independent of the norm of the search direction vector d: only the
 direction of d matters.
 
 Below, the step is constructed as alpha*d, and the step size is the norm
 of that vector, thus: stepsize = alpha*norm_d. The step is executed by
 retracting the vector alpha*d from the current point x, giving newx.
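 For example, with alpha = 0.25 and norm_d = 2, the retracted vector
 alpha*d has norm 0.5, hence stepsize = 0.5.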

 This line-search may create and maintain a structure called lsmem inside
 storedb.internal. This gives the line-search the opportunity to remember
 what happened in previous calls, typically to make a first guess at the
 step size based on previous events.
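
 Concretely, the fields maintained there (see the source code below) are:

   storedb.internal.lsmem.f0        % cost at the previous point
   storedb.internal.lsmem.df0       % directional derivative there
   storedb.internal.lsmem.stepsize  % step size selected there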

 Inputs

  problem : structure holding the description of the optimization problem
  x : current point on the manifold problem.M
  d : tangent vector at x (descent direction) -- its norm is irrelevant
  f0 : cost value at x
  df0 : directional derivative at x along d
  options : options structure (see in code for usage)
  storedb : StoreDB object (handle class: passed by reference) for caching
  key : key associated to point x in storedb

  options, storedb and key are optional.

 Outputs

  stepsize : norm of the vector retracted to reach newx from x.
  newx : next iterate suggested by the line-search algorithm, such that
         the retraction at x of the vector alpha*d reaches newx.
  newkey : key associated to newx in storedb
  lsstats : statistics about the line-search procedure
            (step size, number of cost evaluations, etc.)
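
 Example: a minimal sketch of a typical call (assumes Manopt is on the
 path; the cost, point and direction below are illustrative, not part of
 this function):

   problem.M = spherefactory(5);
   A = randn(5); A = .5*(A + A');
   problem.cost  = @(x) x'*A*x;
   problem.egrad = @(x) 2*A*x;
   x = problem.M.rand();
   storedb = StoreDB();
   key = storedb.getNewKey();
   f0 = getCost(problem, x, storedb, key);
   g = getGradient(problem, x, storedb, key);
   d = problem.M.lincomb(x, -1, g);   % steepest-descent direction
   df0 = problem.M.inner(x, g, d);    % negative since d = -g
   [stepsize, newx] = linesearch(problem, x, d, f0, df0, ...
                                 struct(), storedb, key);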

 See also: steepestdescent conjugategradients linesearch_adaptive

SOURCE CODE

function [stepsize, newx, newkey, lsstats] = ...
                  linesearch(problem, x, d, f0, df0, options, storedb, key)
% Standard line-search algorithm (step size selection) for descent methods.
%
% function [stepsize, newx, newkey, lsstats] =
%                 linesearch(problem, x, d, f0, df0, options, storedb, key)
%
% Base line-search algorithm for descent methods, built on simple
% backtracking. The search direction provided must be a descent
% direction, as indicated by a negative df0, the directional derivative
% of f at x along d.
%
% The algorithm is invariant under positive scaling and offsetting of the
% cost function: if f is replaced by 8*f + 3, for example, the returned
% step size is the same. Furthermore, the returned step size is
% independent of the norm of the search direction vector d: only the
% direction of d matters.
%
% Below, the step is constructed as alpha*d, and the step size is the norm
% of that vector, thus: stepsize = alpha*norm_d. The step is executed by
% retracting the vector alpha*d from the current point x, giving newx.
%
% This line-search may create and maintain a structure called lsmem inside
% storedb.internal. This gives the line-search the opportunity to remember
% what happened in previous calls, typically to make a first guess at the
% step size based on previous events.
%
% Inputs
%
%  problem : structure holding the description of the optimization problem
%  x : current point on the manifold problem.M
%  d : tangent vector at x (descent direction) -- its norm is irrelevant
%  f0 : cost value at x
%  df0 : directional derivative at x along d
%  options : options structure (see in code for usage)
%  storedb : StoreDB object (handle class: passed by reference) for caching
%  key : key associated to point x in storedb
%
%  options, storedb and key are optional.
%
% Outputs
%
%  stepsize : norm of the vector retracted to reach newx from x
%  newx : next iterate suggested by the line-search algorithm, such that
%         the retraction at x of the vector alpha*d reaches newx
%  newkey : key associated to newx in storedb
%  lsstats : statistics about the line-search procedure
%            (step size, number of cost evaluations, etc.)
%
% See also: steepestdescent conjugategradients linesearch_adaptive

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, Dec. 30, 2012.
% Contributors:
% Change log:
%
%   Sept. 13, 2013 (NB):
%       User control over the parameters of the linesearch via the options
%       ls_contraction_factor, ls_optimism, ls_suff_decr and ls_max_steps.
%       See in code for the effect of those.
%
%   Sept. 13, 2013 (NB):
%       The automatic direction reversal feature was removed (it triggered
%       when df0 > 0). Direction reversal is a decision that needs to be
%       made by the solver, so it can know about it.
%
%   Sept. 13, 2013 (NB):
%       The linesearch is now invariant under rescaling of the cost
%       function f. That is, if f is replaced by 8*f (and hence the
%       directional derivatives of f are scaled accordingly), the
%       computed stepsizes will not change.
%
%   Nov. 7, 2013 (NB):
%       The linesearch is now invariant under rescaling of the search
%       direction d. The meaning of stepsize is also more clear in the
%       comments. Added a parameter ls_initial_stepsize to give users
%       control over the first step size trial.
%
%   April 3, 2015 (NB):
%       Works with the new StoreDB class system.
%
%   April 8, 2015 (NB):
%       Got rid of lsmem input/output: now maintained in storedb.internal.
%
%   Oct. 7, 2016 (NB):
%       Thanks to Wen Huang, a bug was corrected in the logic around
%       lsmem handling. Specifically, lsmem = storedb.internal.lsmem;
%       was erroneously coded as lsmem = storedb.internal;
%
%   Aug. 2, 2018 (NB):
%       Now using storedb.remove() to keep the cache lean.


    % Allow omission of the key, and even of storedb.
    if ~exist('key', 'var')
        if ~exist('storedb', 'var')
            storedb = StoreDB();
        end
        key = storedb.getNewKey();
    end

    % Backtracking default parameters. These can be overwritten in the
    % options structure which is passed to the solver.
    default_options.ls_contraction_factor = .5;
    default_options.ls_optimism = 1/.5;
    default_options.ls_suff_decr = 2^-13;
    default_options.ls_max_steps = 25;
    default_options.ls_initial_stepsize = 1;
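    % In words:
    %   ls_contraction_factor : shrinks alpha at each backtracking step
    %   ls_optimism : scales up the first guess computed from the
    %                 previous iteration, to probe a bit farther
    %   ls_suff_decr : Armijo sufficient-decrease parameter
    %   ls_max_steps : cap on cost evaluations per call
    %   ls_initial_stepsize : norm of the first trial step when no
    %                         usable memory is available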

    if ~exist('options', 'var') || isempty(options)
        options = struct();
    end
    options = mergeOptions(default_options, options);

    contraction_factor = options.ls_contraction_factor;
    optimism = options.ls_optimism;
    suff_decr = options.ls_suff_decr;
    max_ls_steps = options.ls_max_steps;
    initial_stepsize = options.ls_initial_stepsize;

    % Compute the norm of the search direction.
    % This is useful to make the linesearch algorithm invariant under the
    % scaling of d. The rationale is that the important information is the
    % search direction, not the size of that vector. The question of how
    % far we should go is precisely what the linesearch algorithm is
    % supposed to answer: the calling algorithm should not need to care.
    norm_d = problem.M.norm(x, d);

    % At first, we have no idea of what the step size should be.
    alpha = NaN;

    % If we know what happened at the previous step, we can leverage
    % that to compute an initial guess for the step size, as inspired by
    % Nocedal & Wright, p. 59.
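    % (One way to read the formula below: along a quadratic model of the
    % cost in alpha, the decrease attained at the minimizer is
    % alpha*df0/2. Requiring that decrease to match the decrease
    % f0 - lsmem.f0 obtained at the previous iteration yields
    % alpha = 2*(f0 - lsmem.f0)/df0.)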
    if isfield(storedb.internal, 'lsmem')
        lsmem = storedb.internal.lsmem;
        if isfield(lsmem, 'f0')
            % Pick initial step size based on where we were last time,
            alpha = 2*(f0 - lsmem.f0) / df0;
            % and go look a little further (or less far), just in case.
            alpha = optimism*alpha;
        end
    end

    % If we have no information about the previous iteration (maybe this
    % is the first one?) or if the above formula gave a step size that is
    % too small (perhaps even negative), then fall back to a user supplied
    % suggestion for the first step size (the "a priori").
    % At any rate, the choice should be invariant under rescaling of the
    % cost function f and of the search direction d, and it should be
    % bounded away from zero for convergence guarantees. We must allow it
    % to be close to zero though, for fine convergence.
    if isnan(alpha) || alpha*norm_d <= eps
        alpha = initial_stepsize/norm_d;
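        % This way, the first trial vector alpha*d has norm exactly
        % initial_stepsize.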
    end


    % Make the chosen step and compute the cost there.
    newx = problem.M.retr(x, d, alpha);
    newkey = storedb.getNewKey();
    newf = getCost(problem, newx, storedb, newkey);
    cost_evaluations = 1;

    % Backtrack while the Armijo criterion is not satisfied
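    % (sufficient decrease: since df0 < 0, we accept alpha as soon as the
    % actual decrease f0 - newf is at least suff_decr times the decrease
    % -alpha*df0 predicted by the first-order model).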
    while newf > f0 + suff_decr*alpha*df0

        % Reduce the step size,
        alpha = contraction_factor * alpha;

        % and look closer down the line.
        storedb.remove(newkey);              % we no longer need this cache
        newx = problem.M.retr(x, d, alpha);
        newkey = storedb.getNewKey();
        newf = getCost(problem, newx, storedb, newkey);
        cost_evaluations = cost_evaluations + 1;

        % Make sure we don't run out of budget.
        if cost_evaluations >= max_ls_steps
            break;
        end

    end

    % If we got here without obtaining a decrease, we reject the step.
    if newf > f0
        alpha = 0;
        newx = x;
        newkey = key;
        newf = f0; %#ok<NASGU>
    end

    % As seen outside this function, stepsize is the size of the vector we
    % retract to make the step from x to newx. Since the step is alpha*d:
    stepsize = alpha * norm_d;

    % Save the situation faced now so that, at the next iteration,
    % we will know something about the previous decision.
    storedb.internal.lsmem.f0 = f0;
    storedb.internal.lsmem.df0 = df0;
    storedb.internal.lsmem.stepsize = stepsize;

    % Return some statistics also, for possible analysis.
    lsstats.costevals = cost_evaluations;
    lsstats.stepsize = stepsize;
    lsstats.alpha = alpha;

end
