mergeOptions — Merges two options structures with one having precedence over the other.

function opts = mergeOptions(opts1, opts2)

Input:  opts1 and opts2 are two structures.
Output: opts is a structure containing all fields of opts1 and opts2. Whenever a field is present in both opts1 and opts2, the value in opts2 is kept.

The typical usage is to have opts1 contain default options and opts2 contain user-specified options that overwrite the defaults.

See also: getGlobalDefaults
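For illustration, a minimal usage sketch of the precedence rule described above (the field names maxiter and tolgradnorm are standard Manopt option names; the specific values here are hypothetical):

```matlab
% Defaults chosen by a solver, then overridden by the caller.
defaults.maxiter = 1000;
defaults.tolgradnorm = 1e-6;

user.maxiter = 50;          % user-specified value takes precedence

opts = mergeOptions(defaults, user);
% opts.maxiter     -> 50     (from the second argument)
% opts.tolgradnorm -> 1e-6   (kept from the defaults)
```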

- low_rank_dist_completion Perform low-rank distance matrix completion with automatic rank detection.
- arc Adaptive regularization by cubics (ARC) minimization algorithm for Manopt.
- arc_conjugate_gradient Subproblem solver for ARC based on a nonlinear conjugate gradient method.
- arc_gradient_descent Subproblem solver for ARC based on gradient descent.
- arc_lanczos Subproblem solver for ARC based on a Lanczos process.
- barzilaiborwein Riemannian Barzilai-Borwein solver with non-monotone line-search.
- rlbfgs Riemannian limited memory BFGS solver for smooth objective functions.
- conjugategradient Conjugate gradient minimization algorithm for Manopt.
- approxgradientFD Gradient approximation function handle based on finite differences of the cost.
- approxhessianFD Hessian approximation function handle based on finite differences of the gradient.
- linesearch Standard line-search algorithm (step size selection) for descent methods.
- linesearch_adaptive Adaptive line search algorithm (step size selection) for descent methods.
- linesearch_decrease Backtracking line-search aiming merely for a decrease in cost value.
- linesearch_hint Armijo line-search based on the line-search hint in the problem structure.
- neldermead Nelder-Mead optimization algorithm for derivative-free minimization.
- preconhessiansolve Preconditioner based on the inverse Hessian, by solving linear systems.
- pso Particle swarm optimization (PSO) for derivative-free minimization.
- steepestdescent Steepest descent (gradient descent) minimization algorithm for Manopt.
- stepsize_sg Standard step size selection algorithm for the stochastic gradient method.
- stochasticgradient Stochastic gradient (SG) minimization algorithm for Manopt.
- trs_gep Solves trust-region subproblem with TRSgep in a subspace of tangent space.
- trs_tCG Truncated (Steihaug-Toint) Conjugate-Gradient method.
- trs_tCG_cached Truncated (Steihaug-Toint) Conjugate-Gradient method with caching.
- trustregions Riemannian trust-regions solver for optimization on manifolds.
- hessianextreme Compute an extreme eigenvector / eigenvalue of the Hessian of a problem.
- manoptsolve Gateway helper function to call a Manopt solver, chosen in the options.

function opts = mergeOptions(opts_sub, opts_master)
% Merges two options structures with one having precedence over the other.
%
% function opts = mergeOptions(opts_sub, opts_master)
%
% input:  opts_sub and opts_master are two structures.
% output: opts is a structure containing all fields of opts_sub and
%         opts_master. Whenever a field is present in both, the value in
%         opts_master is kept.
%
% The typical usage is to have opts_sub contain default options and
% opts_master contain user-specified options that overwrite the defaults.
%
% See also: getGlobalDefaults

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, Dec. 30, 2012.
% Contributors:
% Change log:

    % Treat empty inputs ([], for instance) as empty structures.
    if isempty(opts_sub)
        opts_sub = struct();
    end
    if isempty(opts_master)
        opts_master = struct();
    end

    % Start from opts_sub, then let each field of opts_master overwrite
    % the corresponding field (if any) of opts_sub.
    opts = opts_sub;
    fields = fieldnames(opts_master);
    for i = 1 : length(fields)
        opts.(fields{i}) = opts_master.(fields{i});
    end

end
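Note that this merge is shallow: when a field itself holds a nested structure, the assignment opts.(fields{i}) = opts_master.(fields{i}) replaces that field wholesale, rather than merging the two sub-structures recursively. A small sketch of the consequence (the field names below are hypothetical, not part of Manopt):

```matlab
a.linesearch.contraction = 0.5;
a.linesearch.suff_decr = 1e-4;

b.linesearch.contraction = 0.1;   % note: b has no suff_decr field

opts = mergeOptions(a, b);
% opts.linesearch is b.linesearch in its entirety:
% opts.linesearch.contraction              -> 0.1
% isfield(opts.linesearch, 'suff_decr')    -> false
```

If a recursive merge of nested option groups is needed, it has to be done field by field by the caller.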

Generated on Fri 30-Sep-2022 13:18:25.