
dominant_invariant_subspace_complex

PURPOSE

Returns a unitary basis of the dominant invariant p-subspace of A.

SYNOPSIS

function [X, info] = dominant_invariant_subspace_complex(A, p)

DESCRIPTION

 Returns a unitary basis of the dominant invariant p-subspace of A.

 function X = dominant_invariant_subspace_complex(A, p)

 Input: A complex, Hermitian matrix A of size nxn and an integer p < n.
 Output: A complex, unitary matrix X of size nxp such that trace(X'*A*X)
         is maximized. That is, the columns of X form a unitary basis
         of a dominant subspace of dimension p of A.

 The optimization is performed on the complex Grassmann manifold, since
 only the space spanned by the columns of X matters.
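Since A is Hermitian, the maximizer is known in closed form: the span of the p eigenvectors associated with the p largest eigenvalues of A. The following NumPy sketch (an illustration for cross-checking, not part of Manopt) verifies this and shows why only the column span of X matters: the cost is invariant under X -> X*Q for any unitary Q, which is exactly the invariance the Grassmann quotient removes.

```python
import numpy as np

# Random Hermitian test matrix, mirroring the defaults in the example.
rng = np.random.default_rng(0)
n, p = 128, 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.conj().T) / 2

# Closed-form optimum: eigenvectors of the p largest eigenvalues.
w, V = np.linalg.eigh(A)          # eigenvalues in ascending order
X = V[:, -p:]                     # orthonormal basis of the dominant p-subspace
cost = np.real(np.trace(X.conj().T @ A @ X))
assert np.isclose(cost, w[-p:].sum())

# Only the span matters: the cost is invariant under X -> X @ Q
# for any unitary p x p matrix Q.
Q, _ = np.linalg.qr(rng.standard_normal((p, p))
                    + 1j * rng.standard_normal((p, p)))
cost_Q = np.real(np.trace((X @ Q).conj().T @ A @ (X @ Q)))
assert np.isclose(cost, cost_Q)
```

This is why the example optimizes on the Grassmann manifold rather than the Stiefel manifold: points of the Grassmann manifold are subspaces, so the U(p) ambiguity above is quotiented out.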

 See dominant_invariant_subspace for more details about the real case.

 See also: dominant_invariant_subspace grassmanncomplexfactory

CROSS-REFERENCE INFORMATION

This function calls:
This function is called by:

SOURCE CODE

0001 function [X, info] = dominant_invariant_subspace_complex(A, p)
0002 % Returns a unitary basis of the dominant invariant p-subspace of A.
0003 %
0004 % function X = dominant_invariant_subspace_complex(A, p)
0005 %
0006 % Input: A complex, Hermitian matrix A of size nxn and an integer p < n.
0007 % Output: A complex, unitary matrix X of size nxp such that trace(X'*A*X)
0008 %         is maximized. That is, the columns of X form a unitary basis
0009 %         of a dominant subspace of dimension p of A.
0010 %
0011 % The optimization is performed on the complex Grassmann manifold, since
0012 % only the space spanned by the columns of X matters.
0013 %
0014 % See dominant_invariant_subspace for more details in the real case.
0015 %
0016 % See also: dominant_invariant_subspace grassmanncomplexfactory
0017 
0018 % This file is part of Manopt and is copyrighted. See the license file.
0019 %
0020 % Main author: Nicolas Boumal, June 30, 2015
0021 % Contributors:
0022 %
0023 % Change log:
0024 %
0025 %   Xiaowen Jiang Aug. 31, 2021
0026 %       Added AD to compute the egrad and the ehess
0027 
0028 
0029     % Generate some random data to test the function
0030     if ~exist('A', 'var') || isempty(A)
0031         A = randn(128) + 1i*randn(128);
0032         A = (A+A')/2;
0033     end
0034     if ~exist('p', 'var') || isempty(p)
0035         p = 3;
0036     end
0037     
0038     % Make sure the input matrix is Hermitian
0039     n = size(A, 1);
0040     assert(size(A, 2) == n, 'A must be square.');
0041     assert(norm(A-A', 'fro') < n*eps, 'A must be Hermitian.');
0042     assert(p <= n, 'p must not exceed n.');
0043     
0044     % Define the cost and its derivatives on the complex Grassmann manifold
0045     Gr = grassmanncomplexfactory(n, p);
0046     problem.M = Gr;
0047     problem.cost  = @(X)    -real(trace(X'*A*X));
0048     problem.egrad = @(X)    -2*A*X;
0049     problem.ehess = @(X, H) -2*A*H;
0050     
0051     % An alternative way to compute the egrad and the ehess is to use
0052     % automatic differentiation provided in the deep learning toolbox
0053 % (slower). AD does not support complex numbers if the Matlab version
0054 % is R2021a or earlier; in that case, the cost function must be
0055 % defined differently. See complex_example_AD.m and manoptADhelp.m
0056 % for more information.
0057     % problem.cost = @cost_complex;
0058     %    function f = cost_complex(X)
0059     %        AX = cprod(A,X);
0060     %        Xtransp = ctransp(X);
0061     %        product = cprod(Xtransp,AX);
0062     %        f = -creal(ctrace(product));
0063     %    end
0064     % call manoptAD to automatically obtain the egrad and the ehess
0065     % problem = manoptAD(problem);
0066     
0067     % If the version of Matlab installed is R2021b or later, specify the
0068     % cost function in the normal way and call manoptAD. Notice that
0069     % the function trace is not supported for AD so far. Replace it with
0070     % ctrace described in the file manoptADhelp.m
0071     % problem.cost  = @(X)    -real(ctrace(X'*A*X));
0072     % problem = manoptAD(problem);
0073 
0074     % Execute some checks on the derivatives for early debugging.
0075     % These can be commented out.
0076     % checkgradient(problem);
0077     % pause;
0078     % checkhessian(problem);
0079     % pause;
0080     
0081     % Issue a call to a solver. A random initial guess will be chosen and
0082     % default options are selected except for the ones we specify here.
0083     options.Delta_bar = 8*sqrt(p);
0084     [X, costX, info, options] = trustregions(problem, [], options); %#ok<ASGLU>
0085     
0086     fprintf('Options used:\n');
0087     disp(options);
0088     
0089     % For our information, Manopt can also compute the spectrum of the
0090     % Riemannian Hessian on the tangent space at (any) X. Computing the
0091     % spectrum at the solution gives us some idea of the conditioning of
0092     % the problem. If we were to implement a preconditioner for the
0093     % Hessian, this would also inform us on its performance.
0094     %
0095     % Notice that (typically) all eigenvalues of the Hessian at the
0096     % solution are positive, i.e., we find an isolated minimizer. If we
0097 % replace the Grassmann manifold by the (complex) Stiefel manifold,
0098 % hence still optimizing over matrices with orthonormal columns but
0099 % ignoring the invariance cost(XQ) = cost(X) for all Q unitary, then
0100 % we see dim U(p) = p^2 zero eigenvalues in the Hessian spectrum,
0101 % making the optimizer no longer isolated.
0102     if Gr.dim() < 512
0103         evs = hessianspectrum(problem, X);
0104         stairs(sort(evs));
0105         title(['Eigenvalues of the Hessian of the cost function ' ...
0106                'at the solution']);
0107         xlabel('Eigenvalue number (sorted)');
0108         ylabel('Value of the eigenvalue');
0109     end
0110 
0111 end
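
The egrad supplied above is the Euclidean gradient of the cost; on the complex Grassmann manifold, Manopt obtains the Riemannian gradient by projecting it onto the horizontal space, i.e., multiplying by (I - X X^H). At any invariant subspace, A X = X (X^H A X), so this projection vanishes, confirming first-order optimality of the dominant subspace. A NumPy sketch of that check (an illustration; the projection formula is the standard one for Grassmann, not code from this file):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 64, 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.conj().T) / 2

# X spans an invariant subspace of A (here, the top-p eigenvectors).
_, V = np.linalg.eigh(A)
X = V[:, -p:]

# Euclidean gradient of the cost f(X) = -real(trace(X' A X)).
egrad = -2 * A @ X

# Riemannian gradient on Grassmann: project egrad onto the
# horizontal space, rgrad = (I - X X^H) egrad.
rgrad = egrad - X @ (X.conj().T @ egrad)

# Since A X = X (X^H A X) at an invariant subspace, rgrad vanishes.
assert np.linalg.norm(rgrad) < 1e-10
```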

Generated on Fri 30-Sep-2022 13:18:25 by m2html © 2005