
doubly_stochastic_denoising

PURPOSE

Find a doubly stochastic matrix closest to a given matrix, in Frobenius norm.

SYNOPSIS

function doubly_stochastic_denoising()

DESCRIPTION

 Find a doubly stochastic matrix closest to a given matrix, in Frobenius norm.

 This example demonstrates how to use the geometry factories for the
 doubly stochastic multinomial manifold:
  multinomialdoublystochasticfactory and
  multinomialsymmetricfactory (for the symmetric case).
 
 The file is based on developments in the research paper
 A. Douik and B. Hassibi, "Manifold Optimization Over the Set 
 of Doubly Stochastic Matrices: A Second-Order Geometry"
 ArXiv:1802.02628, 2018.

 Link to the paper: https://arxiv.org/abs/1802.02628.

 Please cite the Manopt paper as well as the research paper:
 @Techreport{Douik2018Manifold,
   Title   = {Manifold Optimization Over the Set of Doubly Stochastic 
              Matrices: {A} Second-Order Geometry},
   Author  = {Douik, A. and Hassibi, B.},
   Journal = {Arxiv preprint ArXiv:1802.02628},
   Year    = {2018}
 }
 
 This can be a starting point for many optimization problems of the form:

 minimize f(X) such that X is a doubly stochastic matrix (symmetric or not),
 that is, an entrywise positive matrix whose rows and columns each sum to 1.

 Input:  None. This example file generates random data.
 
 Output: None.
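
 As an illustration, here is a minimal sketch (not part of the original
 file, assuming Manopt is on the path) that minimizes the negative entropy
 sum_ij X_ij*log(X_ij) over the doubly stochastic matrices; its solution
 approaches the uniform matrix with all entries equal to 1/n:

   n = 10;
   problem.M = multinomialdoublystochasticfactory(n);
   problem.cost  = @(X) sum(sum(X.*log(X)));  % negative entropy
   problem.egrad = @(X) log(X) + 1;           % Euclidean gradient
   problem.ehess = @(X, U) U./X;              % Euclidean Hessian along U
   X = trustregions(problem);                 % X is close to ones(n)/n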

 This file is part of Manopt: www.manopt.org.
 Original author: Ahmed Douik, March 15, 2018.
 Contributors:
 Change log:

CROSS-REFERENCE INFORMATION

This function calls:
This function is called by:

SOURCE CODE

function doubly_stochastic_denoising()
% Find a doubly stochastic matrix closest to a given matrix, in Frobenius norm.
%
% This example demonstrates how to use the geometry factories for the
% doubly stochastic multinomial manifold:
%  multinomialdoublystochasticfactory and
%  multinomialsymmetricfactory (for the symmetric case).
%
% The file is based on developments in the research paper
% A. Douik and B. Hassibi, "Manifold Optimization Over the Set
% of Doubly Stochastic Matrices: A Second-Order Geometry"
% ArXiv:1802.02628, 2018.
%
% Link to the paper: https://arxiv.org/abs/1802.02628.
%
% Please cite the Manopt paper as well as the research paper:
% @Techreport{Douik2018Manifold,
%   Title   = {Manifold Optimization Over the Set of Doubly Stochastic
%              Matrices: {A} Second-Order Geometry},
%   Author  = {Douik, A. and Hassibi, B.},
%   Journal = {Arxiv preprint ArXiv:1802.02628},
%   Year    = {2018}
% }
%
% This can be a starting point for many optimization problems of the form:
%
% minimize f(X) such that X is a doubly stochastic matrix (symmetric or not),
% that is, an entrywise positive matrix whose rows and columns each sum to 1.
%
% Input:  None. This example file generates random data.
%
% Output: None.
%
% This file is part of Manopt: www.manopt.org.
% Original author: Ahmed Douik, March 15, 2018.
% Contributors:
% Change log:

    %% Generate input data
    n = 100;
    sigma = 1/n^2;
    % Generate a doubly stochastic matrix using the Sinkhorn algorithm.
    B = doubly_stochastic(abs(randn(n, n)));
    % Add noise to the matrix.
    A = max(B + sigma*randn(n, n), 0.01);
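    % Note: the max(., 0.01) above keeps all entries strictly positive,
    % since the multinomial geometries used below only represent matrices
    % with positive entries (the metric involves division by the entries).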

    %% Setup an optimization problem for denoising

    % Manifold initialization: pick the symmetric or non-symmetric case.
    symmetric_case = true;
    if ~symmetric_case
        manifold = multinomialdoublystochasticfactory(n);
    else
        % The input must also be symmetric (otherwise, the gradient must
        % be adapted).
        A = (A+A')/2;
        manifold = multinomialsymmetricfactory(n);
    end
    problem.M = manifold;

    % Specify the cost function and its derivatives.
    problem.cost = @(X) 0.5*norm(A-X, 'fro')^2;
    problem.egrad = @(X) X - A;  % Euclidean gradient
    problem.ehess = @(X, U) U;   % Euclidean Hessian
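    % Note: since only Euclidean derivatives are supplied (egrad, ehess),
    % Manopt converts them automatically to the Riemannian gradient and
    % Hessian via the factory's egrad2rgrad and ehess2rhess functions.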

    % Check consistency of the gradient and the Hessian. This is useful
    % if you adapt this example for a new cost function and would like to
    % make sure there is no mistake. These checks are of course optional.
    warning('off', 'manopt:multinomialdoublystochasticfactory:exp');
    warning('off', 'manopt:multinomialsymmetricfactory:exp');
    figure();
    checkgradient(problem); % Check the gradient.
    figure();
    checkhessian(problem);  % Check the Hessian. This test usually fails
                            % due to the use of a first-order retraction,
                            % unless the test is performed at a critical point.

    %% Solve
    % A first-order algorithm:
    % minimize the cost function using the conjugate gradient method.
    [X1, ~, info, ~] = conjugategradient(problem);
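    % Optionally, solver options (tolerances, verbosity, ...) can be passed
    % in a struct as a third argument, for example:
    %   options.tolgradnorm = 1e-8;
    %   [X1, ~, info, ~] = conjugategradient(problem, [], options);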
    S1 = [info.gradnorm]; % Collect the gradient norm at each iteration.

    % Compute the distance between the original, noiseless matrix and the
    % solution found.
    fprintf('||X-B||_F^2 = %g\n', norm(X1 - B, 'fro')^2);

    % A second-order algorithm:
    % minimize the cost function using the trust-regions method.
    [X2, ~, info, ~] = trustregions(problem);
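    % Note: trustregions exploits the Hessian supplied through problem.ehess.
    % If it were omitted, Manopt would approximate the Hessian by finite
    % differences of the gradient (and issue a warning).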
    S2 = [info.gradnorm]; % Collect the gradient norm at each iteration.

    % Compute the distance between the original, noiseless matrix and the
    % solution found.
    fprintf('||X-B||_F^2 = %g\n', norm(X2 - B, 'fro')^2);


    %% Display
    figure();
    semilogy(S1, 'red', 'Marker', '*', 'LineWidth', 2, ...
                 'DisplayName', 'Conjugate Gradient Algorithm');
    hold on;
    semilogy(S2, 'blue', 'Marker', '+', 'LineWidth', 2, ...
                 'DisplayName', 'Trust Regions Method');

    set(gca, 'YGrid', 'on', 'XGrid', 'on');
    xlabel('Iteration number', 'FontSize', 16);
    ylabel('Gradient norm', 'FontSize', 16);
    legend1 = legend('show');
    set(legend1, 'FontSize', 16);

    % This Hessian test is performed at a computed solution, which is
    % expected to be a critical point. Thus, we expect the slope test to
    % succeed. It could fail if the solution X1 has entries very close to
    % zero, where numerical issues might come up, because the Fisher
    % metric on the multinomial manifold involves division by the
    % entries of X.
    figure();
    checkhessian(problem, X1);

end

Generated on Tue 19-May-2020 18:46:12 by m2html © 2005