function [x, k] = preconditioned_conjugate_gradient_method(Q, M, b, x0, tolerance, max_iterations)
% Solve the optimization problem,
%
%   min [phi(x) = (1/2)*<Qx,x> - <b,x>]
%
% using the preconditioned conjugate gradient method (14.54 in
% Guler [1]).
%
% INPUT:
%
%   - ``Q`` -- The coefficient matrix of the system to solve. Must
%     be positive definite.
%
%   - ``M`` -- The preconditioning matrix. If the actual matrix used
%     to precondition ``Q`` is called ``C``, i.e. ``C^(-1) * Q *
%     C^(-T) == \bar{Q}``, then ``M == C*C^T``.
%
%   - ``b`` -- The right-hand side of the system to solve.
%
%   - ``x0`` -- The starting point for the search.
%
%   - ``tolerance`` -- How close ``Qx`` has to be to ``b`` (in
%     magnitude) before we stop.
%
%   - ``max_iterations`` -- The maximum number of iterations to
%     perform.
%
% OUTPUT:
%
%   - ``x`` -- The solution to Qx == b.
%
%   - ``k`` -- The ending value of k; that is, the number of
%     iterations that were performed.
%
% NOTES:
%
% All vectors are assumed to be *column* vectors.
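%
% SAMPLE USAGE:
%
% A sketch, not taken from the original: the matrix, preconditioner,
% and tolerance below are invented for illustration, and
% ``conjugate_gradient_method`` must be on the load path.
%
%   Q = [5, 1; 1, 4];
%   M = diag(diag(Q));  % Jacobi (diagonal) preconditioner, M == C*C^T
%   b = [1; 2];
%   x0 = [0; 0];
%   [x, k] = preconditioned_conjugate_gradient_method(Q, M, b, x0, 1e-10, 100);
%   % Afterwards, x should approximately satisfy Q*x == b.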
% REFERENCES:
%
%   1. Guler, Osman. Foundations of Optimization. New York: Springer,
%      2010.
  % Compute C from M == C*C^T. chol() returns the upper-triangular
  % factor Ct satisfying M == Ct'*Ct, so C == Ct'.
  Ct = chol(M);
  C_inv = inv(Ct');
  Ct_inv = inv(Ct);

  Q_bar = C_inv * Q * Ct_inv;
  b_bar = C_inv * b;

  % The solution to Q_bar*x_bar == b_bar is x_bar = Ct*x, so we start
  % the search from Ct*x0 and transform the result back at the end.
  [x_bar, k] = conjugate_gradient_method(Q_bar, b_bar, Ct*x0, tolerance, max_iterations);
  x = Ct_inv * x_bar;