function alpha = step_length_cgm(r, A, p)
  %
  % Compute the step length for the conjugate gradient method (CGM).
  % The CGM attempts to solve,
  %
  %   Ax = b
  %
  % or equivalently,
  %
  %   min[phi(x) = (1/2)<Ax,x> - <b,x>]
  %
  % where ``A`` is positive-definite. In the process, we need to
  % compute a number of search directions ``p`` and optimal step
  % lengths ``alpha``; i.e.,
  %
  %   x_{k+1} = x_{k} + alpha_{k}*p_{k}
  %
  % This function computes alpha_{k} in the formula above.
  %
  % INPUT:
  %
  % - ``r`` -- The residual, Ax - b, at the current step.
  %
  % - ``A`` -- The matrix ``A`` in the formulation above.
  %
  % - ``p`` -- The current search direction.
  %
  % OUTPUT:
  %
  % - ``alpha`` -- The step length that minimizes ``phi(x + alpha*p)``
  %   along ``p``.
  %
  % NOTES:
  %
  % All vectors are assumed to be *column* vectors.
  %
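  % EXAMPLE:
  %
  % A minimal sketch of one CGM update built around this function;
  % the matrix, right-hand side, and initial guess below are made up
  % purely for illustration:
  %
  %   A = [2, 0; 0, 4];
  %   b = [1; 1];
  %   x = [0; 0];                          % initial guess
  %   r = A*x - b;                         % residual at x
  %   p = -r;                              % first CGM search direction
  %   alpha = step_length_cgm(r, A, p);
  %   x = x + alpha*p;                     % take the step
  %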

  % A simple calculation should convince you that the gradient of
  % phi(x) above is Ax - b == r.
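  % Setting the derivative of phi(x + alpha*p) with respect to alpha
  % equal to zero then gives the exact minimizer along ``p``, namely
  % alpha = -<r,p>/<p,A*p>; computing that quotient is what we assume
  % step_length_positive_definite does for a positive-definite ``A``.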
  alpha = step_length_positive_definite(r, A, p);
end