function alpha = step_length_cgm(r, A, p)
- ##
- ## Compute the step length for the conjugate gradient method (CGM).
- ## The CGM attempts to solve,
- ##
- ## Ax = b
- ##
- ## or equivalently,
- ##
- ## min[phi(x) = (1/2)<Ax,x> - <b,x>]
- ##
- ## where ``A`` is positive-definite. In the process, we need to
- ## compute a number of search directions ``p`` and optimal step
- ## lengths ``alpha``; i.e.,
- ##
- ## x_{k+1} = x_{k} + alpha_{k}*p_{k}
- ##
- ## This function computes alpha_{k} in the formula above.
- ##
- ## INPUT:
- ##
- ## - ``r`` -- The residual, Ax - b, at the current step.
- ##
- ## - ``A`` -- The matrix ``A`` in the formulation above.
- ##
- ## - ``p`` -- The current search direction.
- ##
- ## OUTPUT:
- ##
- ## - ``alpha`` -- The minimizer of ``f(x) = x + alpha*p`` along ``p`.
- ##
- ## NOTES:
- ##
- ## All vectors are assumed to be *column* vectors.
- ##
+ %
+ % Compute the step length for the conjugate gradient method (CGM).
+ % The CGM attempts to solve,
+ %
+ % Ax = b
+ %
+ % or equivalently,
+ %
+ % min[phi(x) = (1/2)<Ax,x> - <b,x>]
+ %
+ % where ``A`` is positive-definite. In the process, we need to
+ % compute a number of search directions ``p`` and optimal step
+ % lengths ``alpha``; i.e.,
+ %
+ % x_{k+1} = x_{k} + alpha_{k}*p_{k}
+ %
+ % This function computes alpha_{k} in the formula above.
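+ %
+ % Since the gradient of phi at x_{k} is r_{k} = A*x_{k} - b
+ % (see NOTES), setting d/dalpha[phi(x_{k} + alpha*p_{k})] = 0
+ % yields the closed form
+ %
+ %     alpha_{k} = -<r_{k},p_{k}> / <p_{k},A*p_{k}>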
+ %
+ % INPUT:
+ %
+ % - ``r`` -- The residual, Ax - b, at the current step.
+ %
+ % - ``A`` -- The matrix ``A`` in the formulation above.
+ %
+ % - ``p`` -- The current search direction.
+ %
+ % OUTPUT:
+ %
+ % - ``alpha`` -- The step length minimizing ``phi(x + alpha*p)``
+ %   along the direction ``p``.
+ %
+ % NOTES:
+ %
+ % All vectors are assumed to be *column* vectors.
+ %
- ## A simple calculation should convince you that the gradient of
- ## phi(x) above is Ax - b == r.
+ % A simple calculation should convince you that the gradient of
+ % phi(x) above is Ax - b == r.
alpha = step_length_positive_definite(r, A, p);
end
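
Note: ``step_length_positive_definite`` is defined elsewhere in the
repository; the following is a minimal sketch of the computation it
is expected to perform under the r = Ax - b convention documented
above, not its actual implementation.

function alpha = step_length_positive_definite(r, A, p)
  % Exact line search for phi(x) = (1/2)<Ax,x> - <b,x>:
  % d/dalpha[phi(x + alpha*p)] = <r,p> + alpha*<p,A*p> = 0, so
  % alpha = -<r,p>/<p,A*p>. The denominator is positive for any
  % nonzero p because A is positive-definite.
  alpha = -(r' * p) / (p' * (A * p));
end

One CG step using this function would then read
``alpha = step_length_cgm(r, A, p); x = x + alpha*p;`` followed by
the residual update ``r = r + alpha*(A*p);``, matching the update
formula in the docstring.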