%
% OUTPUT:
%
% - ``x`` - The computed solution to Qx=b.
%
% - ``k`` - The ending value of k; that is, the number of
% iterations that were performed.
%
% NOTE: In the reference's statement of the "Conjugate-Gradient
% Method", we are supposed to define d_{0} = -z_{0}, not -r_{0} as
% written.
%
% The rather verbose name of this function was chosen to avoid
% conflicts with other implementations.
%
% REFERENCES:
%
% 1. Guler, Osman. Foundations of Optimization. New York, Springer,
%    2010.
%
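% For orientation, the loop below follows the standard preconditioned
% conjugate-gradient recurrences. This is a sketch of the textbook
% iteration, not a verbatim quote of the reference; M denotes the
% preconditioner, assumed symmetric positive-definite:
%
%   r_{0} = Q*x_{0} - b,   z_{0} = M \ r_{0},   d_{0} = -z_{0}
%   alpha_{k}  = (r_{k}' * z_{k}) / (d_{k}' * Q * d_{k})
%   x_{k+1}    = x_{k} + alpha_{k} * d_{k}
%   r_{k+1}    = r_{k} + alpha_{k} * Q * d_{k}
%   z_{k+1}    = M \ r_{k+1}
%   beta_{k+1} = (r_{k+1}' * z_{k+1}) / (r_{k}' * z_{k})
%   d_{k+1}    = -z_{k+1} + beta_{k+1} * d_{k}
%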
% Set k=0 first, that way the references to xk,rk,zk,dk which
% immediately follow correspond (semantically) to x0,r0,z0,d0.
k = 0;
xk = x0;
rk = r0; % r0 and z0 are assumed to have been computed above.
zk = z0;
dk = -zk;
for k = 0 : max_iterations
  % Check our stopping condition. This should catch the k=0 case.
  if (norm(rk) < tolerance)
    x = xk;
    return;
  end
  % The quantities rk'*zk and Q*dk are each used twice below; rather
  % than recompute them, we precompute them both, since they are the
  % more expensive operations.
  rkzk = rk' * zk;
  Qdk = Q * dk;
  % After substituting the two previously-created variables, the
  % following algorithm occurs verbatim in the reference. (M is
  % assumed to be the preconditioner supplied to this function.)
  alpha_k = rkzk/(dk' * Qdk);
  x_next = xk + (alpha_k * dk);
  r_next = rk + (alpha_k * Qdk);
  z_next = M \ r_next;
  beta_next = (r_next' * z_next)/rkzk;
  d_next = -z_next + (beta_next * dk);
  xk = x_next;
  rk = r_next;
  zk = z_next;
  dk = d_next;
end
% The algorithm didn't converge, but we still want to return the
% terminal value of xk.
x = xk;
end