Choosing the Value of $\omega$




   

If $\omega = 1$, the SOR method simplifies to the Gauss-Seidel method. A theorem due to Kahan [130] shows that SOR fails to converge if $\omega$ is outside the interval $(0,2)$. Though technically the term underrelaxation should be used when $0 < \omega < 1$, for convenience the term overrelaxation is now used for any value of $\omega \in (0,2)$.
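To make the role of $\omega$ concrete, the following Python sketch performs SOR sweeps for $Ax=b$; the function name sor_solve, the dense-matrix storage, and the stopping test are illustrative assumptions rather than the book's template. Setting omega = 1 reduces each update to the plain Gauss-Seidel step, while $1 < \omega < 2$ over-corrects each component.

```python
import numpy as np

def sor_solve(A, b, omega=1.0, tol=1e-8, max_iter=10_000):
    """Solve A x = b by SOR; omega is the relaxation parameter in (0, 2)."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Gauss-Seidel value for component i, using the newest entries of x
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            gs = (b[i] - sigma) / A[i, i]
            # Relax: omega = 1 recovers Gauss-Seidel exactly
            x[i] = (1 - omega) * x_old[i] + omega * gs
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x
```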

In general, it is not possible to compute in advance the value of $\omega$ that is optimal with respect to the rate of convergence of SOR. Even when it is possible to compute the optimal value for $\omega$, the expense of such computation is usually prohibitive. Frequently, some heuristic estimate is used, such as $\omega = 2 - O(h)$, where $h$ is the mesh spacing of the discretization of the underlying physical domain.

If the coefficient matrix is symmetric and positive definite, the SOR iteration is guaranteed to converge for any value of $\omega$ between 0 and 2, though the choice of $\omega$ can significantly affect the rate at which the SOR iteration converges. Sophisticated implementations of the SOR algorithm (such as that found in ITPACK [140]) employ adaptive parameter estimation schemes to try to home in on the appropriate value of $\omega$ by estimating the rate at which the iteration is converging.

Adaptive methods: iterative methods that collect information about the coefficient matrix during the iteration process, and use this to speed up convergence.

Matrix properties: we call a square matrix $A$

Symmetric
if $a_{i,j} = a_{j,i}$ for all $i$, $j$.
Positive definite
if it satisfies $x^{T}Ax > 0$ for all nonzero vectors $x$.
Diagonally dominant
if $|a_{i,i}| > \sum_{j \neq i} |a_{i,j}|$ for all $i$.
An M-matrix
if $a_{i,j} \leq 0$ for $i \neq j$, and it is nonsingular with $(A^{-1})_{i,j} \geq 0$ for all $i$, $j$.

For coefficient matrices of a special class called consistently ordered with property A (see Young [217]), which includes certain orderings of matrices arising from the discretization of elliptic PDEs, there is a direct relationship between the spectra of the Jacobi and SOR iteration matrices. In principle, given the spectral radius $\rho$ of the Jacobi iteration matrix, one can determine a priori the theoretically optimal value of $\omega$ for SOR:

$$\omega_{\mathrm{opt}} = \frac{2}{1 + \sqrt{1 - \rho^{2}}}.$$

This is seldom done, since calculating the spectral radius of the Jacobi matrix requires an impractical amount of computation. However, relatively inexpensive rough estimates of $\rho$ (for example, from the power method; see Golub and Van Loan [p. 351]) can yield reasonable estimates for the optimal value of $\omega$.
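As an illustration of this approach, the sketch below runs a few power-method steps on the Jacobi iteration matrix $B = I - D^{-1}A$ to estimate $\rho$ and then evaluates the formula above. The function name, the fixed number of power steps, and the seeded random starting vector are assumptions made for the example, not prescriptions from the text.

```python
import numpy as np

def estimate_optimal_omega(A, power_steps=25):
    """Rough estimate of the optimal SOR parameter for a consistently ordered A."""
    n = A.shape[0]
    D_inv = 1.0 / np.diag(A)

    def jacobi_apply(v):
        # Apply B = I - D^{-1} A without forming B explicitly
        return v - D_inv * (A @ v)

    v = np.random.default_rng(0).standard_normal(n)
    v /= np.linalg.norm(v)
    rho = 0.0
    for _ in range(power_steps):
        w = jacobi_apply(v)
        rho = np.linalg.norm(w)   # growth factor approximates the spectral radius
        v = w / rho
    # Formula above; meaningful when the Jacobi iteration converges (rho < 1)
    return 2.0 / (1.0 + np.sqrt(max(0.0, 1.0 - rho ** 2)))
```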





