This chapter surveys a variety of root-finding and hill-climbing algorithms that are useful for solving estimating equations or maximizing artificial likelihoods, starting with a basic technique known as iterative substitution. Methods such as the Newton-Raphson and quasi-Newton algorithms are motivated as attempts to improve the rate of convergence of iterative substitution. The contractive mapping theorem, which provides general conditions for the convergence of a multiparameter algorithm, is stated and proved. The EM-algorithm is described in generality and illustrated with examples. Aitken's method for accelerating the linear convergence of algorithms is developed, along with a refinement known as Steffensen's method. Other methods discussed in this chapter include the method of false position, Muller's method, methods particularly suited to solving polynomial equations (such as Bernoulli's method, the quotient-difference algorithm, Sturm's method and the QR-algorithm), the Nelder-Mead algorithm, and the method of Jacobi iteration for the approximate inversion of matrices.
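To give a flavour of two of the techniques the chapter covers, the following is a minimal sketch (not taken from the chapter itself) of iterative substitution for a fixed-point equation x = g(x), together with its Aitken-accelerated refinement, Steffensen's method. Function names, tolerances, and the example equation x = cos(x) are illustrative choices, not the book's.

```python
import math

def iterative_substitution(g, x0, tol=1e-12, max_iter=200):
    """Plain iterative substitution: x_{n+1} = g(x_n).

    Converges linearly when |g'(x*)| < 1 near the fixed point x*.
    """
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def steffensen(g, x0, tol=1e-12, max_iter=100):
    """Steffensen's method: iterative substitution accelerated by
    Aitken's delta-squared formula, x' = x - (dx)^2 / (d^2 x),
    which typically upgrades linear convergence to quadratic."""
    x = x0
    for _ in range(max_iter):
        x1 = g(x)
        x2 = g(x1)
        denom = x2 - 2.0 * x1 + x
        if denom == 0.0:       # already (numerically) at the fixed point
            return x2
        x_acc = x - (x1 - x) ** 2 / denom
        if abs(x_acc - x) < tol:
            return x_acc
        x = x_acc
    return x

# Example: solve x = cos(x); both routines approach the same fixed point,
# but Steffensen's method needs far fewer iterations.
root = steffensen(math.cos, 1.0)
```

The acceleration step uses only quantities already computed by the substitution iteration, which is why Aitken's method applies to any linearly convergent sequence, not just this one.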
Keywords: Aitken's method, Bernoulli's method, EM-algorithm, iterative substitution, Muller's method, quotient-difference algorithm, Nelder-Mead algorithm, Newton-Raphson algorithm, Sturm's method, QR-algorithm, Steffensen's method