
Notes on Order of Convergence

Suppose an iterative process generates a sequence of points $\{x_n\}$ that converges to a root $s$ of the function $f$. In the unit on Iterative Analysis, the concept of order of convergence was introduced with the following definition.

Definition 3.1. Let $e_n = x_n - s$ be the error at the $n$-th step. Then convergence is said to be at least linear (or first order) if
\[
|e_{n+1}| \le c\,|e_n| \quad \text{for all } n \ge N,
\]
for some integer $N$ and some constant $c$ with $0 \le c < 1$. Further, for $p > 1$, convergence is said to be at least of order $p$ if
\[
|e_{n+1}| \le C\,|e_n|^p \quad \text{for all } n \ge N,
\]
and some constant $C$.

This definition states only the conditions for a rate of convergence to be at least of order $p$. However, we can often say more, for if it can be shown that
\[
\lim_{n \to \infty} \frac{|x_{n+1} - s|}{|x_n - s|^p} = C \ne 0,
\]
with $C < 1$ when $p = 1$, then the rate of convergence is exactly $p$.

Thus, for instance, for a fixed-point iteration $x_{n+1} = \varphi(x_n)$, where $\varphi$ is continuously differentiable and $0 < |\varphi'(s)| < 1$, the mean value theorem tells us that the rate of convergence is exactly one, with $C = |\varphi'(s)|$. More generally, if the iteration function $\varphi$ is $p$-times continuously differentiable, and
\[
\varphi'(s) = \varphi''(s) = \cdots = \varphi^{(p-1)}(s) = 0, \qquad \varphi^{(p)}(s) \ne 0,
\]
then the rate of convergence is exactly of order $p$. Indeed, from Taylor's expansion we can write
\[
x_{n+1} - s = \varphi(x_n) - \varphi(s) = \frac{1}{p!}\,\varphi^{(p)}(\xi_n)\,(x_n - s)^p,
\]
where $\xi_n$ lies between $x_n$ and $s$. Thus, the rate of convergence is exactly $p$, and $C = \frac{1}{p!}\,|\varphi^{(p)}(s)|$.
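The limit characterization above also gives a practical way to check an order of convergence numerically: once the errors are small, one can estimate $p$ from successive errors via $p \approx \log|e_{n+1}| / \log|e_n|$. The following Python sketch illustrates this check on a fixed-point iteration; the helper name `estimate_order`, the cosine iteration, and the stated fixed-point value are illustrative choices, not part of these notes.

```python
import math

def estimate_order(xs, s):
    """Estimate the order of convergence p from iterates xs converging to s,
    using the heuristic p ~ log|e_{n+1}| / log|e_n| for consecutive errors."""
    errs = [abs(x - s) for x in xs]
    ests = []
    for e_prev, e_next in zip(errs, errs[1:]):
        if 0.0 < e_prev < 1.0 and e_next > 0.0:   # keep the logs well defined
            ests.append(math.log(e_next) / math.log(e_prev))
    return ests

# Illustrative fixed-point iteration x_{n+1} = cos(x_n); its fixed point
# s ~ 0.739085 satisfies 0 < |phi'(s)| < 1, so convergence should be linear.
s = 0.7390851332151607
x = 0.5
xs = [x]
for _ in range(15):
    x = math.cos(x)
    xs.append(x)

print(estimate_order(xs, s))   # the estimates approach 1, i.e. linear convergence
```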

Example 1. In this example, we refer to the unit on Iterative Analysis. It was shown in Example 3.2 that the rate of convergence of Newton's method is at least second order when $f'(s) \ne 0$ at the root $s$. We could have seen this result also by using the analysis above and recalling that $\varphi'(s) = 0$, as was shown in Example 2.4. Moreover, as seen from the last equation in Example 3.2, the rate of convergence is exactly two if $f'(s) \ne 0$ and $f''(s) \ne 0$, conditions that are likely.

Example 2. Suppose Newton's method is applied to a function $f$ that has a root of multiplicity two at $x = s$. To have a root of multiplicity two means that $f(x) = (x - s)^2 g(x)$, where $g$ is a continuous function with $g(s) \ne 0$. Assuming that $g$ is sufficiently differentiable, then $f'(x) = 2(x - s)g(x) + (x - s)^2 g'(x)$, and the Newton iteration function becomes
\[
\varphi(x) = x - \frac{(x - s)\,g(x)}{2g(x) + (x - s)\,g'(x)} .
\]
Therefore,
\[
\varphi'(x) = 1 - \frac{g(x)}{2g(x) + (x - s)\,g'(x)} - (x - s)\,\frac{d}{dx}\!\left( \frac{g(x)}{2g(x) + (x - s)\,g'(x)} \right).
\]
Hence, $\varphi'(s) = 1 - \tfrac{1}{2} = \tfrac{1}{2}$. We see therefore that the rate of convergence of Newton's method to a root of multiplicity two is linear, with $c = \tfrac{1}{2}$. It can be shown more generally that if the root has multiplicity $p > 1$, the rate of convergence of Newton's method is again linear, but with $c = 1 - \tfrac{1}{p}$.
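The contrast between Examples 1 and 2 is easy to observe numerically. Below is a minimal Python sketch using the illustrative test functions $f(x) = x^2 - 2$ (simple root at $\sqrt{2}$) and $f(x) = (x - \sqrt{2})^2$ (double root), which are not taken from the notes: for the simple root the ratios $|e_{n+1}|/|e_n|^2$ settle near a constant, while for the double root the ratios $|e_{n+1}|/|e_n|$ stay at $c = 1/2$.

```python
import math

def newton(f, fprime, x0, steps):
    """Run Newton's method x_{n+1} = x_n - f(x_n)/f'(x_n) and return all iterates."""
    xs = [x0]
    for _ in range(steps):
        x0 = x0 - f(x0) / fprime(x0)
        xs.append(x0)
    return xs

s = math.sqrt(2.0)

# Simple root: f(x) = x^2 - 2 with f'(s) != 0, so convergence is quadratic;
# the ratios |e_{n+1}| / |e_n|^2 settle near |f''(s) / (2 f'(s))| = 1/(2*sqrt(2)).
simple = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.5, 3)
print([abs(b - s) / abs(a - s) ** 2 for a, b in zip(simple, simple[1:]) if a != s])

# Double root: f(x) = (x - sqrt(2))^2, so convergence is linear;
# the ratios |e_{n+1}| / |e_n| stay at c = 1/2.
double = newton(lambda x: (x - s) ** 2, lambda x: 2.0 * (x - s), 1.5, 12)
print([abs(b - s) / abs(a - s) for a, b in zip(double, double[1:]) if a != s])
```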

Exercises
1. A function has a root of multiplicity $p \ge 1$ at $x = s$ if $f(x) = (x - s)^p g(x)$, where $g$ is a continuous function and $g(s) \ne 0$. Suppose that $s$ is a root of multiplicity $p > 1$.
(a) Following Example 2, show that the rate of convergence of Newton's method to this root is linear.
(b) Show that the following modification of Newton's iteration,
\[
x_{n+1} = x_n - p\,\frac{f(x_n)}{f'(x_n)},
\]
has rate of convergence two.
(c) If the multiplicity of the root $s$ is not known, then Newton's method could be applied to the function $G(x) = f(x)/f'(x)$. Assuming $g$ is differentiable, show that
\[
G(x) = \frac{(x - s)\,g(x)}{p\,g(x) + (x - s)\,g'(x)},
\]
and hence that $s$ is a root of $G$ with multiplicity one.

2. Consider solving the equation $x = e^{-x}$.
(a) Apply Steffensen's iteration to the fixed-point iteration $x_{n+1} = \varphi(x_n)$, where $\varphi(x) = e^{-x}$. Take $x_0 = 1$. Viewing the computed iterates, does it appear that the rate of convergence is about two?
(b) Next apply Steffensen's iteration to the fixed-point iteration $x_{n+1} = \psi(x_n)$, where $\psi(x) = -\ln(x)$. Take $x_0 = 0.6$. Note that even though this second fixed-point iteration is repulsive, Steffensen's iteration is locally attractive. Viewing the computed iterates, does it appear that the rate of convergence is about two?

3. For a fixed-point iteration $x_{n+1} = \varphi(x_n)$, show that Steffensen's iteration can be written as a fixed-point iteration $y_{n+1} = S(y_n)$, where
\[
S(y) = y - \frac{(\varphi(y) - y)^2}{\varphi(\varphi(y)) - 2\varphi(y) + y}
\]
(a sketch of this iteration in code follows the exercises). By the way, using this result, and writing $\varphi(x)$ in the form $\varphi(x) = s + (x - s)g(x)$, it can be shown that Steffensen's iteration has rate of convergence at least two.
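For Exercises 2 and 3, the following is a minimal Python sketch of Steffensen's iteration written as the fixed-point map $S$ of Exercise 3; the function names, tolerance, and iteration cap are illustrative choices, not part of these notes.

```python
import math

def steffensen(phi, y0, tol=1e-14, max_iter=50):
    """Iterate y_{n+1} = S(y_n), where
    S(y) = y - (phi(y) - y)^2 / (phi(phi(y)) - 2*phi(y) + y).
    Returns the list of iterates."""
    ys = [y0]
    for _ in range(max_iter):
        y = ys[-1]
        d = phi(phi(y)) - 2.0 * phi(y) + y
        if d == 0.0:                      # already (numerically) at the fixed point
            break
        y_next = y - (phi(y) - y) ** 2 / d
        ys.append(y_next)
        if abs(y_next - y) < tol:
            break
    return ys

# Exercise 2(a): phi(x) = exp(-x), x0 = 1.
print(steffensen(lambda x: math.exp(-x), 1.0))

# Exercise 2(b): psi(x) = -ln(x), x0 = 0.6 (repulsive as a plain fixed-point
# iteration, but Steffensen's iteration is still locally attractive).
print(steffensen(lambda x: -math.log(x), 0.6))
```

In both runs the number of correct digits roughly doubles per step once the iterates are close to $s \approx 0.5671$, consistent with convergence of order at least two.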
