On Convergence Rates of Root-Seeking Methods
03-08-2018, 01:21 AM (This post was last modified: 03-11-2018 01:09 PM by emece67.)
Post: #23
RE: On Convergence Rates of Root-Seeking Methods
(03-07-2018 10:32 PM)Mike (Stgt) Wrote:  It's awesome, at least for those who are not awfully bored by this subject.

Yes, it is awesome!

I've also run my own tests, in my case in Python with the mpmath package for the arbitrary-precision arithmetic. I've tested these algorithms (all generic root-seeking methods):
  • Halley (3rd order convergence, 1 division)
  • Grau+Díaz-Barrero (6th order, 2 divisions)
  • Newton-Raphson (2nd order, 1 division)
  • Ostrowski-Traub (4th order, 2 divisions)
  • Kung-Traub (8th order, 3 divisions, but more complex)
  • Potra-Pták (3rd order, 2 divisions)

I haven't used sliding accuracy (for now), but fixed precisions of 10,000, 100,000 and 1 million digits, with only two arguments: 2 and 17. I tested only sqrt().
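For reference, here is a minimal sketch of the kind of test I ran, using Halley's iteration specialized to \(f(x)=x^2-a\) (the helper name and the starting-guess choice are my own, not part of any library):

```python
import math
from mpmath import mp, mpf

def halley_sqrt(a, digits):
    # Halley's iteration specialized to f(x) = x^2 - a:
    #   x <- x * (x^2 + 3a) / (3x^2 + a)
    # Cubic convergence, one division per step.
    mp.dps = digits + 10               # a few guard digits
    a = mpf(a)
    x = mpf(math.sqrt(float(a)))       # ~16 correct digits to start
    correct = 16
    while correct < digits:
        x2 = x * x
        x = x * (x2 + 3*a) / (3*x2 + a)   # the single division per iteration
        correct *= 3                      # order 3: correct digits triple
    return +x                             # round to working precision

r = halley_sqrt(2, 1000)
```

With the precision fixed up front the early iterations waste time at full width; a sliding-accuracy variant would grow mp.dps along with the number of correct digits.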

My finding is that, since in this case computing the function to be solved requires only one multiplication and one subtraction while the algorithm itself requires one or more divisions, and divisions are far more time consuming at such large digit counts, the usual parameter used to compare the speed of root-seeking algorithms, the efficiency index \(\gamma=\sqrt[r]{p}\) (where \(p\) is the convergence order of the method and \(r\) the number of function evaluations needed on each iteration), is not useful: the function evaluations are cheaper than the other operations the algorithm needs.

Defining instead a "division-aware efficiency index" \(\gamma_d=\sqrt[d]{p}\) (where \(d\) is the number of divisions needed on each iteration), one gets a reliable predictor of the speed of the algorithm. Computing this index for the algorithms above gives:
  • Halley: 3
  • Grau+Díaz-Barrero: 2.45
  • Newton-Raphson: 2
  • Ostrowski-Traub: 2
  • Kung-Traub: 2
  • Potra-Pták: 1.7
so Halley is the fastest, then Grau+Díaz-Barrero, then Newton-Raphson, Ostrowski-Traub and Kung-Traub, with Potra-Pták the slowest. And this is exactly what I found, except that Kung-Traub was slower than expected (though still faster than Potra-Pták), I think because of its higher complexity.
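The ranking above can be reproduced in a couple of lines; the orders and division counts are the ones listed earlier in this post:

```python
# Division-aware efficiency index gamma_d = p**(1/d), where p is the
# convergence order and d the number of divisions per iteration.
methods = {
    "Halley":            (3, 1),
    "Grau+Diaz-Barrero": (6, 2),
    "Newton-Raphson":    (2, 1),
    "Ostrowski-Traub":   (4, 2),
    "Kung-Traub":        (8, 3),
    "Potra-Ptak":        (3, 2),
}
# Print the methods sorted from fastest to slowest predicted speed.
for name, (p, d) in sorted(methods.items(), key=lambda kv: -kv[1][0] ** (1 / kv[1][1])):
    print(f"{name:18s} gamma_d = {p ** (1 / d):.2f}")
```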

So the moral is: when using this kind of classic root-seeking algorithm to compute square roots, pick one with a high convergence order, few divisions and not too much bookkeeping (this is the reason I added Grau+Díaz-Barrero to my tests).

Now I will check whether the Taylor method used by Gerson can be used to build a 4th-order method with only one division. Such a method, if not too complex, might be even faster than Halley.
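For what it's worth, here is one candidate along those lines (my own guess at the construction, not necessarily Gerson's method): writing \(\sqrt a = x\sqrt{1+e}\) with \(e=(a-x^2)/x^2\) and truncating the Taylor series of \(\sqrt{1+e}\) after the \(e^3\) term gives a 4th-order iteration with a single division per step:

```python
import math
from mpmath import mp, mpf

def taylor4_sqrt(a, digits):
    # Hypothetical 4th-order, one-division iteration from the Taylor
    # series sqrt(1+e) ~ 1 + e/2 - e^2/8 + e^3/16 (error O(e^4)):
    #   e = (a - x^2) / x^2        <- the single division
    #   x <- x * (1 + e/2 - e*e/8 + e*e*e/16)
    mp.dps = digits + 10
    a = mpf(a)
    x = mpf(math.sqrt(float(a)))       # ~16 correct digits to start
    correct = 16
    while correct < digits:
        x2 = x * x
        e = (a - x2) / x2              # the single division per iteration
        x = x * (1 + e/2 - e*e/8 + e*e*e/16)
        correct *= 4                   # order 4: correct digits quadruple
    return +x
```

Since \(e\) shrinks quartically, the truncation error \(O(e^4)\) is what sets the convergence order; whether the extra multiplications leave it faster than Halley in practice is exactly what remains to be measured.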

In my tests, the larger the number of digits used, the higher the correlation between \(\gamma_d\) and the speed of the method.

For the curious, it took 218 s for Halley's algorithm to compute \(\sqrt2\) to 1 million digits (Intel Core i7 @ 2.7 GHz, Win10).

Regards.