Heads up for a hot new root seeking algorithm!!
01-10-2017, 06:30 PM
Post: #1
Heads up for a hot new root seeking algorithm!!
The famous Russian mathematician Ostrowski, who taught for many years at the University of Basel, Switzerland, proposed an enhancement to Newton's root-seeking algorithm. Ostrowski suggested that each iteration produce two refinements of the root, one of them an intermediate refinement. The Ostrowski algorithm matches Halley's root-seeking algorithm in its third-order rate of convergence. Recently, the Ostrowski algorithm has inspired many mathematicians to devise root-seeking algorithms with two or more refinements of the root per iteration.
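
For readers who want to experiment, here is a rough Python sketch of the classic two-step Ostrowski iteration (variable names and structure are mine, for illustration only; this is not the exact code behind the results I mention below):

Code:
# Classic two-step Ostrowski iteration (illustrative sketch).
# f is the function, df its first derivative.
def ostrowski(f, df, x, tol=1e-12, max_iter=100):
    for _ in range(max_iter):
        fx = f(x)
        dfx = df(x)
        y = x - fx / dfx                                # Newton step (intermediate refinement)
        fy = f(y)
        x_new = y - fx * fy / ((fx - 2.0 * fy) * dfx)   # Ostrowski correction
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

Each pass costs three evaluations: f(x), f'(x), and f(y).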

I recently applied Ostrowski's approach to the Illinois algorithm (an improved version of the False Position algorithm) and was able to obtain better rates of convergence than with the original Illinois algorithm. I posted the pseudo-code for this algorithm on this web site. I was a little baffled as to why Ostrowski improved only Newton's method and did not become more ambitious and enhance Halley's method as well!

A few days ago, I decided to experiment with applying Ostrowski's approach to Halley's algorithm. Since the latter method is a bit more advanced than Newton's (requiring approximations of the first AND second derivatives), applying Ostrowski's approach was NOT trivial. I decided, nevertheless, to give it a go. I started with a simple improvement to Halley's method, but that did not yield better results. After two or three incarnations, I was able to find a satisfactory marriage between Ostrowski and Halley. I plan to publish a report on my website that includes a comparison between the methods of Newton, Halley, Ostrowski, and my new Ostrowski-Halley method. The results will include testing these algorithms with two dozen functions and reporting the number of function calls AND iterations.
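
For reference, plain Halley looks roughly like this (again just an illustrative Python sketch; in my tests the first and second derivatives are approximated numerically rather than supplied in closed form):

Code:
# Plain Halley iteration (illustrative sketch).
# f, df, d2f are the function and its first and second derivatives.
def halley(f, df, d2f, x, tol=1e-12, max_iter=100):
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        x_new = x - 2.0 * fx * dfx / (2.0 * dfx * dfx - fx * d2fx)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x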

I am happy to report that the new Ostrowski-Halley method competes well with Halley's method and its cousin, the Ostrowski method. The competition among these three algorithms is stiff, but the Ostrowski-Halley method shows good promise. Stay tuned for my announcement of a paper on this subject on my personal web site.

Namir
01-10-2017, 08:06 PM
Post: #2
RE: Heads up for a hot new root seeking algorithm!!
Hi Namir, very promising! Do you have any strict proof of convergence? Also interesting would be an estimate of the achievable convergence rate (and how it depends on the type, smoothness, etc. of the function to be "rooted").

Anyway, I am curious to read your paper and the comparison with the original algorithms.

Juergen
01-10-2017, 09:54 PM
Post: #3
RE: Heads up for a hot new root seeking algorithm!!
(01-10-2017 06:30 PM)Namir Wrote:  The famous Russian mathematician Ostrowski, who taught for many years at the University of Basel, Switzerland, proposed an enhancement to Newton's root-seeking algorithm. Ostrowski suggested that each iteration produce two refinements of the root, one of them an intermediate refinement. The Ostrowski algorithm matches Halley's root-seeking algorithm in its third-order rate of convergence. Recently, the Ostrowski algorithm has inspired many mathematicians to devise root-seeking algorithms with two or more refinements of the root per iteration.

I recently applied Ostrowski's approach to the Illinois algorithm (an improved version of the False Position algorithm) and was able to obtain better rates of convergence than with the original Illinois algorithm. I posted the pseudo-code for this algorithm on this web site. I was a little baffled as to why Ostrowski improved only Newton's method and did not become more ambitious and enhance Halley's method as well!

A few days ago, I decided to experiment with applying Ostrowski's approach to Halley's algorithm. Since the latter method is a bit more advanced than Newton's (requiring approximations of the first AND second derivatives), applying Ostrowski's approach was NOT trivial. I decided, nevertheless, to give it a go. I started with a simple improvement to Halley's method, but that did not yield better results. After two or three incarnations, I was able to find a satisfactory marriage between Ostrowski and Halley. I plan to publish a report on my website that includes a comparison between the methods of Newton, Halley, Ostrowski, and my new Ostrowski-Halley method. The results will include testing these algorithms with two dozen functions and reporting the number of function calls AND iterations.

I am happy to report that the new Ostrowski-Halley method competes well with Halley's method and its cousin, the Ostrowski method. The competition among these three algorithms is stiff, but the Ostrowski-Halley method shows good promise. Stay tuned for my announcement of a paper on this subject on my personal web site.

Namir

Keep them coming. I hope you don't mind if one or more of your algorithms end up implemented in newRPL (proper credit will be given, of course!). Always interesting to discover something new in a subject that seemed closed already.
01-10-2017, 10:14 PM
Post: #4
RE: Heads up for a hot new root seeking algorithm!!
Since it's on-topic:

What do you think about more advanced methods like this one:

http://ojs.excelingtech.co.uk/index.php/...ew/430/294


There are a lot of operations in each iteration, but a smaller number of evaluations to find the root. Considering that newRPL has relatively slow CORDIC transcendental functions, evaluation is heavy and one of these could perhaps be beneficial, but have you ever tested if it's actually worth it? Sometimes they just want to prove a theoretical point, but they are not really better than plain Newton.
01-11-2017, 08:05 PM (This post was last modified: 01-11-2017 08:19 PM by Namir.)
Post: #5
RE: Heads up for a hot new root seeking algorithm!!
(01-10-2017 10:14 PM)Claudio L. Wrote:  Since it's on-topic:

What do you think about more advanced methods like this one:

http://ojs.excelingtech.co.uk/index.php/...ew/430/294


There are a lot of operations in each iteration, but a smaller number of evaluations to find the root. Considering that newRPL has relatively slow CORDIC transcendental functions, evaluation is heavy and one of these could perhaps be beneficial, but have you ever tested if it's actually worth it? Sometimes they just want to prove a theoretical point, but they are not really better than plain Newton.

I looked at the article, and my new method does not involve as many calculations per iteration! I think my new algorithm has the same order of convergence as Halley's, which is third order.

Also, the author of the article is improving on Ostrowski (who improved on Newton). My new algorithm improves on Halley's method using Ostrowski's approach.

Anyway, thanks for the article. A few years ago I did a survey of recent algorithms that were inspired by Ostrowski. I found algorithms with three or more intermediate refinements of the root per iteration. While the number of iterations of these algorithms decreased (compared to Newton or Halley), the number of function calls went up! So I am conscious that the number of function calls per iteration should not get out of hand.

Namir
01-17-2017, 09:13 PM
Post: #6
RE: Heads up for a hot new root seeking algorithm!!
(01-11-2017 08:05 PM)Namir Wrote:  
(01-10-2017 10:14 PM)Claudio L. Wrote:  ... but have you ever tested if it's actually worth it? Sometimes they just want to prove a theoretical point, but they are not really better than plain Newton.

I think my new algorithm has the same order of convergence as Halley's, which is third order.

But again Namir, as you wrote, you "think" it is of 3rd order; do you know for sure, by means of a strict mathematical proof? Mathematics can't go without that ;-) A modification of a 3rd-order algorithm does not necessarily result in an algorithm with the same convergence behavior. Did you try to adapt the proofs for Halley/Ostrowski to your new algorithm? That would indeed make your work complete!

Juergen
01-17-2017, 10:42 PM
Post: #7
RE: Heads up for a hot new root seeking algorithm!!
(01-17-2017 09:13 PM)JurgenRo Wrote:  
(01-11-2017 08:05 PM)Namir Wrote:  I think my new algorithm has the same order of convergence as Halley's, which is third order.

But again Namir, as you wrote, you "think" it is of 3rd order; do you know for sure, by means of a strict mathematical proof? Mathematics can't go without that ;-) A modification of a 3rd-order algorithm does not necessarily result in an algorithm with the same convergence behavior. Did you try to adapt the proofs for Halley/Ostrowski to your new algorithm? That would indeed make your work complete!

Juergen

The new algorithm matches or slightly improves on Halley's. Since Halley's is third order, the new algorithm could not be second or fourth order. I am saying 3rd order by induction.

Namir
01-18-2017, 05:47 AM (This post was last modified: 01-18-2017 08:39 AM by Ángel Martin.)
Post: #8
RE: Heads up for a hot new root seeking algorithm!!
(01-17-2017 10:42 PM)Namir Wrote:  The new algorithm matches or slightly improves on Halley's. Since Halley's is third order, the new algorithm could not be second or fourth order. I am saying 3rd order by induction.

Namir

And how do you define "it matches"? What criteria are used? What parameters? In what cases? All possible expressions??
IMHO induction is not applicable here...

Cheers,
ÁM
01-18-2017, 09:47 AM
Post: #9
RE: Heads up for a hot new root seeking algorithm!!
The paper pointed to by Claudio shows how the authors use a numerical computation to ascertain the order of convergence of an algorithm. I think that such an approach can be used here to estimate the order of convergence of Namir's algorithm, without needing a, perhaps cumbersome, rigorous demonstration. Provided, of course, you have math software capable of working with high-precision numbers (say, Mathematica).

Knowing the order of convergence, the efficiency index can be calculated, which is perhaps a better metric for comparing algorithms.
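
To be explicit, the efficiency index I have in mind is the classical one,

\[ EI = p^{1/d}, \]

where \(p\) is the order of convergence and \(d\) is the number of function (or derivative) evaluations per iteration.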

In any case, AFAIK, the basic (Newton-based) Ostrowski algorithm has a convergence order of 4, not 3.
01-18-2017, 08:22 PM
Post: #10
RE: Heads up for a hot new root seeking algorithm!!
(01-18-2017 05:47 AM)Ángel Martin Wrote:  
(01-17-2017 10:42 PM)Namir Wrote:  The new algorithm matches or slightly improves on Halley's. Since Halley's is third order, the new algorithm could not be second or fourth order. I am saying 3rd order by induction.

Namir

And how do you define "it matches"? What criteria are used? What parameters? In what cases? All possible expressions??
IMHO induction is not applicable here...

Cheers,
ÁM

I fully agree with Ángel's assessment. I do not see how induction could do the work here. Also, the convergence rate might (and likely will) depend on the properties of the function to be "rooted" (smoothness, for example). That is why a "demonstration" of convergence with a couple of functions is not reliable at all. It would, however, be legitimate to say that you conjecture the scheme is of 3rd order (for sufficiently smooth functions?) but that a rigorous proof is still missing ...

Juergen
01-18-2017, 11:53 PM
Post: #11
RE: Heads up for a hot new root seeking algorithm!!
My study compared the number of iterations and function calls for the methods of Newton, Halley, Ostrowski, and my new algorithm. I used two dozen test functions. The last three methods generally had similar numbers of iterations and function calls. Since Halley's method is described in several books (and on Wikipedia) as third order, Ostrowski's method and my new algorithm should have the same convergence rate. When you compare Newton's method with the other three, you can see that Newton's method is slower.

I am not sure that Ostrowski's method has an order of convergence of 4. Any reference that confirms this?

Namir
01-19-2017, 03:23 AM
Post: #12
RE: Heads up for a hot new root seeking algorithm!!
(01-18-2017 11:53 PM)Namir Wrote:  I am not sure that Ostrowski's method has an order of convergence of 4. Any reference that confirms this?

Namir

The paper I cited shows, at the very top of page 78 (it's actually the third page), that plain Newton has order 2, Ostrowski has order 4, and another formula has order 8. It says "it's well established", so I guess it must be true.
01-19-2017, 05:26 AM (This post was last modified: 01-19-2017 05:36 AM by Namir.)
Post: #13
RE: Heads up for a hot new root seeking algorithm!!
(01-19-2017 03:23 AM)Claudio L. Wrote:  
(01-18-2017 11:53 PM)Namir Wrote:  I am not sure that Ostrowski's method has an order of convergence of 4. Any reference that confirms this?

Namir

The paper I cited shows, at the very top of page 78 (it's actually the third page), that plain Newton has order 2, Ostrowski has order 4, and another formula has order 8. It says "it's well established", so I guess it must be true.

Yes, the paper states, on page 78, that Ostrowski's convergence is of order 4. The author is mistaken!! I tested 24 functions and compared Newton, Halley, Ostrowski, and my new Ostrowski-Halley method. The methods of Halley and Ostrowski often found the root in the same number of iterations. If Ostrowski's method were order 4, I didn't see it in the test results. My new algorithm did, in general, a little better than Halley and Ostrowski (after all, it's a combination of those two methods).

Many mathematicians have, somewhat recently, improved on Ostrowski's method to yield algorithms with high to very high convergence rates. That's nice and dandy, but it comes at the cost of a very high number of function calls, making the simpler (and slower-converging) algorithms more practical if one regards the number of function calls as the cost of doing business in finding roots.

Namir
01-19-2017, 10:44 AM (This post was last modified: 01-19-2017 11:12 AM by emece67.)
Post: #14
RE: Heads up for a hot new root seeking algorithm!!
(01-19-2017 05:26 AM)Namir Wrote:  Yes, the paper states, on page 78, that Ostrowski's convergence is of order 4. The author is mistaken!!

http://folk.uib.no/ssu029/Pdf_file/Varona02.pdf states that the order of convergence of Newton's method is 2, Halley's is 3, and Ostrowski's is 4.

http://www.sciencedirect.com/science/art...2706006716 states that the basic Ostrowski method has an order of convergence of 4.

http://www.sciencedirect.com/science/art...2111008078 states again that the order of convergence of Ostrowski's method is 4. It also says that Ostrowski's method is optimal in the sense that it satisfies the Kung-Traub conjecture (which states that the maximum attainable order of convergence for an algorithm with \(n\) function evaluations per iteration is \(2^{n-1}\)); neither Halley's nor Newton's method is optimal in that sense.

I'm not sure about the first reference, but the other two are peer reviewed publications, so I assume them to be safe sources.

I cannot find your algorithm's description now (wasn't it in this thread or in another one?), so: how many function evaluations does it need on each iteration?

Obtaining the computational order of convergence (COC) of your algorithm does not seem to be a hard task (only 3 iterations are needed for some test function/root combinations). Such a COC, combined with the number of function evaluations needed on each iteration, will give us all a much better indication of your algorithm's performance.
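
To be concrete about what I mean by the COC, here is a minimal Python/mpmath sketch of the usual estimate from three successive iterates when the exact root of the test function is known (my own helper; it assumes the iterates were already computed in high precision):

Code:
# Computational order of convergence from three successive iterates x0, x1, x2
# and the known root of the test function.
from mpmath import mp, log, fabs

mp.dps = 100   # decimal digits of working precision

def coc(x0, x1, x2, root):
    e0, e1, e2 = fabs(x0 - root), fabs(x1 - root), fabs(x2 - root)
    return log(e2 / e1) / log(e1 / e0)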

Regards.

Edit: I located your algorithm in the other thread (sorry, I forgot about it). It needs four function evaluations on each iteration. Supposing its order of convergence is 4, its efficiency index is \(\sqrt[4]{4}=1.4142...\). As Ostrowski's algorithm only requires 3 function evaluations on each iteration and has an order of convergence of 4, its efficiency index is \(\sqrt[3]{4}=1.5874...\), which is better. If the order of convergence of your algorithm is not 4 but 3, then the situation is even worse. To compete with Ostrowski's algorithm (in terms of the efficiency index), your algorithm would need to be of order 6-7. For comparison, the efficiency index of Newton's method is also 1.4142... and that of Halley's is 1.44225...
01-19-2017, 04:09 PM (This post was last modified: 01-19-2017 06:07 PM by Namir.)
Post: #15
RE: Heads up for a hot new root seeking algorithm!!
(01-19-2017 10:44 AM)emece67 Wrote:  
(01-19-2017 05:26 AM)Namir Wrote:  Yes, the paper states, on page 78, that Ostrowski's convergence is of order 4. The author is mistaken!!

http://folk.uib.no/ssu029/Pdf_file/Varona02.pdf states that the order of convergence of Newton's method is 2, Halley's is 3, and Ostrowski's is 4.

http://www.sciencedirect.com/science/art...2706006716 states that the basic Ostrowski method has an order of convergence of 4.

http://www.sciencedirect.com/science/art...2111008078 states again that the order of convergence of Ostrowski's method is 4. It also says that Ostrowski's method is optimal in the sense that it satisfies the Kung-Traub conjecture (which states that the maximum attainable order of convergence for an algorithm with \(n\) function evaluations per iteration is \(2^{n-1}\)); neither Halley's nor Newton's method is optimal in that sense.

I'm not sure about the first reference, but the other two are peer reviewed publications, so I assume them to be safe sources.

I cannot find your algorithm's description now (wasn't it in this thread or in another one?), so: how many function evaluations does it need on each iteration?

Obtaining the computational order of convergence (COC) of your algorithm does not seem to be a hard task (only 3 iterations are needed for some test function/root combinations). Such a COC, combined with the number of function evaluations needed on each iteration, will give us all a much better indication of your algorithm's performance.

Regards.

Edit: I located your algorithm in the other thread (sorry, I forgot about it). It needs four function evaluations on each iteration. Supposing its order of convergence is 4, its efficiency index is \(\sqrt[4]{4}=1.4142...\). As Ostrowski's algorithm only requires 3 function evaluations on each iteration and has an order of convergence of 4, its efficiency index is \(\sqrt[3]{4}=1.5874...\), which is better. If the order of convergence of your algorithm is not 4 but 3, then the situation is even worse. To compete with Ostrowski's algorithm (in terms of the efficiency index), your algorithm would need to be of order 6-7. For comparison, the efficiency index of Newton's method is also 1.4142... and that of Halley's is 1.44225...

Thank you for the interesting information. I will look at the various articles you mentioned in your links.

I have a question for you. Given a function f(x), an initial guess x0, and a specified tolerance: if I apply root-seeking algorithm AL1, which has efficiency index EI1, and it takes N1 iterations to refine the root, can I predict the number of iterations N2 for another algorithm AL2 with efficiency index EI2? If we can do that, then we can also estimate the efficiency index EI3 of an algorithm AL3 relative to AL1, given the numbers of iterations N1 and N3 for both methods (with the initial guess and tolerance being the same) and the efficiency index EI1 of AL1.
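
Just to frame what I am asking a bit more concretely, under very idealized assumptions (pure order-\(p\) convergence, asymptotic constant taken as 1, starting error \(|e_0|<1\)):

\[ |e_{k+1}| \approx |e_k|^{\,p} \;\Rightarrow\; |e_N| \approx |e_0|^{\,p^{N}} \;\Rightarrow\; N \approx \frac{\ln\!\big(\ln\varepsilon/\ln|e_0|\big)}{\ln p}, \]

which would suggest \(N_1\ln p_1 \approx N_2\ln p_2\) for two algorithms started from the same guess and stopped at the same tolerance \(\varepsilon\). Whether anything like that survives contact with real functions is exactly my question.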

Namir
01-19-2017, 08:13 PM (This post was last modified: 01-19-2017 08:42 PM by JurgenRo.)
Post: #16
RE: Heads up for a hot new root seeking algorithm!!
(01-19-2017 03:23 AM)Claudio L. Wrote:  
(01-18-2017 11:53 PM)Namir Wrote:  I am not sure that Ostrowski's method has an order of convergence of 4. Any reference that confirms this?

Namir

The paper I cited shows, at the very top of page 78 (it's actually the third page), that plain Newton has order 2, Ostrowski has order 4, and another formula has order 8. It says "it's well established", so I guess it must be true.

You have to distinguish between local and global error. Plain Newton is simply a first-order Taylor series expansion:

\( f(x_{n+1}) = f(x_n) + h f'(x_n) + \tfrac{1}{2} h^2 f''(\xi), \)

where \(h = x_{n+1} - x_n\), \(\xi \in (x_n, x_{n+1})\), and \(R := \tfrac{1}{2} h^2 f''(\xi)\) is the term defining the local error (of each step). That is, locally plain Newton is \(O(h^2)\), i.e. quadratic. The global error, though (the accumulated error over all \(N\) steps), is \(O(N h^2) = O(h)\), i.e. linear. Normally you are interested in the total accumulated error, i.e. in the global error. So to my understanding plain Newton is a linear scheme.

Juergen
01-20-2017, 01:24 AM
Post: #17
RE: Heads up for a hot new root seeking algorithm!!
Using a Taylor series can easily explain the order of convergence for Newton, Halley, and Halston. But as the algorithms become more elaborate, generating one or more additional intermediate guesses per iteration, this gets complicated.

I was asking in a previous message whether the number of iterations can be estimated using the efficiency index, if we know the number of iterations achieved by another algorithm with a known efficiency index, when solving for the root of the same function using the same initial guess and the same tolerance value for the refined root.

My own perception is that the efficiency index is a qualitative indicator that gives you a general idea about the convergence rate. I doubt there exists a general formula for what I am asking above.

Namir
01-20-2017, 08:59 AM
Post: #18
RE: Heads up for a hot new root seeking algorithm!!
(01-20-2017 01:24 AM)Namir Wrote:  My own perception is that the efficiency index is a qualitative indicator that gives you a general idea about the convergence rate. I doubt there exists a general formula for what I am asking above.

That's also my perception.

Regarding your question: seeing that algorithms can behave very differently (even with different orders of convergence) on different functions, and even around different roots of the same function, I also doubt that there's a way to ascertain the required number of iterations for a given algorithm to converge to a given root of a given function.

I think that what we can do is build tables such as the one given here http://article.sapub.org/10.5923.j.ajcam...01.03.html in order to "have a feeling" about the behaviour of an algorithm compared to others.

Regards.
01-20-2017, 09:27 AM (This post was last modified: 01-20-2017 09:38 AM by Namir.)
Post: #19
RE: Heads up for a hot new root seeking algorithm!!
If you download the ZIP file from my web site, you get the report and an Excel file that has worksheets for the test functions. What I observed is that in the majority of cases, the methods of Halley and Ostrowski give the same results or don't differ by more than one iteration (both require three function calls per iteration). The results DO SHOW a consistent improvement over Newton's method. You can say that the methods of Halley and Ostrowski have a higher convergence rate than Newton's.

My new algorithm (which uses Ostrowski's approach to enhance Halley's method) shows several cases where either the number of iterations, or both the number of iterations AND the number of function calls, is less than that of Halley and Ostrowski. Since I developed the new algorithm using an empirical/heuristic approach, I did not produce a set of mathematical derivations that establish the convergence rate. It may well be at least one order higher than that of Ostrowski's method. It's hard to measure ... compared to, say, determining the order of array sorting methods, where you can vary the array size widely and count the number of array element comparisons and element swaps. See my article about enhancing the CombSort method, where I was able to calculate the sort order of that method in Tables 3 and 4 of the article.
01-21-2017, 02:35 PM (This post was last modified: 01-21-2017 02:37 PM by Namir.)
Post: #20
RE: Heads up for a hot new root seeking algorithm!!
(01-20-2017 08:59 AM)emece67 Wrote:  
(01-20-2017 01:24 AM)Namir Wrote:  My own perception is that the efficiency index is a qualitative indicator that gives you a general idea about the convergence rate. I doubt there exists a general formula for what I am asking above.

That's also my perception.

Regarding your question: seeing that algorithms can behave very differently (even with different orders of convergence) on different functions, and even around different roots of the same function, I also doubt that there's a way to ascertain the required number of iterations for a given algorithm to converge to a given root of a given function.

I think that what we can do is build tables such as the one given here http://article.sapub.org/10.5923.j.ajcam...01.03.html in order to "have a feeling" about the behaviour of an algorithm compared to others.

Regards.

Regarding the link to the article you mentioned in your message: can you check the new algorithm by the author Thukral? I implemented equation 7 in the article and got bizarre results! Perhaps I am doing something wrong? When I replaced the second subtraction in equation 7 with a multiplication, the algorithm worked, but it was painfully slow to converge!

I suspect typos in the article, since the title reads "Nonlinear Equations of Type f(0)=0" instead of "Nonlinear Equations of Type f(x)=0".

Namir