Heads up for a hot new root seeking algorithm!!
01-23-2017, 11:11 AM (This post was last modified: 01-23-2017 11:12 AM by Dieter.)
Post: #21
RE: Heads up for a hot new root seeking algorithm!!
(01-21-2017 02:35 PM)Namir Wrote:  Regarding the link to the article you mentioned in your message: can you check the new algorithm by the author Thukral? I implemented equation 7 in the article and got bizarre results! Perhaps I am doing something wrong? When I replaced the second subtraction in equation 7 with a multiplication, the algorithm worked but was painfully slow to converge!

Ad fontes! Always use the original source, i.e. the paper mentioned in footnote 7.

Indeed there is an error in equation (20) of the quoted text. The correct formula has the derivative \(f'(x_n)^2\) as the first term of the first numerator, not the function \(f(x_n)^2\) as shown in the linked article.

Dieter
01-23-2017, 02:22 PM
Post: #22
RE: Heads up for a hot new root seeking algorithm!!
(01-21-2017 02:35 PM)Namir Wrote:  I suspect typos in the article, since the title says "Nonlinear Equations of Type f(0)=0" instead of "Nonlinear Equations of Type f(x)=0".

This Thukral algorithm is more or less useless, as it only works for zero roots. The abstract states that it can be used only when the root is zero. Thus the title f(0) = 0 is correct: the algorithm only converges to a root (albeit with order 3 and only one function evaluation per iteration) if that root is zero.

I posted it just for the tables, not for the algorithm.

I'm now working on some Python code to test different algorithms. So far I have been able to confirm that the Newton algorithm does indeed have 2nd-order convergence (I checked it working with 16, 32, 64, 128, 256, 512 & 1024 decimal digits). I'll test it with more equations and will also check the Halley & Ostrowski methods. If all goes well, I can then check your algorithm.
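
In case it is useful, this is the kind of check I mean, reduced to a minimal sketch (the test function and the names here are just for illustration, not my actual script):

Code:
# Sketch: estimate the computational order of convergence (COC) of
# Newton's method at high precision with mpmath. Illustrative only.
from mpmath import mp, mpf, exp, log, fabs

mp.dps = 256                       # working precision in decimal digits

f  = lambda x: exp(x) - 2          # sample function, root at ln(2)
df = lambda x: exp(x)              # its analytical derivative

x, xs = mpf(1), []
for _ in range(8):
    x = x - f(x) / df(x)           # Newton step
    xs.append(x)

# COC estimate from three consecutive errors:
#   rho_n ~ ln|e_{n+1}/e_n| / ln|e_n/e_{n-1}|,  with e_n = |x_n - root|
e = [fabs(xi - log(2)) for xi in xs]
for n in range(1, len(e) - 1):
    try:
        print(log(e[n + 1] / e[n]) / log(e[n] / e[n - 1]))
    except ZeroDivisionError:      # guard in case the precision is exhausted
        break

The printed values settle around 2 for Newton, as expected.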

Regards.
01-23-2017, 06:23 PM
Post: #23
RE: Heads up for a hot new root seeking algorithm!!
(01-23-2017 02:22 PM)emece67 Wrote:  This Thukral algorithm is more or less useless, as it only works for zero roots. The abstract states that it can be used only when the root is zero. Thus the title f(0) = 0 is correct: the algorithm only converges to a root (albeit with order 3 and only one function evaluation per iteration) if that root is zero.

We are talking about two different methods here. The paper on sapub.org is a recent (2016) paper by R. Thukral, which quotes an older paper by the same author from 2008. There, equation [20] is the 2008 Thukral formula (with an error). This formula does work for nonzero roots. Because this older Thukral formula [20] contains an error, Namir does not get the expected results. After a correction (cf. the original source, i.e. the 2008 Thukral paper) the formula works fine and performs similarly to the other posted third-order methods, sometimes even better.

Dieter
01-23-2017, 09:36 PM
Post: #24
RE: Heads up for a hot new root seeking algorithm!!
(01-23-2017 06:23 PM)Dieter Wrote:  
(01-23-2017 02:22 PM)emece67 Wrote:  This Thukral algorithm is more or less useless, as it only works for zero roots. The abstract states that it can be used only when the root is zero. Thus the title f(0) = 0 is correct: the algorithm only converges to a root (albeit with order 3 and only one function evaluation per iteration) if that root is zero.

We are talking about two different methods here. The paper on sapub.org is a recent (2016) paper by R. Thukral, which quotes an older paper by the same author from 2008. There, equation [20] is the 2008 Thukral formula (with an error). This formula does work for nonzero roots. Because this older Thukral formula [20] contains an error, Namir does not get the expected results. After a correction (cf. the original source, i.e. the 2008 Thukral paper) the formula works fine and performs similarly to the other posted third-order methods, sometimes even better.

Dieter

The equation Namir used was (7) in Thukral's 2016 paper. That equation only converges to zero roots (as stated in the abstract and conclusions). In the same paper there is a reference to another "Thukral method" (equations (19)-(20)), which is the one from the 2008 paper by the same author. Equation (20) has an error (in the 2016 paper), but it is not related to equation (7), which is new.

Regards.
01-23-2017, 10:53 PM
Post: #25
RE: Heads up for a hot new root seeking algorithm!!
I am conducting a study on the robustness of the convergence rates claimed by various mathematicians for their new root-seeking algorithms, and also for well-known root-seeking methods like Newton and Halley. I am comparing the theoretical order of convergence with values actually measured for various sample non-linear functions.

Stay tuned ...........

:-)

Namir
01-24-2017, 08:30 PM (This post was last modified: 01-24-2017 08:33 PM by emece67.)
Post: #26
RE: Heads up for a hot new root seeking algorithm!!
(01-24-2017 07:05 PM)Mike (Stgt) Wrote:  After some tests I am able to answer my question on my own. The classical Newton wins the race for square and fourth roots, the latter even when computed as Sqrt(Sqrt(x)).

In my case, to compute 4th roots to 10000 digits the Newton-Raphson method needed around 15 iterations (30 function evaluations) whereas the Ostrowski-Traub method used 8 iterations (24 function evaluations). The OT method also requires less time (75.6 ms vs 81.4 ms). I'm working in Python with the mpmath package.

I've also checked the Halley method (10 iterations, 30 function evaluations, 78.7 ms, still faster than NR) and the Potra-Pták method (the 3rd-order one: 11 iterations, 33 function evaluations, 101.6 ms, the slowest of the lot, which is curious, as this method seems quite simple).
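
In case the concrete iterations help, this is a rough sketch of the two methods I'm comparing for \(f(x)=x^4-z\) (illustrative only, not my actual benchmark script, and the tolerance handling is simplified):

Code:
# Sketch: Newton-Raphson vs. Ostrowski-Traub for the 4th root of z,
# i.e. the positive root of f(x) = x**4 - z. Illustrative only.
from mpmath import mp, mpf

mp.dps = 10000                     # working precision in decimal digits
z   = mpf(2)
f   = lambda x: x**4 - z
df  = lambda x: 4 * x**3
tol = mpf(10) ** (10 - mp.dps)

def newton(x):
    it = 0
    while abs(f(x)) > tol:
        x = x - f(x) / df(x)                       # 2 evaluations per iteration
        it += 1
    return x, it

def ostrowski_traub(x):
    it = 0
    while abs(f(x)) > tol:
        fx, dfx = f(x), df(x)
        y  = x - fx / dfx                          # Newton predictor
        fy = f(y)
        x  = y - (fy / dfx) * fx / (fx - 2 * fy)   # 4th-order corrector
        it += 1                                    # 3 evaluations per iteration
    return x, it

print(newton(mpf('1.2'))[1], ostrowski_traub(mpf('1.2'))[1])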

I've not found such dramatic speed-ups between methods as you have; the difference between my fastest (OT) and slowest (PP) is just 1.35x. No "sliding accuracy" in my case.

Regards.
01-26-2017, 08:21 PM (This post was last modified: 01-26-2017 08:30 PM by emece67.)
Post: #27
RE: Heads up for a hot new root seeking algorithm!!
These are my findings so far.

Legend:
  • Iter: number of iterations
  • TNFE: total number of function evaluations
  • COC: computational order of convergence
  • EI: efficiency index
  • Time: relative to the fastest method

COC is rounded to the nearest integer, except for the \(x^6-1\) case.
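
For clarity, these are the definitions I'm using (the usual ones, with \(\alpha\) the exact root, \(p\) the theoretical order and \(d\) the number of function evaluations per iteration):

\[\mathrm{COC}\approx\frac{\ln\left|\,(x_{n+1}-\alpha)/(x_n-\alpha)\,\right|}{\ln\left|\,(x_n-\alpha)/(x_{n-1}-\alpha)\,\right|},\qquad \mathrm{EI}=p^{1/d}\]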

Derivatives are always computed by means of the analytical expressions for them (not approximated with differences).

The Potra-Pták method is a 3rd-order method requiring two evaluations of f(x) and one of its derivative per iteration. The Kung-Traub method used here is the 8th-order one.
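
For reference, these are the two-step forms I'm using for those two methods (standard textbook forms, with \(y_n\) the Newton predictor; the 8th-order Kung-Traub expression is too long to reproduce here):

\[y_n = x_n - \frac{f(x_n)}{f'(x_n)}\]
\[\text{Potra-Pták (3rd order):}\quad x_{n+1} = x_n - \frac{f(x_n)+f(y_n)}{f'(x_n)}\]
\[\text{Ostrowski-Traub (4th order):}\quad x_{n+1} = y_n - \frac{f(y_n)}{f'(x_n)}\cdot\frac{f(x_n)}{f(x_n)-2f(y_n)}\]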

Although I expected bigger differences between the methods, the Ostrowski-Traub method looks like the winner to me.

Regards.


Attached File(s)
.pdf  COC.pdf (Size: 21.69 KB / Downloads: 14)
01-27-2017, 10:02 PM
Post: #28
RE: Heads up for a hot new root seeking algorithm!!
And these are the results using an approximate derivative. I think that the Ostrowski-Traub method wins again.
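
By "approximate derivative" I mean a simple central difference whose step is tied to the working precision, something along these lines (a sketch; the step-size choice is a matter of taste):

Code:
# Sketch: central-difference approximation of f'(x) for use with mpmath.
# The step h shrinks as the working precision grows. Illustrative only.
from mpmath import mp, mpf

def approx_dfdx(f, x):
    h = mp.eps ** (mpf(1) / 3) * (1 + abs(x))   # roughly optimal step for a central difference
    return (f(x + h) - f(x - h)) / (2 * h)

mp.dps = 50
print(approx_dfdx(lambda x: x**4 - 2, mpf('1.2')))   # exact value is 4*1.2**3 = 6.912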

Regards.


Attached File(s)
.pdf  COC_ad.pdf (Size: 21.26 KB / Downloads: 9)
01-28-2017, 12:09 AM
Post: #29
RE: Heads up for a hot new root seeking algorithm!!
Polishing my study before I post it on the web. Hang in there.

Namir
02-01-2017, 10:03 PM (This post was last modified: 02-01-2017 10:15 PM by Namir.)
Post: #30
RE: Heads up for a hot new root seeking algorithm!!
(02-01-2017 05:39 PM)Mike (Stgt) Wrote:  I also invested a few days in the subject, but I do not publish my results as they are somewhat 'unfair'. My focus is limited to solving a^2-z and a^4-z with enhanced precision that no pocket calculator offers. The 'unfair' point is that I compare the elapsed time of 'Classical Newton' (CN) solving for the square root with other methods, but instead of evaluating x(n+1) = x(n) - f(x(n))/f'(x(n)) directly I use the simplified form a = .5 * (a + z / a). So per loop there are only three operations: a division, an addition and a multiplication. I also evaluated CN for a^4-z, Ostrowski, Grau et al., Weerakoon et al., and Homeier. All have from a few more to many more operations, which, at the enhanced accuracy settings (10'000 digits and more), need more time to calculate than the higher order of convergence might save.

I did my tests on a Hercules-emulated mainframe with compiled REXX and with ooREXX on Windows. My assumption that I would find the same relations in elapsed time did not hold true. While CN was always the winner on the host, on the PC the root of 1.01 took twice as long as for 0.99, so other methods did better.

My conclusions: i) REXX on the PC is not the first choice for heavy number crunching (but for other tasks it is my favourite language) and ii) the 200LX or the corresponding Connectivity Pack is sufficient for day-to-day root solving, showing a graph too. I tested the functions shown in the papers with it and found nothing 'pathological'.

Ciao.....Mike

Oh wow! You use REXX and ooREXX?? I have them installed on my PC but never have the time to use them. I found ooREXX about twenty years ago! I had reviewed REXX for either BYTE magazine or Computer Language. It is a good batch language but not a number-crunching language.

Counting the operations adds more insight into the CPU effort. Of course you have to exclude the operations used to evaluate the target function and its derivative(s) and instead count the number of times you call them. I usually use difference approximations for the first and second derivatives. I manage to issue the same total number of function calls as the combined number of calls to f(x), f'(x), and f''(x) (if the latter is needed). Of course some functions have derivatives with simpler expressions and some have derivatives with more elaborate expressions. "Wish it were so simple" (paraphrasing the movie "Hail, Caesar!").
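
A small illustration of what I mean by counting calls (the function and numbers here are made up just for the example):

Code:
# Sketch: count calls to f(x) when the first derivative is itself a
# difference approximation built from f. Illustrative only.
from mpmath import mp, mpf

mp.dps = 50
calls = 0

def f(x):
    global calls
    calls += 1
    return x**3 - 7                 # sample target function

def df(x, h=mpf('1e-25')):
    return (f(x + h) - f(x - h)) / (2 * h)   # costs 2 calls to f

x = mpf(2)
for _ in range(6):
    x = x - f(x) / df(x)            # Newton step: 3 calls to f per iteration

print(x, calls)                     # calls == 18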

Namir