Side benefit from Runge-Kutta methods for ODE

09-23-2018, 07:27 AM
Post: #1




Side benefit from Runge-Kutta methods for ODE
I have been looking at various methods to numerically solve ODEs (Ordinary Differential Equations). Among the methods you can find (in Wikipedia) is the family of Runge-Kutta methods. The 4th-order Runge-Kutta method is perhaps the most popular; it has been used by HP in various calculator Stat Pacs.

The Runge-Kutta family solves ODEs of the form dy/dx = f(x,y). One interesting "side effect" is that if dy/dx = f(x), then you obtain methods for numerical integration. The 4th-order Runge-Kutta method, in this case, becomes Simpson's rule. The interesting windfall here is that the various Runge-Kutta variants serve as additional methods for numerical integration, a "free bonus" you can explore and use when integrating numerically.

Namir
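To see the claimed equivalence numerically, here is a minimal Python sketch (the helper names are illustrative, not from the post): one RK4 step applied to dy/dx = f(x), starting from y = 0, reproduces Simpson's rule on the same interval.

```python
import math

def rk4_step(f, x, y, h):
    """One classical 4th-order Runge-Kutta step for dy/dx = f(x, y)."""
    k1 = f(x, y)
    k2 = f(x + h / 2, y + h / 2 * k1)
    k3 = f(x + h / 2, y + h / 2 * k2)
    k4 = f(x + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def simpson(g, a, b):
    """Simpson's rule on a single interval [a, b]."""
    return (b - a) / 6 * (g(a) + 4 * g((a + b) / 2) + g(b))

g = math.cos                       # integrand depending on x only
f = lambda x, y: g(x)              # disguise it as f(x, y)

a, b = 0.0, 1.0
print(rk4_step(f, a, 0.0, b - a))  # integral estimate via one RK4 step
print(simpson(g, a, b))            # same value via Simpson's rule
```

Both calls produce the same estimate of the integral of cos on [0, 1], because with f independent of y the two middle RK4 stages coincide.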

09-24-2018, 06:54 AM
Post: #2




RE: Side benefit from Runge-Kutta methods for ODE
It goes the other way too. Integration methods lead to solutions of differential equations.
The predictor-corrector methods use an "open" (end point excluded) integration formula to predict a solution and a "closed" (end point included) formula to correct the result. 
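As a minimal sketch of the predictor-corrector idea (my own illustrative example, not from the post): an explicit Euler step plays the "open" predictor and the trapezoidal rule the "closed" corrector, which together give Heun's method.

```python
import math

def heun_step(f, x, y, h):
    """Predictor-corrector step: explicit Euler predicts, trapezoid corrects."""
    y_pred = y + h * f(x, y)                         # open formula (predictor)
    return y + h / 2 * (f(x, y) + f(x + h, y_pred))  # closed formula (corrector)

# Solve dy/dx = y, y(0) = 1 on [0, 1]; the exact answer is e.
x, y, h = 0.0, 1.0, 0.01
while x < 1.0 - 1e-12:
    y = heun_step(lambda t, u: u, x, y, h)
    x += h
print(y, math.e)  # close agreement
```

The corrector can also be iterated a few times at each step for extra accuracy, which is what classical predictor-corrector pairs such as Adams-Bashforth/Adams-Moulton do.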

09-26-2018, 04:43 PM
Post: #3




RE: Side benefit from Runge-Kutta methods for ODE
Good to know that. I will check it out sometime in the future.
Namir 

09-26-2018, 10:45 PM
Post: #4




RE: Side benefit from Runge-Kutta methods for ODE
.
Hi, Namir:

(09-23-2018 07:27 AM)Namir Wrote: [...] you can obtain methods for numerical integration.

Sure you do.

Quote: The 4th order Runge-Kutta method, in this case, becomes Simpson's rule.

Simpson's rule is a 3rd-order method, i.e., it gives exact results for polynomials up to the 3rd degree.

Quote: The interesting windfall here is using the various variants of Runge-Kutta as additional methods for numerical integration. These variant methods are basically a "free bonus" method for numerical integration, which you can explore and use in performing numerical integration.

You can, but you'd be ill-advised to do so, because using Runge-Kutta methods to perform numerical integration of some f(x) is very inefficient. That's why no one uses them for that purpose, and why no book on numerical analysis would recommend them or even mention the possibility.

Why the inefficiency? Taking as an example the 4th-order Runge-Kutta method you mention: it performs 4 evaluations of f(x) per step and (being exactly equivalent to Simpson's rule) is exact for polynomials only up to degree 3. On the other hand, Gaussian integration, as in this small program here in the General Library: Gaussian integration, performs just 3 evaluations of f(x) and yet is exact for polynomials up to degree 5, so you get much greater accuracy (5th order vs. 3rd order) while performing 25% fewer evaluations of f(x) (3 vs. 4), i.e., also running faster.

If in doubt, try the two examples in the linked article with a 4th-order RK implementation and see what precision you get for 1 subinterval (1 step in RK). The same considerations apply to other RK methods.

That's why using Runge-Kutta methods for integration is just wasting time for an inferior result. They simply weren't designed for, and aren't suited to, that purpose; there are much better and well-known methods available, such as the aforementioned Gaussian integration and many others.

Regards.
V.
. 
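The efficiency comparison above can be checked with a short Python sketch (illustrative code, not the General Library program): 3-point Gauss-Legendre quadrature integrates a degree-5 polynomial exactly, while Simpson's rule (equivalent to one RK4 step) does not.

```python
import math

def gauss3(f, a, b):
    """3-point Gauss-Legendre quadrature on [a, b]; exact up to degree 5."""
    m, r = (a + b) / 2, (b - a) / 2        # midpoint and half-width
    x = math.sqrt(3 / 5)                   # Legendre node on [-1, 1]
    return r * (5/9 * f(m - r * x) + 8/9 * f(m) + 5/9 * f(m + r * x))

def simpson(f, a, b):
    """Simpson's rule (= one RK4 step on dy/dx = f(x)); exact up to degree 3."""
    return (b - a) / 6 * (f(a) + 4 * f((a + b) / 2) + f(b))

f = lambda x: x**5                 # degree-5 test polynomial
exact = 1 / 6                      # integral of x^5 over [0, 1]
print(gauss3(f, 0, 1) - exact)     # essentially zero
print(simpson(f, 0, 1) - exact)    # a noticeable error
```

With one fewer function evaluation, the Gaussian rule nails the degree-5 case that Simpson misses, matching the 5th-order vs. 3rd-order claim.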
Find All My HP-related Materials here: Valentin Albillo's HP Collection 

09-27-2018, 02:11 AM
Post: #5




RE: Side benefit from Runge-Kutta methods for ODE
(09-26-2018 10:45 PM)Valentin Albillo Wrote: Why the inefficiency ? Well, taking as an example the 4th order Runge Kutta method you mention, it performs 4 evaluations of f(x) per step, and (being exactly equivalent to Simpson's rule) it's exact for polynomials only up to degree 3.

The 4 evaluations are:

\(\begin{aligned} k_{1}&=f(t_{n},y_{n}),\\ k_{2}&=f\left(t_{n}+\frac{h}{2},\,y_{n}+\frac{h}{2}k_{1}\right),\\ k_{3}&=f\left(t_{n}+\frac{h}{2},\,y_{n}+\frac{h}{2}k_{2}\right),\\ k_{4}&=f\left(t_{n}+h,\,y_{n}+hk_{3}\right). \end{aligned}\)

But since the function \(f\) depends only on \(t\), we end up with:

\(\begin{aligned} k_{1}&=f(t_{n}),\\ k_{2}&=f\left(t_{n}+\frac{h}{2}\right),\\ k_{3}&=f\left(t_{n}+\frac{h}{2}\right),\\ k_{4}&=f\left(t_{n}+h\right). \end{aligned}\)

Thus \(k_{2}=k_{3}\). This reduces:

\(y_{n+1}=y_{n}+\tfrac{h}{6}\left(k_{1}+2k_{2}+2k_{3}+k_{4}\right)\)

to:

\(y_{n+1}=y_{n}+\tfrac{h}{6}\left(k_{1}+4k_{2}+k_{4}\right)\)

Compare this to the function evaluations of the 3rd-order method:

\(\begin{aligned} k_{1}&=f(t_{n},y_{n}),\\ k_{2}&=f\left(t_{n}+\frac{h}{2},\,y_{n}+\frac{h}{2}k_{1}\right),\\ k_{3}&=f\left(t_{n}+h,\,y_{n}-hk_{1}+2hk_{2}\right). \end{aligned}\)

Here we end up with:

\(\begin{aligned} k_{1}&=f(t_{n}),\\ k_{2}&=f\left(t_{n}+\frac{h}{2}\right),\\ k_{3}&=f(t_{n}+h). \end{aligned}\)

\(y_{n+1}=y_{n}+\tfrac{h}{6}\left(k_{1}+4k_{2}+k_{3}\right)\)

Therefore we can see that RK4 degrades to RK3.

Cheers
Thomas 
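The collapse of RK4 onto the 3rd-order weights can be verified directly in a few lines of Python (a sketch with illustrative names, assuming Kutta's classical 3rd-order scheme):

```python
def rk4_quad(g, t, h):
    """RK4 increment for dy/dt = g(t): k2 == k3, so the weights collapse."""
    k1, k2, k3, k4 = g(t), g(t + h / 2), g(t + h / 2), g(t + h)
    return h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def rk3_quad(g, t, h):
    """Kutta's 3rd-order increment for dy/dt = g(t): already 1-4-1 weights."""
    k1, k2, k3 = g(t), g(t + h / 2), g(t + h)
    return h / 6 * (k1 + 4 * k2 + k3)

g = lambda t: t**3                  # integrand depending on t only
print(rk4_quad(g, 0.0, 1.0))        # Simpson estimate of the integral of t^3
print(rk3_quad(g, 0.0, 1.0))        # identical value from the 3rd-order method
```

For f(t) = t^3 over [0, 1] both increments equal 1/4, which is also the exact integral, since Simpson's rule is exact through degree 3.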

09-27-2018, 09:03 AM
Post: #6




RE: Side benefit from Runge-Kutta methods for ODE
Thanks, Thomas, for the clarification with equations.


09-27-2018, 10:03 PM
(This post was last modified: 09-28-2018 02:41 AM by Namir.)
Post: #7




RE: Side benefit from Runge-Kutta methods for ODE
Has anyone tried to fake an f(x) into an f(x,y) by doing something like this:
f(x,y) = g(x) + y - y, where g(x) is the original function of x? :)

Namir
.... a truly original thinker .... LOL

PS: Maybe I am pulling a Ramanujan on this one. He presumably proved that: c = 1 + 2 + 3 + 4 + ... = -1/12. In his proof, Ramanujan corrupted the equations he was using in calculating (c - 3c) and finagled the corrupted equations into giving -1/12, which of course is very wrong! 
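The y - y trick does work in floating point (y - y is exactly 0 for any finite y), so a general ODE solver will happily integrate a pure function of x. A minimal Python sketch with illustrative names:

```python
import math

def rk4_solve(f, x0, y0, x1, n):
    """Integrate dy/dx = f(x, y) from x0 to x1 with n classical RK4 steps."""
    h = (x1 - x0) / n
    x, y = x0, y0
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h / 2 * k1)
        k3 = f(x + h / 2, y + h / 2 * k2)
        k4 = f(x + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        x += h
    return y

g = math.exp                       # want the integral of e^x over [0, 1]
f = lambda x, y: g(x) + y - y      # the "faked" f(x, y); y cancels exactly
print(rk4_solve(f, 0.0, 0.0, 1.0, 10))  # close to e - 1
```

Starting from y(0) = 0, the final y is just the composite Simpson estimate of the integral, so the trick reduces the solver to ordinary quadrature, echoing the efficiency caveats earlier in the thread.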
