Re: More trig--specifically speed vs accuracy Message #8 Posted by Ex-PPC member on 22 Oct 2001, 2:18 p.m., in response to message #1 by Cameron
Hi, Cameron:
You wrote:
"I managed sine in 94 instructions. To put that in
perspective, the sine routine uses 37 instructions. The
rest are taken up with computing y**x and x!. I could
probably shave two handfulls of instructions off if I
took out the error detection and optimised register
Even so, it would still be a weighty program."
You don't need to emulate y^x or x! on your HP-16C.
The Taylor Series Expansion for sin(x) is:
sin(x) = x - x^3/3! + x^5/5! - x^7/7! + ...
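Summing that series directly, term by term, looks like this (a minimal Python sketch just for illustration; this is the "naive" approach that needs both powers and factorials):

```python
import math

def sin_taylor(x, n_terms=8):
    """Approximate sin(x) by summing the first n_terms of the Taylor
    series, computing each term directly with y^x and x!."""
    total = 0.0
    for n in range(n_terms):
        k = 2 * n + 1  # odd exponents: 1, 3, 5, 7, ...
        total += (-1) ** n * x ** k / math.factorial(k)
    return total
```

For small x, eight terms already agree with the library sine to well beyond ten decimal places.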
If you compute this as a sum of N terms, the terms T(n)
are:
T(1) = +x, T(2) = -x^3/3!, T(3) = +x^5/5!,
T(4) = -x^7/7!, etc.
and it seems necessary to have both a y^x and an x!
routine. But you can compute each term T(n+1) as a function
of the previous one, using this simple relation:
T(n+1) = T(n) * [ -x*x/((2n+1)*(2n)) ]
So, for instance (n = 3): T(4) = T(3)*[ -x*x/(7*6) ]
= (+x^5/5!) * (-x^2/(7*6)) = -x^7/7!
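A quick spot-check of that step, sketched in Python (names here are illustrative, not from the original):

```python
def next_term(t_n, n, x):
    """Given term T(n) of the sine series, return T(n+1) via the
    recurrence T(n+1) = T(n) * [-x*x / ((2n+1)*(2n))]."""
    return t_n * (-x * x) / ((2 * n + 1) * (2 * n))

# reproduce the worked example with x = 1:
x = 1.0
t3 = x**5 / 120            # T(3) = +x^5/5!
t4 = next_term(t3, 3, x)   # should equal -x^7/7! = -1/5040
```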
as it should. Using this relation, you simply
keep the previous term T(n) somewhere, be it on the stack
or in a register, then compute the next term T(n+1) by
multiplying it by -x*x [which you'd do well to
compute once up front and store somewhere, to avoid
recomputing it in each iteration], then dividing by 2n
and again by 2n+1. Once you've got this new term T(n+1),
add it to the running sum, and keep it as the new term
from which to compute the next.
That way you don't need y^x or x!, and your
program will be much faster, and shorter too. With a little
perseverance, you should easily get a 20-step sine routine,
and even shorter is possible. :-)
Hope this helps ...