(11-01-2021 10:42 PM)Albert Chan Wrote: (10-26-2021 02:12 AM)Gerson W. Barbosa Wrote: That gives linear convergence ( 25/12 digits per iteration ), as you can see by the Decimal Basic code and output ...

...

With the Decimal BASIC code, I checked the digits of accuracy, i.e. the difference from ζ(2) = pi^2/6

Note: digits = 1 - log10(abs(pi^2/6 - x)). So, if x=1, it is 1.1905 digits accurate.
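As a quick sanity check of that accuracy measure, here is a small Python sketch (not the Decimal BASIC code itself, just the same formula):

```python
import math

# Accuracy measure used above: digits = 1 - log10(|pi^2/6 - x|)
def digits(x):
    return 1 - math.log10(abs(math.pi**2 / 6 - x))

# For x = 1, |pi^2/6 - 1| ~= 0.64493, so the measure gives ~1.1905
print(round(digits(1), 4))  # 1.1905
```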

Anyway, we are only interested in differences.

n=100: 210.6882

n=101: 212.7781 → gained 2.0899 digits

n=102: 214.8680 → gained 2.0899 digits

I had noticed n = 478 was enough for 999 decimal digits. Your calculations suggest 1883/901 ≈ 2.0899 digits per iteration is a better estimate for the convergence rate than 25/12. I've replaced the line

LET n = CEIL(12*nd/25)

with

LET n = CEIL(901*nd/1883)

On another computer I tried, I got 999 decimal digits in 0.27 seconds (same processor, same clock speed, but a cleaner Windows installation).
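The revised estimate can be sanity-checked in a few lines (a Python sketch of the BASIC line above; nd is the target decimal-digit count):

```python
import math

rate = 1883 / 901           # digits gained per iteration
print(round(rate, 4))       # 2.0899 -- matches the observed per-step gain

def iterations_needed(nd):
    # Python equivalent of: LET n = CEIL(901*nd/1883)
    return math.ceil(901 * nd / 1883)

# For nd = 999, the ceiling lands at 479, one above the observed
# minimum of n = 478, so the estimate is slightly conservative.
print(iterations_needed(999))  # 479
```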