We take many calculations for granted in every field, every day, as if they were some sort of constant, some standard practice. It has always boggled me how many different implementations exist in real life and how those different implementations lead to very different results. Matlab in the early 2000s, for example, had serious precision problems with simple matrix calculations (we're not talking about edge cases), yet nobody rechecked any calculation. "I used program X to do Y, that is the result, therefore it must be correct. I didn't check how X does Y." Even engineers rarely care these days.

For example, with signals we usually take resampling for granted and never ask which algorithm is used. Or worse, we slap together two lines that apparently do the job and call it a day. Here is a comparison of very common audio resampling algorithms, where the differences are quite visible:

https://src.infinitewave.ca/
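As a tiny illustration of why the algorithm matters, here is a sketch (in Python with NumPy; the sample rates and the test tone are invented for the demo) of the naive "keep every Nth sample" resampler. With no anti-alias filter, a 400 Hz tone sampled at 1 kHz and decimated to 500 Hz folds back and reappears as a 100 Hz tone:

```python
import numpy as np

fs = 1000                      # original sample rate, Hz (arbitrary demo value)
f_sig = 400                    # test tone, Hz
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * f_sig * t)

# The "two lines that apparently do the job": keep every 2nd sample.
# There is no low-pass filter, so content above the new Nyquist
# frequency (250 Hz) aliases back into the band.
naive = x[::2]
fs_new = fs // 2

# Locate the strongest component of the decimated signal.
spectrum = np.abs(np.fft.rfft(naive))
freqs = np.fft.rfftfreq(len(naive), 1 / fs_new)
peak = freqs[np.argmax(spectrum)]
print(peak)  # 100.0 — the 400 Hz tone aliased to 500 - 400 = 100 Hz
```

A proper resampler low-pass filters before decimating; which filter it uses (length, stop-band attenuation, phase behaviour) is exactly what the comparison site above makes visible.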
As another example, when we buy a music album from our favourite artist, we never ask ourselves whether that edition has lost musical data and has a terrible dynamic range:

https://dr.loudness-war.info/
"I didn't check how X does Y." Even engineers rarely care these days.
I think it would be very expensive. Yes, few people check everything (most calculator/computer/spreadsheet users trust the calculator as well), but if you cannot trust your tools, a job takes a very long time. Of course, in some edge cases not checking the reliability of the tools can cause big problems (IIRC in electronics there are calibration labs for exactly this reason).

Then if someone finds a problem, even people who will never encounter that problem complain and want the tool replaced. See the Pentium FDIV bug: tons of people may never have noticed the flaw, yet they wanted a new CPU.
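A widely circulated way to check for the FDIV flaw was a single division that the faulty FPU got wrong around the fourth significant digit. A quick sketch of that well-known test case (any correct FPU, including every modern one, returns the right value):

```python
# The classic FDIV test case: a flawed Pentium returned roughly 1.333739,
# while the correct quotient is roughly 1.333820.
q = 4195835.0 / 3145727.0
residual = 4195835.0 - q * 3145727.0   # should be ~0; the flawed chip gave 256
print(q, residual)
```

An error that far up in the digits is easy to spot once you know to look; the point of the story is that almost nobody looked.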

(08-21-2023 04:30 PM)KeithB Wrote: [ -> ]Here is a good example from the Prime:

https://www.hpmuseum.org/forum/thread-20349.html

I am not sure what is going on in Home mode, but the answer was close enough that most people would not question it.

I think this is a problem that exists with most tools, when you try to exceed their designed accuracy.

In this particular case CAS is more accurate, and lots of people use the Prime in CAS mode exclusively, but CAS is not allowed during exams, hence the option to use Home.

In addition to that, all software has bugs. This also applies to Excel.

The only way forward is iterative improvement and, potentially, the regular recalibration of tools against a reference, which is good practice anyway.

If you calibrate your instruments regularly you know exactly where you stand.

(08-22-2023 05:16 AM)nickapos Wrote: [ -> ] (08-21-2023 04:30 PM)KeithB Wrote: [ -> ]Here is a good example from the Prime:

https://www.hpmuseum.org/forum/thread-20349.html

I am not sure what is going on in Home mode, but the answer was close enough that most people would not question it.

I think this is a problem that exists with most tools, when you try to exceed their designed accuracy.

...

I found it interesting that most of the recent Casio and TI scientific non-graphing calculators I tested came up with exactly the same answer to the above-referenced quadratic regression problem as Excel did. So far only the HP Prime in Home mode had less accurate answers. It appears that the Casios I tried have a resolution of 15 significant digits. The TIs have either 13 or 14 digits. The HP Prime has 12 digits, the same as the Saturn-based HPs. I also found it interesting that the HP-48/49/50 series did not have quadratic regression built into their feature set. TI graphing models have had this ability since the TI-82, introduced in 1993.
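These precision differences are easy to reproduce in software, too. As a sketch (the dataset below is invented, not the problem from the linked thread), fitting a quadratic through points with a large x offset shows how the textbook normal equations burn through working digits, and how simply re-centering x rescues the same method:

```python
import numpy as np

# Invented data: an exact quadratic sampled at large x values, the kind
# of input that exposes how many working digits a solver really has.
x = 1.0e6 + np.arange(10.0)
u = x - 1.0e6
y = 2.0 - 3.0 * u + 0.5 * u ** 2          # true quadratic coefficient: 0.5

# Raw design matrix: the columns x^2, x, 1 are nearly collinear at this
# offset, and forming V^T V for the normal equations squares the damage.
V_raw = np.vander(x, 3)
print(np.linalg.cond(V_raw.T @ V_raw))    # astronomically ill-conditioned

# Centering x leaves the quadratic coefficient unchanged but makes the
# very same textbook method numerically well behaved.
V_ctr = np.vander(x - x.mean(), 3)
a2, a1, a0 = np.linalg.solve(V_ctr.T @ V_ctr, V_ctr.T @ y)
print(a2)                                  # ≈ 0.5, recovered cleanly
```

Good implementations sidestep the problem internally (QR or SVD instead of normal equations, or centering and scaling the data first); a machine with only 12 or 13 working digits has far less headroom before the answer visibly drifts.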

(08-22-2023 06:16 AM)Steve Simpkin Wrote: [ -> ] (08-22-2023 05:16 AM)nickapos Wrote: [ -> ]I think this is a problem that exists with most tools, when you try to exceed their designed accuracy.

...

I found it interesting that most of the recent Casio and TI scientific non-graphing calculators I tested came up with exactly the same answer to the above-referenced quadratic regression problem as Excel did. So far only the HP Prime in Home mode had less accurate answers. It appears that the Casios I tried have a resolution of 15 significant digits. The TIs have either 13 or 14 digits. The HP Prime has 12 digits, the same as the Saturn-based HPs. I also found it interesting that the HP-48/49/50 series did not have quadratic regression built into their feature set. TI graphing models have had this ability since the TI-82, introduced in 1993.

Yeah, if I understand the evolution correctly, Home on the HP Prime is a direct descendant of the older HP calculators (although I am not enough of a historian to point out the exact models; I would assume the HP 39 etc.), while the CAS is a superior engine in almost every way, developed independently.

I remember Tim Wessman mentioning in a post that the Prime is superior to the HP 50g in almost every way, and I think he had this in mind: the Prime CAS is superior to the 50g CAS.

In any case it is good to know your tools and their limits.

(08-22-2023 08:31 PM)pier4r Wrote: [ -> ]Regarding "know the limits of your tools", "how X does Y" and reliance on Excel (and similar software): humanity ends up with a lot of untestable science (and thus science that is less scientific) and lots of business waste.

Biology and Excel: https://genomebiology.biomedcentral.com/...016-1044-7

Excel messing up if the user is not attentive: https://www.youtube.com/watch?v=yb2zkxHDfUE

Enron's errors in Excel: https://figshare.com/articles/journal_co...is/1222882

and surely there are more.

Yeah, for years I worked in the banking sector. I never treated Excel as a serious package. I always had a calculator at hand to double-check results (a Sharp back then), and we always used certified banking software to calculate the official numbers.

There are just too many examples where Excel failed to produce accurate results.