Rounding Error Worse in Binary or Decimal Representation?

01-28-2015, 01:51 PM
Post: #1




This thread, http://www.hpmuseum.org/forum/thread2583.html, raises the question of reducing rounding errors by using binary instead of decimal representation, or vice versa. My view is that as the number of distinct primes in the factorization of the number base increases, rounding errors will be reduced or avoided, since more fractions then have terminating expansions in that base. Accordingly, 2*5 = 10 as base should perform better than 2, and both would be majorized by 2*3*5 = 30, & so on. Or am I way off course?
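The claim rests on a standard fact: a reduced fraction p/q has a terminating expansion in base b exactly when every prime factor of q divides b. A minimal Python sketch of that test (the function name is mine, not from the thread):

```python
from fractions import Fraction
from math import gcd

def has_finite_expansion(num: int, den: int, base: int) -> bool:
    """True iff num/den terminates in the given base, i.e. every
    prime factor of the reduced denominator also divides the base."""
    d = Fraction(num, den).denominator  # reduce the fraction first
    g = gcd(d, base)
    while g > 1:          # repeatedly strip shared prime factors
        d //= g
        g = gcd(d, base)
    return d == 1         # nothing left => expansion terminates

# 1/10 is exact in base 10 but not in base 2,
# while 1/3 is exact in base 30 = 2*3*5 but in neither 2 nor 10.
```

So base 10 terminates everything base 2 does (plus fractions with fives in the denominator), and base 30 strictly dominates both, matching the ordering suggested above.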


Messages In This Thread 
Rounding Error Worse in Binary or Decimal Representation?  Gerald H  01-28-2015, 01:51 PM
RE: Rounding Error Worse in Binary or Decimal Representation?  Claudio L.  01-28-2015, 02:26 PM
