HP Forums

Inspired by my current problem with the HP 50g, I wonder what your experience is.

By computation I mean some problem solved through the calculator that was valuable to you, for your work, passion, curiosity, whatever.

I do remember that the problems I solve (and solved, pre-2013) with the HP 50g normally take some minutes in the average case (numeric solver, graphing), and at most an hour for some userRPL program or library.

Once I tried an extended version of the Savage benchmark (not something really useful for me), and after 3 days it was not finished. Then I pulled the plug, because it was not worth it (and poor batteries, once again).

So this is actually the first time that the HP 50g has been crunching for 20 hours straight on something that I actually care about, and I do not know whether it will finish in the next minute.

With the Sharp EL-506W the longest computation was around a few minutes for an integral (numerical integration with very small steps), and with the Casio Algebra fx7400 plus it was half an hour for a non-trivial graph; graphing on that machine was quite slow.

Then I also have a TI-89 that a fellow at the university decided to give me because "you are always playing around with the HP 50g (note 1), while my TI-89 is sitting in a drawer; would you like to have it?" That is a pretty nice memory. But since I have not yet finished wearing out my 50g, the TI-89 has seen very little use. I know that it can do a lot, though, especially with the large codebase on ticalc.org .

(note 1) At that time I discovered (knowledge at the moment lost for me) how to export equations from the HP 50g into Word 2007 instead of typing them in MathML, which is a pretty neat system. I typed the formulas in the equation editor and they were (and are) just beautiful; then somehow I produced a large enough GROB from those formulas, moved it to the PC, and converted it to a picture (PNG/JPG). They fit neatly into some digital notes that I shared, and it was way faster than before (doing everything with Word and Visio).
30+ years ago I needed to figure out the ringing time constant of a circuit with inductors, capacitors, and resistors, and didn't have the math background to do it the fast way, so I wrote a program for my TI-59 to do it kind of like a Simpson's approximation of an integral, chopping the ringing waveform up into tiny slivers and simulating the behavior of each of the components as they worked together. It ran all night—or maybe 24 hours. I think I did it a few times with different step sizes to make sure I had really converged on a valid answer. I'm not nearly as strong in math as some on this forum are, but I can always find a way to brute-force something to get the answers I need.
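The brute-force idea translates directly into a time-stepping simulation. Here is a minimal Python sketch of the same approach; the component values and step size are arbitrary assumptions for illustration, not Garth's actual circuit or TI-59 program:

```python
import math

# Example series RLC circuit (assumed values, for illustration only)
R, L, C = 10.0, 1e-3, 1e-6     # ohms, henries, farads
dt = 1e-8                      # time step in seconds

v, i, t = 1.0, 0.0, 0.0        # capacitor starts charged, no inductor current
crossings = []                 # times where v crosses zero going upward

for _ in range(1_000_000):     # simulate 10 ms in tiny slivers
    # Series RLC free response: L di/dt = -(R*i + v),  C dv/dt = i
    i += -(R * i + v) / L * dt
    old_v = v
    v += i / C * dt
    t += dt
    if old_v < 0 <= v:
        crossings.append(t)

# The ringing period follows from the spacing of the zero crossings
period = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
print(period)   # close to 2*pi/sqrt(1/(L*C) - (R/(2*L))**2), about 201 us here
```

As in the story above, one would rerun this with different step sizes and check that the answer stops changing before trusting it.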

A few years ago I used my HP-71, in BASIC, to generate large look-up tables (most being 65,536 cells each, and 16-bit, meaning 128KB per table) for hyperfast, accurate, 16-bit scaled-integer math, to use as a virtual co-processor for hardware that didn't have a normal co-processor. It took my HP-71 a few weeks to generate the tables and convert to Intel Hex files. I posted them, along with a discussion of the value of scaled-integer math and how to use it effectively, and information on how each table was calculated, on my website, at http://wilsonminesco.com/16bitMathTables/ .
(03-30-2017 07:30 PM)Garth Wilson Wrote: [ -> ]I'm not nearly as strong in math as some on this forum are, but I can always find a way to brute-force something to get the answers I need.
Same here, but I fail nevertheless. Therefore I try again, and again, and again, and...

Quote:A few years ago I used my HP-71, in BASIC, to generate large look-up tables (most being 65,536 cells each, and 16-bit, meaning 128KB per table) for hyperfast, accurate, 16-bit scaled-integer math, to use as a virtual co-processor for hardware that didn't have a normal co-processor. It took my HP-71 a few weeks to generate the tables and convert to Intel Hex files. I posted them, along with a discussion of the value of scaled-integer math and how to use it effectively, and information on how each table was calculated, on my website, at http://wilsonminesco.com/16bitMathTables/ .

This looks interesting. Is scaled-integer math the same as (thanks for the correction!) fixed-point math? I will check the link, thanks for sharing!
(03-30-2017 07:37 PM)pier4r Wrote: [ -> ]
(03-30-2017 07:30 PM)Garth Wilson Wrote: [ -> ]A few years ago I used my HP-71, in BASIC, to generate large look-up tables (most being 65,536 cells each, and 16-bit, meaning 128KB per table) for hyperfast, accurate, 16-bit scaled-integer math, to use as a virtual co-processor for hardware that didn't have a normal co-processor. It took my HP-71 a few weeks to generate the tables and convert to Intel Hex files. I posted them, along with a discussion of the value of scaled-integer math and how to use it effectively, and information on how each table was calculated, on my website, at http://wilsonminesco.com/16bitMathTables/ .

This looks interesting. Is scaled-integer math the same [as] fixed point math? I will check the link, thanks for sharing!

Fixed-point is a limited subset of the broader, more flexible scaled-integer math. "Fixed" moves the point right or left in increments of one or more digits at a time, while "scaled" can, in essence, put it within a digit, making it basically infinitely variable rather than moving in digit steps.

For example, a 16-bit integer representing 0-360° in fixed-point can go up to 36,000 (for .01° resolution) using base 10, or perhaps even 46,080 (for 1°/128); but then you have the problem of carry/borrow/rollover. But if you scale it so the whole circle is 65,536, it's as if the point sits somewhere inside a binary digit: you get a resolution of 1°/182.044, and the rollover works as it should, so 359°+2°=1°, or 1°-2°=359°. Starting at zero and adding 3π/2 radians (ie, 270°) puts you in the same place as subtracting π/2 radians (ie, 90°). The high bit always tells which half of the circle you're in. The odd scale factor remains throughout the calculations until it's time for human-readable output, then you multiply or divide by the appropriate constant. If the output is in the form of graphics rather than printed digits, the number may never need to exist in any format where the point is tucked neatly between digits in any base. Depending on the size of the image, you might multiply by yet another odd scale factor, which will change as you change the size of the window.

Or suppose you have a 0-7V range you want to measure with as much precision as possible with a 16-bit A/D converter and represent in memory with 16 bits. You might set the gain of the amplifier ahead of it so 7V gives an output of 65,535, meaning you get 9,362.3 counts per volt, 9.3623 counts per mV, or 106.81µV per count. The odd scale factor remains throughout the calculations until it's time for human-readable output, then you multiply or divide by the appropriate constant. Actually, my workbench computer has a fast 8-bit A/D converter that's 0-5V, or actually 0-255/256*(5.08V) since the regulator is a wee bit off; so one count is 19.84mV. A common reference for the converters is 4.096V so that the user can be lazy and use fixed-point rather than scaled-integer: you get 1mV per count if it's a 12-bit converter, 4mV per count if it's a 10-bit converter, 16mV per count if it's an 8-bit converter, etc., on even boundaries of binary digits.
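The binary-angle scheme above can be sketched in a few lines of Python; the helper names are made up for illustration, but the arithmetic is exactly the 16-bit wraparound described:

```python
# The full circle is 65,536 counts, so one count = 360/65536 deg (1/182.044 deg),
# and rollover comes for free from masking to 16 bits.
CIRCLE = 1 << 16

def deg_to_counts(deg):
    return round(deg * CIRCLE / 360) & 0xFFFF

def counts_to_deg(counts):
    return (counts & 0xFFFF) * 360 / CIRCLE

a = deg_to_counts(359)
b = deg_to_counts(2)
print(counts_to_deg((a + b) & 0xFFFF))                  # 359 + 2 wraps to ~1 deg
print(counts_to_deg((deg_to_counts(1) - b) & 0xFFFF))   # 1 - 2 wraps to ~359 deg
```

Note the results are only approximately 1° and 359°: the odd scale factor means whole degrees don't land exactly on counts, which is exactly the trade described above.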
Nice input!

Actually, scaled-integer math is then what I sometimes use for probability.

Instead of having [0,1], I say that the interval is, say, [0, 1000] in integer values. Then I scale it back later. Quite interesting; for many applications it could speed up the results quite a bit, coprocessor or not (AFAIK, floating-point coprocessors are never as fast as the ALU is for integers).
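That [0, 1000] probability scaling might look like this in Python (function names are mine, purely for illustration):

```python
SCALE = 1000   # represent a probability p in [0,1] as the integer round(p*1000)

def p_and(p, q):
    """AND of independent events: p*q, rescaled back into [0, SCALE]."""
    return p * q // SCALE

def p_not(p):
    """Complement: 1 - p in scaled form."""
    return SCALE - p

half = 500               # 0.5
print(p_and(half, half)) # 250, i.e. 0.25
print(p_not(250))        # 750, i.e. 0.75
```

Everything stays in fast integer arithmetic; only at output time do you divide by 1000 to recover the familiar [0,1] value.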
(03-30-2017 08:36 PM)Garth Wilson Wrote: [ -> ]-

Damn, I got lost on your site. Nice articles. I like the simple static pages full of content, instead of a fancy, barely readable site (common these days).
I ended up following your biography on the 6502.org forum; so much activity there!

I also shared the article on reddit, because it is a nice introduction.
For me it is 20 minutes

I was in the early physics courses at the university. The exam problem was to determine the best angle at which to shoot something, trying to hit a precise spot under some simple external conditions. As I usually did not go to any non-programming-related classes, I had no idea how to solve this.

However, I had my excellent, paint-faded, broken-keyed 49g+ with me, including the (optional) equation library to get the equations, and then I coded an ugly FOR loop to iterate over a range of angles I thought was optimal. Looking back, it was probably the worst thing I could do, but I continued with the other problems, and after a long while I had the answer... too bad that I had to explain to the teacher why my answer was something like 40.022 instead of the expected 40, and I did not have the normal explanation either. But I got half a point for the approach.
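The ugly-FOR-loop approach can be reconstructed roughly like this, in Python rather than userRPL, with made-up values for launch speed and target distance (the actual exam numbers are not known), scanning only low-trajectory angles:

```python
import math

g, v0, d = 9.81, 20.0, 40.0       # gravity, launch speed, target distance (assumed)

best_angle, best_err = None, float('inf')
a = 30.0
while a <= 45.0:                  # brute-force scan of the low-trajectory angles
    # flat-ground range formula: R = v0^2 * sin(2a) / g
    rng = v0**2 * math.sin(2 * math.radians(a)) / g
    err = abs(rng - d)
    if err < best_err:
        best_err, best_angle = err, a
    a += 0.001                    # step size limits the precision of the answer
print(best_angle)                 # about 39.4 degrees for these values
```

The finite step size is exactly why such a scan returns 40.022 instead of a clean 40: the loop reports the nearest grid point, not the exact root.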
(03-31-2017 07:07 AM)pier4r Wrote: [ -> ]
(03-30-2017 08:36 PM)Garth Wilson Wrote: [ -> ]-

Damn I got lost on your site. Nice articles. I like the static simple pages but full of content instead of a fancy site barely readable (common in those days).
I ended up following your biography on the 6502.org forum, how much activity there!

I also shared the article on reddit, because it is a nice introduction.

Thank you! I've heard of reddit, but I'm not familiar with it. My site gets 500-1000 page downloads a day. There would be a lot more material there if it didn't all take so much time. Any article takes a lot of time, but especially features like the stacks treatise, which is actually 19 logically organized articles plus appendices. I have several more major features planned, but this year I'm taking some time off from writing to build more stuff. (Small edits, however, happen all the time.) I hope you found that intro to be entertaining. Did you see my slide-rule page too? The reason the site is the way you see it is briefly explained at http://wilsonminesco.com/whydothis.html . A linked index of 77 of my own articles is at http://wilsonminesco.com/GarthArticleLinks.html .

now back to our regularly scheduled programming...
(03-31-2017 08:35 AM)Garth Wilson Wrote: [ -> ]Thank you! I've heard of reddit, but I'm not familiar with it.

Yes, the slide rule is the third article that I read, and of course those articles take time. Quality takes time in general. If you were writing spam, it would take less time. Anyway, since those articles take time and show quality, the site is nice!

Reddit is like a social network (text based), mostly focused on the latest posts. It is not so great for archives, but it is interesting because every community can spin up a subreddit (a sub-discussion place). So there are communities focused on very interesting things (check askhistorians), although I prefer forums for quality posting and the ability to extend a discussion.

(Note that, thanks to social networks with like/dislike such as reddit, youtube and so on, one could statistically confirm that there is no way to make everyone happy. The percentage of upvotes rarely hits 100 for posts with a score above 5, so please do not be annoyed that a post is not 100% upvoted.)

@eried: well, your approach is very similar to mine in the "little explorations" thread. I do not think it is bad, at least to get a first idea of the problem (then one may optimize and use a closed formula, if one is known).
An interesting long computation is the one described in the famous article "Long Live HP-15C!", which finds the number e to 208 digits (206 correct) in 62 minutes.
Many years ago I wrote Life for the TI-59; as far as I can remember it took about 30-40 minutes per generation for a 10x10 grid. The same program for the Casio FX-602P took about 5-10 minutes per generation.
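For reference, one Life generation is just a neighbor count per cell, which is why it is such a natural stress test for slow machines. A minimal Python sketch on a small non-wrapping grid (not the TI-59 or FX-602P program):

```python
def life_step(grid):
    """One Game of Life generation on a fixed (non-wrapping) grid of 0/1 cells."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # count the live neighbors in the 3x3 block around (r, c)
            n = sum(grid[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))
                    if (rr, cc) != (r, c))
            # birth on 3 neighbors; survival on 2 or 3
            nxt[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return nxt

# A horizontal "blinker" turns vertical, then back: period 2
g = [[0] * 5 for _ in range(5)]
for c in (1, 2, 3):
    g[2][c] = 1
print(life_step(g))
```

Every cell needs a full neighbor scan each generation, so a 10x10 grid is 100 cells times 8 neighbor lookups per step, which adds up quickly on a keystroke-programmable calculator.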
I finished a game of chess against my 28S.
It was taking the calculator well over an hour a move.

Pauli
(03-31-2017 12:43 PM)hibiki Wrote: [ -> ]An interesting long computation is the described in the famous article "Long Live HP-15C !", which finds the number "e" with 208 digits (206 correct), in 62 minutes.

Calculating e to as many digits as memory can hold was one of the first things I did with my then new 41C (+Quad RAM). I do not recall the exact time but the program ran from one evening till the next morning.

Dieter
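The usual scheme for many digits of e on a small machine is to keep the whole running sum as one big scaled integer and divide the term by successive k. A Python sketch of that idea (this mimics the 208-digit target from the article mentioned above, not the actual HP-15C or 41C programs):

```python
# e = sum over k of 1/k!, computed entirely in integer arithmetic,
# everything scaled by 10**digits plus a few guard digits.
digits = 208
scale = 10 ** (digits + 5)          # 5 guard digits absorb truncation error
term, total, k = scale, scale, 0    # term holds scale // k!
while term:
    k += 1
    term //= k                      # next term of the series
    total += term
e_digits = str(total // 10**5)      # drop the guard digits
print(e_digits[0] + "." + e_digits[1:])   # 2.718281828459045...
```

On a calculator the "big integer" is typically spread across many registers of a few digits each, with manual carries, which is why such runs take hours rather than milliseconds.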
(04-01-2017 01:02 AM)Paul Dale Wrote: [ -> ]I finished a game of chess against my 28S.
It was taking the calculator well over an hour a move.

Pauli

Interesting, and super small! I wonder if the program plays reasonable chess (not only valid moves but also useful ones) or just moves around.
It does a full-width, three-ply minimax search with alpha-beta pruning. It has no quiescence look-ahead and the scoring function is fairly simple.

It does better than just moving pieces around but isn't a challenge for anyone who's played much chess. Possibly around the level of personal computers in the mid-to-late 1970s, although probably not competitive with the better ones.

Pauli
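For the curious, the search Paul describes is plain minimax with alpha-beta pruning. Here it is sketched in Python over a toy game tree (interior nodes are lists of children, leaves are scores), which is of course far simpler than a real chess move generator and evaluator:

```python
def alphabeta(node, depth, alpha, beta, maximizing):
    """Minimax with alpha-beta pruning over a nested-list game tree."""
    if depth == 0 or not isinstance(node, list):
        return node                      # leaf: static score
    if maximizing:
        best = float('-inf')
        for child in node:
            best = max(best, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, best)
            if beta <= alpha:
                break                    # prune: opponent will avoid this line
        return best
    else:
        best = float('inf')
        for child in node:
            best = min(best, alphabeta(child, depth - 1, alpha, beta, True))
            beta = min(beta, best)
            if beta <= alpha:
                break
        return best

# Two-ply example: the maximizer picks the branch whose worst case is best.
tree = [[3, 5], [6, 9], [1, 2]]
print(alphabeta(tree, 3, float('-inf'), float('inf'), True))   # 6
```

In the last subtree the search cuts off after seeing the leaf 1, since the minimizer can already force a score below the 6 the maximizer has secured; that pruning is what makes three full plies feasible on a 28S.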
It might be interesting to see how a modern/fast RPL machine deals with the chess program. newRPL perhaps?

The program includes a graphical representation of the board and pieces -- quite an effort on the 28S screen. Each piece's graphic had to fit into a 3x3 pixel grid to allow a pixel gap between squares.

I believe my program was ported to the 48S with improved graphics.

Pauli
(04-01-2017 07:07 AM)Paul Dale Wrote: [ -> ]I believe my program was ported to the 48S with improved graphics.

http://www.hpcalc.org/details/785

There's also Peter Österlund's MLChess 1.14:

http://www.hpcalc.org/details/3067

The author says he has used your graphics.

And here is the final move of Pale Blue vs. G. W. Barbosa :-)

89235.7408 seconds, or 24h 47m 15.7s, to compute $$\pi$$ to 707 decimal places on an HP 50g. Most of this time, probably more than 90%, was due to the slow processing of long algebraic expressions in exact mode, though.
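One classic way to get hundreds of digits of pi with nothing but integer arithmetic is Machin's formula, pi/4 = 4*arctan(1/5) - arctan(1/239). A Python sketch of that approach (an illustrative method; the 50g program referred to above may well work differently):

```python
def arctan_inv(x, scale):
    """arctan(1/x) * scale via the alternating series, all in integers."""
    total = term = scale // x       # first term: 1/x
    n, sign = 1, 1
    while term:
        term //= x * x              # next power: 1/x^(n+2)
        n += 2
        sign = -sign
        total += sign * (term // n)
    return total

digits = 50
scale = 10 ** (digits + 10)         # 10 guard digits against truncation
pi = 4 * (4 * arctan_inv(5, scale) - arctan_inv(239, scale))
print(str(pi // 10**10)[:digits])   # 31415926535897932384...
```

The series for arctan(1/5) and arctan(1/239) converge fast enough that even 707 places need only a few thousand big-integer operations; on a 50g, keeping the work in numeric rather than exact-mode algebraic form would avoid most of the slowdown described above.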