[Help] HP 50g: working with larger-than-memory datasets
03-26-2017, 11:11 AM (This post was last modified: 03-26-2017 11:13 AM by pier4r.)
Post: #1
[Help] HP 50g: working with larger-than-memory datasets
I apologize if this problem has already been discussed and solved; in that case a link to the discussion would be appreciated, since the only one I found with a brief search is this.

The use case:
- Imagine having a list of numbers (or rows containing more than one value) that cannot be saved in memory (~230 KB on the HP 50g), or that can be saved in memory but would leave only a few KB available for computations.

The problem:
- At least on the HP 50g, a list like this would not be a big problem to store: there is the internal flash (Port 2 on my 50g still has 600+ KB free) or the SD card (mine still has 2+ GB free). The problem is using this list for computations, since it cannot fit in RAM.

A possible solution:
- I remember reading something about "packing" data into blocks, so that one works each time with a smaller block that fits in RAM and leaves enough space for computations. This technique was used for games on older consoles like the PS1, Game Boy, and NES, which are nevertheless computing systems (and quite capable ones).

Now I do not know whether a similar technique was developed for the HP 50g (or other calculators using UserRPL/SysRPL) to split data into smaller blocks and feed them to a program "on the fly". More specifically, I do not know whether, if I save, say, a list in flash, I can access only part of it without splitting it into smaller lists before the computation.

So, in the best case, I'm searching for something like this:
Code:

- Given a list X stored in Port 2 or Port 3 of the 50g
- read only elements N to N+K of list X,
  without loading the entire X into RAM,
  only the read elements and the list structure.

In the second-best case, I'm searching for something like this:
Code:

- Given a list X stored in Port 2 or Port 3 of the 50g
- split the list X into smaller lists X1, X2, ..., XN of maximum size S,
  then feed them one after another to the main program in a "transparent" way
  (as if it were a single list).

(I deem this unlikely to be possible without modifying the main program.)
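To make the second idea a bit more concrete, here is a rough, untested UserRPL sketch of such a splitter. Everything in it is an assumption on my part: the chunks end up as backup objects X1, X2, ... in Port 2, the command line is built as a string and run with STR→, and STD display mode is assumed so that →STR produces plain integer digits for the names.

Code:

@ SPLIT: untested sketch. Takes a list (level 2) and a chunk size S
@ (level 1) and stores the pieces as backup objects X1..XN in Port 2.
@ Names, port, and the STR→ command-string trick are my assumptions.
«
  → lst s
  «
    lst SIZE → len
    «
      1 len FOR k
        lst k  k s + 1 -  len MIN  SUB          @ elements k .. MIN(k+s-1, len)
        ":2:X" k 1 - s / IP 1 + →STR + " STO" +
        STR→                                    @ build and run ":2:Xn STO"
      s STEP
    »
  »
»

The reverse direction (recalling chunk n) would build ":2:Xn RCL" as a string in the same way.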

If such methods/programs do not exist, of course it will be a pleasure for me to try writing small programs that implement them.

One could say: why don't you use a computer that has more capabilities?

That's true, but thanks to the math library already present on the 50g (and partially known to me), implementing a solution there is much faster than searching for a proper math library for a computer programming language (say SciPy, or equivalents for Java, C#, PHP, Lua and so on), learning to use it, debugging the implementation, etc. The 50g is much handier for this, with built-in functions and well-documented additional libraries.
This is also because computation time is not a big deal, at least not yet.

Then there is a second reason: I don't see why the 50g shouldn't sweat a bit, making use of Port 2/3 on larger-than-memory datasets. Sure, with the emulation layer the 50g already works hard for every action, and with newRPL coming (great project!) it will have less work to do. But even with the current emulation layer it can work harder. I don't want it to sit around doing little; I feel the 50g earns its periods of light use only after working hard for a long time.

Wikis are great, Contribute :)
03-26-2017, 01:57 PM
Post: #2
RE: [Help] HP 50g: working with larger-than-memory datasets
What you want would require low-level assembly language programming (assuming you want to use the emulated Saturn CPU). Depending on whether your data has lots of repetition, you should use either arrays or linked arrays. In order to use the data without creating a second copy of the entire set in RAM, you would need some assembly code that handles the bankswitching that will be needed in order to temporarily view the data and copy what is needed into RAM.

Graph 3D | QPI | SolveSys
03-26-2017, 06:34 PM
Post: #3
RE: [Help] HP 50g: working with larger-than-memory datasets
(03-26-2017 01:57 PM)Han Wrote:  What you want would require low-level assembly language programming (assuming you want to use the emulated Saturn CPU). Depending on whether your data has lots of repetition, you should use either arrays or linked arrays. In order to use the data without creating a second copy of the entire set in RAM, you would need some assembly code that handles the bankswitching that will be needed in order to temporarily view the data and copy what is needed into RAM.

Thanks for replying and for the direction. So I assume that no one has worked this out already, at least not publicly. Then I will see what I can do (I think my solution will be far less advanced than the one proposed).

Wikis are great, Contribute :)
03-27-2017, 12:21 AM
Post: #4
RE: [Help] HP 50g: working with larger-than-memory datasets
(03-26-2017 11:11 AM)pier4r Wrote:  The problem:
- At least on the HP 50g, a list like this would not be a big problem to store: there is the internal flash (Port 2 on my 50g still has 600+ KB free) or the SD card (mine still has 2+ GB free). The problem is using this list for computations, since it cannot fit in RAM.

A possible solution:
- I remember reading something about "packing" data into blocks, so that one works each time with a smaller block that fits in RAM and leaves enough space for computations. This technique was used for games on older consoles like the PS1, Game Boy, and NES, which are nevertheless computing systems (and quite capable ones).

Another possible solution:
Store your data in a file on the SD card. Then write, in C, a routine for GET and a routine for PUT that work directly on the SD card (you would never read the whole file into memory, just the objects you need, and you read them as separate objects, so they can be individually garbage collected).
These are two individual programs you can call from your RPL program; that would be the only C you need, the rest is RPL.
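As a rough illustration of the chunked access on the RPL side only (not the C routines described above), a coarser, pure-UserRPL approximation could look like the untested sketch below. It recalls a whole chunk object from the SD card instead of a single record; the DAT1, DAT2, ... file names and the 100-element chunk size are assumptions.

Code:

@ GETSD: untested sketch. Given a 1-based global index i, recall from
@ the SD card (Port 3) the chunk file holding element i and GET it.
@ Assumes the data was pre-split into 100-element lists stored as
@ DAT1, DAT2, ... on the card, and STD display mode.
«
  → i
  «
    ":3:DAT" i 1 - 100 / IP 1 + →STR + " RCL" +
    STR→                          @ build and run ":3:DATn RCL"
    i 1 - 100 MOD 1 + GET         @ element's position inside the chunk
  »
»

A PUT counterpart would recall the chunk the same way, modify it, and store it back, at the cost of rewriting the whole chunk.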

(03-26-2017 11:11 AM)pier4r Wrote:  ... and with newRPL coming (great project!) it will have less work to do.

As a matter of fact, newRPL has a complete set of SD card file-manipulation commands. You can read/write arbitrary objects from files on the SD card; it even has a MODIFY file mode where you can overwrite parts of a file without truncating it, which is useful if you store records of fixed length: you can literally "edit" the file without ever loading it into memory.
That's with RPL only.
Writing a VMM (virtual memory manager) that uses the SD card for swapping has always been in the back of my head. However, I have never coded a single line to make it happen.
03-27-2017, 09:44 AM (This post was last modified: 03-27-2017 09:44 AM by pier4r.)
Post: #5
RE: [Help] HP 50g: working with larger-than-memory datasets
Claudio, you are doing even more amazing work with hpgcc and related projects. I thought it was already great to have hpgcc2, even if discontinued, but you keep setting new records.

It is a pity that such awesome work only gets the recognition of a small community.

And thanks for the suggestion about the C solution. I always wanted to use hpgcc on the HP 50g, but I was torn between using C and learning more about the math library provided with the 50g, and I ended up doing the latter. I may start with C soon too. (I have to say my only experience with C, or rather C++, was a modification of eMule, so I'm a bit rusty there, but for the basics it should not be a serious problem.)

Wikis are great, Contribute :)
03-28-2017, 05:47 AM
Post: #6
RE: [Help] HP 50g: working with larger-than-memory datasets
Hello,

Assuming that you are using RPL or System RPL and the onboard flash, here is some advice:

Assuming that you modify small, random parts of the data at a time, you need to be mindful of the flash limitations. Flash is a read/write/erase type of medium where reads and writes are byte-oriented, but erases are block-oriented (64 KB in this case). So if you only modify a small part of the data each time, you would be better off separating your data into its constitutive elements, or at least into very small chunks, for example 10 elements at a time per variable.

eg:
:1:'list_n'
would contain elements 10*n to 10*n+9
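
For example (untested sketch, with LST0, LST1, ... as chunk names instead of 'list_n', and STD display mode assumed so that →STR yields plain integer digits), an accessor could map a 0-based element index i to the chunk variable and to the position inside it:

Code:

@ Untested sketch: fetch element i (0-based) from chunked storage
@ where :1:LSTn holds elements 10*n to 10*n+9.
«
  → i
  «
    ":1:LST" i 10 / IP →STR + " RCL" +
    STR→                          @ build and run ":1:LSTn RCL"
    i 10 MOD 1 + GET              @ position of element i inside the chunk
  »
»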

Cyrille

Although I work for the HP calculator group, the views and opinions I post here are my own. I do not speak for HP.
03-28-2017, 06:26 AM
Post: #7
RE: [Help] HP 50g: working with larger-than-memory datasets
(03-28-2017 05:47 AM)cyrille de brébisson Wrote:  So if you only modify a small part of the data each time, you would be better off separating your data into its constitutive elements, or at least into very small chunks, for example 10 elements at a time per variable.

eg:
:1:'list_n'
would contain elements 10*n to 10*n+9

Cyrille

Thanks for the tip. I know that flash is quite easy to wear out with write cycles (I know it from years of using OpenWrt on very limited hardware, and then from all the smartphones, tablets, etc.).

Of course the size of each write has to be "smart", but I was expecting that Port 2 (internal flash) nevertheless has wear-leveling algorithms, so one could use it for a pretty long time. Am I wrong on this one?

Wikis are great, Contribute :)
03-29-2017, 05:42 AM
Post: #8
RE: [Help] HP 50g: working with larger-than-memory datasets
Hello,

Port 2 does have a wear-leveling algorithm, BUT if you keep rewriting (small parts of) large objects, then you will wear things out much faster than if you only rewrite (small parts of) small objects...

Cyrille

Although I work for the HP calculator group, the views and opinions I post here are my own. I do not speak for HP.