[Help]Hp 50g : working with larger-than-memory datasets
03-28-2017, 06:26 AM
Post: #7
RE: [Help]Hp 50g : working with larger-than-memory datasets
(03-28-2017 05:47 AM)cyrille de brébisson Wrote:  So if you only modify a small part of the data each time, you would be better off separating your data into its constitutive elements, or at least into very small chunks, for example 10 elements at a time in a variable.

eg:
:1:'list_n'
would contain elements 10*n to 10*n+9

Cyrille

Thanks for the tip. I know that flash is quite easy to wear out with write cycles (I know it from years of using OpenWrt on very limited hardware, then from all the smartphones, tablets, etc.).

Of course the size of each write has to be chosen "smartly", but I was expecting that port 2 (internal flash) nevertheless uses wear-leveling algorithms, so one could use it for a pretty long time. Am I wrong on this one?
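
Just to check that I follow the scheme, here is a rough (untested) UserRPL sketch of a reader: element i (0-based) would sit at position i MOD 10 + 1 of 'list_n', with n = IP(i/10). For simplicity I use plain global variables instead of :1:/:2: port objects, GETEL is just a name I made up, and i is assumed to be an exact integer:

« → i
  «
    "'list_" i 10 / IP →STR + "'" +   @ build the string "'list_n'", with n = IP(i/10)
    OBJ→                              @ turn the string into the name 'list_n'
    RCL                               @ recall the 10-element chunk list
    i 10 MOD 1 +                      @ 1-based position of element i inside the chunk
    GET                               @ fetch the element
  »
»
'GETEL' STO

So 23 GETEL would pull element 23 out of 'list_2', and a matching writer would PUT the new value into that chunk and STO it back, rewriting only one small object instead of the whole dataset.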

Wikis are great, Contribute :)