I did a very very ugly quick hack in python. Took the example JSON, made the one list entry a string (lazy hack), repeated it 56,000 times. That resulted in a JSON doc that weighed in at 10M. My initial guess at 60,000 times was a pure fluke!
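For anyone curious, the hack was roughly the following. The entry string below is a made-up stand-in (the original example JSON isn't reproduced here), so treat it as a sketch rather than the actual script:

    import json

    # Stand-in for the example JSON's single list entry; a guess at roughly
    # the right size, so the exact byte count will differ.
    entry = '{"key":"SOME_CATALOG_ITEM","price":1234567,"flags":[1,2,3],"label":"placeholder text to pad the entry out to a plausible length"}'

    doc = {"items": [entry] * 56_000}   # one stringified entry, repeated 56,000 times

    with open("gta.json", "w") as f:
        json.dump(doc, f)               # lands in the same ballpark as the ~10M figure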
Dumped it into a very simple sqlite db:
$ du -hs gta.db
5.2M gta.db
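The load itself is a handful of lines. A minimal sketch, assuming the generated gta.json from above and a made-up one-table, one-column schema:

    import json
    import sqlite3

    # One text column per entry; the schema and names here are invented,
    # the point is only how little code the conversion takes.
    data = json.load(open("gta.json"))

    con = sqlite3.connect("gta.db")
    con.execute("CREATE TABLE IF NOT EXISTS items (entry TEXT)")
    con.executemany("INSERT INTO items (entry) VALUES (?)",
                    ((e,) for e in data["items"]))
    con.commit()
    con.close()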
Even 10 MB is peanuts for most of their target audience. Stick it in an sqlite db, punt that across, and they'd cut out all of the parsing time too.
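For illustration, the client side of that is about as simple as it gets (same made-up schema as above):

    import sqlite3

    # Client side: open the shipped file and query it directly. No parse
    # step, and SQLite only pages in what the query actually touches.
    con = sqlite3.connect("gta.db")
    rows = con.execute("SELECT entry FROM items LIMIT 10").fetchall()
    total = con.execute("SELECT COUNT(*) FROM items").fetchone()[0]
    con.close()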
They probably add more entries over time (and maybe update/delete old ones), so you’d have to be careful about keeping the local DB in sync.
So just have the client download the entire DB each time. Can’t be that many megabytes.
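A hedged sketch of that, assuming the server exposes the DB file over plain HTTP with ETag support (the URL and helper are made up): a conditional GET means "download the entire DB each time" costs almost nothing when the catalogue hasn't changed.

    import requests

    DB_URL = "https://example.com/catalog/gta.db"   # hypothetical endpoint

    def refresh_db(path="gta.db", etag=None):
        headers = {"If-None-Match": etag} if etag else {}
        resp = requests.get(DB_URL, headers=headers)
        if resp.status_code == 304:          # unchanged since last download
            return etag
        resp.raise_for_status()
        with open(path, "wb") as f:
            f.write(resp.content)
        return resp.headers.get("ETag")      # remember for the next check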
I think just using a length-encoded serialization format would have made this work reasonably fast.
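To make that concrete, here's a minimal sketch of one such format, taking "length-encoded" to mean length-prefixed records (my interpretation, not necessarily the commenter's exact proposal):

    import struct

    # Each record is a 4-byte little-endian length followed by that many
    # bytes, so the reader jumps from record to record instead of scanning
    # for delimiters or quoting.
    def encode(records):
        out = bytearray()
        for r in records:
            data = r.encode("utf-8")
            out += struct.pack("<I", len(data))
            out += data
        return bytes(out)

    def decode(buf):
        records, offset = [], 0
        while offset < len(buf):
            (n,) = struct.unpack_from("<I", buf, offset)
            offset += 4
            records.append(buf[offset:offset + n].decode("utf-8"))
            offset += n
        return records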
Or just any properly implemented JSON parser. That's a laughably small amount of JSON, which should easily be parsed in milliseconds.
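Easy to sanity-check with even CPython's stock json module (timings vary by machine, but a file this size should come in well under a second):

    import json
    import time

    with open("gta.json") as f:      # the ~10 MB document generated earlier
        raw = f.read()

    start = time.perf_counter()
    data = json.loads(raw)
    elapsed = time.perf_counter() - start
    print(f"parsed {len(raw) / 1e6:.1f} MB in {elapsed * 1000:.0f} ms")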