Comment by trissylegs
5 years ago
I once did get to look at optimising a bad load time... only because at that point it was crashing due to running out of memory. (32-bit process)
In this case it was JSON, but using .NET, so Newtonsoft is at least efficient. The issues were many cases of (fixes sketched below):

* Converting strings to lowercase to compare them, as the keys were case-insensitive. (I replaced this with a StringComparison.OrdinalIgnoreCase comparison.)
* Redundant dictionaries.
* Using a Dictionary<Key, Key> as a set; replaced with a HashSet<Key>.
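A minimal sketch of those two fixes, with made-up names; for dictionary keys the comparer form, StringComparer.OrdinalIgnoreCase, does the same job as the StringComparison enum value used with string.Equals:

```csharp
using System;
using System.Collections.Generic;

class LookupFixes
{
    static void Main()
    {
        // Before (hypothetical): key.ToLowerInvariant() on every lookup
        // allocated a fresh string just to compare.
        // After: let the dictionary itself compare case-insensitively.
        var data = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            ["ClientId"] = "42"
        };
        Console.WriteLine(data["clientid"]); // 42, no lowercased copy made

        // Before (hypothetical): Dictionary<Key, Key> stored each key twice.
        // After: HashSet<Key> stores each key once.
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            "ClientId"
        };
        Console.WriteLine(seen.Contains("clientid")); // True
    }
}
```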
The data wasn't meant to be that big, but when a big client was migrated it ended up with 200MB of JSON. (If their data had been organised differently, it would've been split across many JSON blobs instead.)
It would also be nice to handle it all as UTF-8, like System.Text.Json does. That would halve all the strings, saving a fair bit. (I mean the JSON blob it starts with gets converted to UTF-16, because .NET strings are UTF-16.)
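A sketch of that UTF-8 point, assuming the blob lives in a file (the path is made up): System.Text.Json can parse straight from UTF-8 bytes, so the whole document never has to be widened to UTF-16 up front.

```csharp
using System.IO;
using System.Text.Json;

class Utf8Load
{
    static void Main()
    {
        // Parse directly from the UTF-8 stream; only the values you actually
        // pull out as .NET strings pay the UTF-16 conversion cost.
        using FileStream stream = File.OpenRead("client-data.json"); // hypothetical path
        using JsonDocument doc = JsonDocument.Parse(stream);
        JsonElement root = doc.RootElement;
        // ... walk root here without ever holding the 200MB as one string
    }
}
```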