Comment by citizenpaul
4 days ago
Hmm. Perhaps I've just never encountered a hairy enough situation with them? That's what the eternal thought-tracker notepad on my desk is for, though. Maybe people are trying to do it all in their heads? Pen and paper are too old school for the cool new 1000x devs?
I still feel like this argument could be transferred to nearly any concept in CS though. Abstract enough anywhere and you will always start exceeding the brain's working memory.
A simple properly implemented doubly linked list or circular buffer is already above the level of most beginner C programmers. Though they're great exercises.
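For what it's worth, here's a minimal sketch of the circular-buffer half of that exercise (all names are mine; a real one would hold arbitrary payloads and handle concurrency):

```c
#include <stddef.h>

/* Minimal fixed-size circular buffer of ints -- a sketch, not production code.
   head is where the next write goes, tail where the next read comes from;
   count disambiguates the full and empty cases, which otherwise look alike. */
#define CBUF_CAP 4

typedef struct {
    int data[CBUF_CAP];
    size_t head, tail, count;
} cbuf;

static int cbuf_push(cbuf *b, int v) {
    if (b->count == CBUF_CAP) return 0;     /* full: refuse the write */
    b->data[b->head] = v;
    b->head = (b->head + 1) % CBUF_CAP;     /* wrap around at the end */
    b->count++;
    return 1;
}

static int cbuf_pop(cbuf *b, int *out) {
    if (b->count == 0) return 0;            /* empty: nothing to read */
    *out = b->data[b->tail];
    b->tail = (b->tail + 1) % CBUF_CAP;     /* wrap around at the end */
    b->count--;
    return 1;
}
```

The full-vs-empty ambiguity (head == tail in both cases) is exactly the kind of detail beginners get wrong, which is why the explicit count is there.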
I don't think I'd be comfortable saying I understand something if I can't get it 100% clear just from my own thoughts and re-explain it to someone.
Everything is just numbers; then we pretend they are arrays, pointers, objects, classes, floats, websites, applications, certificates, etc. The imaginary array can really only contain numbers, but we can pretend the numbers are other things: unicorns, rainbows, laser unicorns, etc.
We are just pretending, there is nothing to understand?
To take this a step further.
They aren't even numbers. They're voltage-high and voltage-low signals.
Numbers don't even exist! You'll never find a 2 in nature. You'll find two things, but you'll never find the 2 itself.
And all 2s are the same 2. But every voltage signal representing a 2 is a completely different voltage signal. Sometimes they aren't even voltage signals! Sometimes they're magnetic flux signals! Sometimes they're electrical field signals! Sometimes they're photons modulated down a wire made of glass!
But the 2 they represent? Not even that is 2. It's 10!
Right, I couldn't find the words. Numbers are also imaginary technology, but if you treat high as on, on as 1, low as off, and off as 0, it's so close to reality that it will be hard to find exceptions in software.
Like, we pretend it is "high" for convenience while we really mean "higher". For all practical purposes our imaginary world works! Hurray!
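The "2 is really 10" point is easy to make concrete: once you've agreed high means 1 and low means 0, the value 2 in a byte is just the pattern 00000010. A throwaway C helper (name is mine) that spells out a byte's bits:

```c
/* Write the 8 bits of v into out (at least 9 bytes), most significant
   bit first, as the characters '1' and '0'. The "number" 2 comes out
   as "00000010" -- the binary pattern the hardware actually stores. */
void bits_of(unsigned char v, char out[9]) {
    for (int i = 0; i < 8; i++)
        out[i] = ((v >> (7 - i)) & 1) ? '1' : '0';
    out[8] = '\0';
}
```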
I can accept that everything else is fake, but UNICORNS ARE REAL, dammit!
That's what most people do: draw diagrams. Some things are still easy to get wrong, though, like pointer arithmetic, which is a language feature and not just arrows to things and indirection.
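The classic pointer-arithmetic trap, in a tiny C sketch (the helper name is mine): `p + n` moves by n *elements*, not n bytes, so the byte distance is `n * sizeof(*p)`. That scaling is precisely what an arrows-and-boxes diagram doesn't show.

```c
#include <stddef.h>

/* Returns the byte distance between &base[n] and &base[0].
   base + n is element arithmetic (scaled by sizeof(int));
   casting to char* switches to byte arithmetic, since a char
   is one byte by definition. */
ptrdiff_t byte_offset_of_element(const int *base, size_t n) {
    const int *p = base + n;                      /* advances n elements */
    return (const char *)p - (const char *)base;  /* measured in bytes  */
}
```

On a typical platform with 4-byte ints, `byte_offset_of_element(a, 2)` is 8, not 2, which is where the mental arithmetic usually slips.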