Comment by JanisErdmanis
7 hours ago
I still firmly believe that one ctx object and a hundred functions/methods is as bad as programming with plain variables defined in the global scope. If the ctx is composed from smaller data structures against which the functions are defined, then all is good. This is the opposite of the rule.
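For concreteness, here is a minimal Python sketch of the distinction being drawn; all names (Ctx, UserStore, App) are hypothetical and only illustrate the two shapes of code:

```python
from dataclasses import dataclass, field

# Shape 1: one opaque ctx object that every function reaches into,
# the style the comment compares to global variables.
@dataclass
class Ctx:
    users: dict = field(default_factory=dict)
    sessions: dict = field(default_factory=dict)
    cache: dict = field(default_factory=dict)

def add_user(ctx, name):
    # Every function takes the whole ctx, even if it only touches one field.
    ctx.users[name] = {}

# Shape 2: smaller data structures with their own functions,
# then a ctx composed from those pieces.
@dataclass
class UserStore:
    users: dict = field(default_factory=dict)

    def add(self, name):
        self.users[name] = {}

@dataclass
class App:
    store: UserStore = field(default_factory=UserStore)
```

Both shapes compute the same thing; the argument is about which one stays testable and maintainable as the number of functions grows.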
But why?
You keep saying you believe it, but that is literally what a database is, and what game state manipulation, string manipulation, iterator algorithms, list comprehensions, range algorithms, image manipulation, etc. all are. These are all instances where you use the same data structure over and over with as many algorithms and functions as you need.
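As an example of that pattern, here are a few independent Python functions all defined against the same plain data structure (a list), in the spirit of range algorithms or string manipulation; the function names are illustrative:

```python
def running_total(xs):
    # Prefix sums over a list.
    total, out = 0, []
    for x in xs:
        total += x
        out.append(total)
    return out

def pairwise(xs):
    # Adjacent pairs: [a, b, c] -> [(a, b), (b, c)].
    return list(zip(xs, xs[1:]))

def dedupe(xs):
    # Remove duplicates while preserving order.
    seen, out = set(), []
    for x in xs:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

One data structure, arbitrarily many functions, each still narrow and testable on its own.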
It’s about coupling and being able to maintain that in the long term. A narrow focus helps to test each individual unit in isolation from the others. It is true that a database appears to be a single data structure with hundreds of methods from the user's perspective, and that is fine, because someone else engineered and tested it for you. However, if you were to look into how a database is implemented, you would see a composition of data structures, like B-trees, that are tested in isolation.
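A toy Python sketch of that claim: a user-facing "database" facade built from a smaller internal structure that can be tested on its own. Here a sorted list stands in for a B-tree, and the names (SortedIndex, TinyDB) are invented for illustration, not taken from any real database:

```python
import bisect

class SortedIndex:
    """The internal piece: an ordered key-value index, testable in isolation."""
    def __init__(self):
        self._keys, self._vals = [], []

    def put(self, key, val):
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            self._vals[i] = val          # overwrite existing key
        else:
            self._keys.insert(i, key)    # insert, keeping keys sorted
            self._vals.insert(i, val)

    def get(self, key):
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            return self._vals[i]
        return None

class TinyDB:
    """The user-facing facade: looks like one object with many methods,
    but is composed from the independently tested index."""
    def __init__(self):
        self._index = SortedIndex()

    def insert(self, key, row):
        self._index.put(key, row)

    def lookup(self, key):
        return self._index.get(key)
```

From the outside, TinyDB is "one data structure with many methods"; from the inside, it is a composition of parts with their own tests.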
> It’s about coupling and being able to maintain that in the long term.
What does that mean? This is all the kind of abstract programming advice that sounds nice until someone needs an example.
> A narrow focus helps to test each individual unit in isolation from the others.
A function operating on a data structure is already a narrow focus.
> It is true that a database appears to be a single data structure with hundreds of methods from the user's perspective
And also from a reality perspective because it's literally what a database is about.
> However, if you were to look into how a database is implemented, you would see a composition of data structures, like B-trees, that are tested in isolation.
I don't know what point you're trying to make. That data structures should be tested? I don't think anyone is saying they shouldn't.