Comment by pjc50
1 year ago
> For my personal side project hobby work it's all in one file.
Ok, so this is work, but:
    find . -name '*.cs' -type f -print0 | wc --files0-from=-
    1035617 3438912 47446211 total
Not many editors are comfortable with a million-line document that's 47 MB. And that doesn't include generated code (which I very rarely need to look at, but which is right there under F12 if I do).
A 100 GB file was OK in those editors, even with syntax highlighting. That's an extreme case, since saving did take some time, but there are ways to optimize that should it ever become popular. IMHO 640K per file is enough for everybody.
For Emacs, there’s a package for editing very large files: <http://elpa.gnu.org/packages/vlf.html>
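(If I remember right, once it's installed you open the file with `M-x vlf` and it gets visited in chunks rather than loaded whole, so even multi-gigabyte files stay responsive.)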
Thanks for the hint about `--files0-from`, BTW. I thought I needed `xargs` for this.
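For the record, here's a rough sketch of the `xargs` version I had in mind. The catch is that `xargs` may split a long file list across several `wc` invocations, so you can end up with more than one "total" line, whereas `--files0-from` keeps everything in a single run:

    # May print several "total" lines on big trees, since xargs
    # re-invokes wc whenever the argument list gets too long:
    find . -name '*.cs' -type f -print0 | xargs -0 wc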
We need tools that can process command-line output and large files, build tables of contents dynamically from rules about the architecture or from comments, and allow efficient navigation. I need overlapping buckets or tags.
I've been thinking about this lately, but what I've seen in IDEs is not what I want.
I liked OpenGrok.
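Nothing I know of does exactly that out of the box, but as a crude sketch of the idea: assuming the codebase marks sections with C# `#region` comments (any comment convention would do), you can already generate a flat table of contents from the shell and feed it to whatever picker you like:

    # Sketch: emit file:line:title entries from #region markers.
    # Nested regions naturally give overlapping "buckets".
    find . -name '*.cs' -type f -print0 \
      | xargs -0 grep -Hn '#region' \
      | sed 's/#region[[:space:]]*//'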
Are you willing to do that by hand? Have you heard of the author of _The Art of Computer Programming_ and the technique he created: Literate Programming?