
Comment by xmprt

3 days ago

You're either overestimating the capabilities of current AI models or underestimating the complexity of building a web browser. There are tons of tiny edge cases and standards to comply with where implementing one standard will break 3 others if not done carefully. AI can't do that right now.

Even if AI never achieves the ability to perform at this level on its own, it is clearly going to be an enormous force multiplier, allowing highly skilled devs to tackle huge projects more or less on their own.

  • Skilled devs compress, not generate (expand).

    https://www.youtube.com/watch?v=8kUQWuK1L4w

    The "discoverer" of APL tried to express as many problems as he could with his notation. First he found that notation expands and after some more expansion he found that it began shrinking.

    The same goes for Forth, which provides the means for a Sequitur-compressed [1] representation of a program (see the sketch below).

    [1] https://en.wikipedia.org/wiki/Sequitur_algorithm

    Myself, I always strive to delete code or to replace it with a shorter version: first, to understand it better, and second, so there is less to read when I come back to it.
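
    For anyone who hasn't met Sequitur: it infers a grammar whose rules name the repeated substrings of the input, so the compressed result reads much like a well-factored Forth program. Below is a minimal sketch of the idea in Python. To hedge: real Sequitur builds the grammar incrementally in one pass while enforcing digram-uniqueness and rule-utility invariants; this batch version (closer to Re-Pair) simply keeps folding the most frequent repeated pair into a fresh rule.

    ```python
    # Simplified grammar-based compression in the spirit of Sequitur.
    # NOTE: a sketch, not the real algorithm -- Sequitur proper is
    # incremental, and overlap handling / rule utility are simplified here.
    from collections import Counter

    def compress(seq):
        """Fold the most frequent adjacent pair into a new rule until
        no pair repeats, yielding a small grammar for the sequence."""
        rules = {}              # rule name -> the pair it expands to
        seq = list(seq)
        while True:
            pairs = Counter(zip(seq, seq[1:]))
            if not pairs:
                return seq, rules
            pair, count = max(pairs.items(), key=lambda kv: kv[1])
            if count < 2:       # nothing repeats any more
                return seq, rules
            name = f"R{len(rules)}"
            rules[name] = pair
            out, i = [], 0
            while i < len(seq):  # replace left to right, non-overlapping
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                    out.append(name)
                    i += 2
                else:
                    out.append(seq[i])
                    i += 1
            seq = out

    body, rules = compress("abcabcabc")
    print(body, rules)
    # ['R2', 'R1'] {'R0': ('a', 'b'), 'R1': ('R0', 'c'), 'R2': ('R1', 'R1')}
    ```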

It's most likely both.

> There are tons of tiny edge cases and standards to comply with where implementing one standard will break 3 others if not done carefully. AI can't do that right now.

Firstly, the CI is completely broken: every commit fails all of the tests, and looking closely at the code, it is exactly what you would expect from unmaintainable slop.

More lines of code are not a good measure of robust software, especially when the code does not work.

Not only edge cases and standards, but also tons of performance optimizations.