Comment by handoflixue

12 days ago

> For example, you can reliably train an LLM to produce accurate output of assembly code that can fit into a context window. However, let's say you give it a terabyte of assembly code - it won't be able to produce correct output, as it will run out of context.

Fascinating reasoning. Should we conclude that humans are also incapable of intelligence? I don't know any human who can fit a terabyte of assembly into their context window.

Any human who would try to do this is probably a special case. A reasonable person would break the problem into sub-problems and create interfaces to glue them back together... a reasonable AI might do the same.
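That decompose-and-glue approach can be sketched roughly like this (a minimal illustration only; `analyze_chunk` is a hypothetical stand-in for a model call, and the 1000-character budget is made up):

```python
CONTEXT_WINDOW = 1000  # pretend per-call budget; real windows are token-based

def chunk(text, size=CONTEXT_WINDOW):
    """Yield consecutive slices that each fit in one context window."""
    for i in range(0, len(text), size):
        yield text[i:i + size]

def analyze_chunk(piece):
    # Placeholder for a real model call; here we just report the chunk size.
    return f"summary({len(piece)} chars)"

def analyze_large_input(text):
    # Map over the chunks, then glue the partial results back together.
    partials = [analyze_chunk(p) for p in chunk(text)]
    return " + ".join(partials)

big_input = "x" * 2500  # larger than any single "context window"
print(analyze_large_input(big_input))
```

The point is that no single call ever sees the whole input; the "interfaces" are the summaries each sub-problem hands back for a final combining pass.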

  • I can tell you from first-hand experience that Claude plus the Ghidra MCP is very good at understanding firmware, labeling functions, finding buffer overflows, and patching in custom functionality.

On the other hand, the average human has a context window of 2.5 petabytes that streams inference 24/7 while consuming the energy equivalent of a couple of sandwiches per day. Oh, and it can actually remember things.

  • Citation desperately needed? Last I checked, humans could not hold the entirety of Wikipedia in working memory, and that's a mere 24 GB. Our GPU might handle "2.5 petabytes," but we're not writing all of that to disk - in fact, most people have terrible memory of basically everything they see and do. A one-trick visual-processing pony is hardly proof of intelligence.

    • I think the idea is that we may not store 2.5 petabytes of facts like Wikipedia, but we do store a ton of “data” in the form of innate knowledge, memories, etc.

      I don’t think human memory/intelligence maps cleanly to computer terms though.