Comment by ben_w
3 days ago
> Could this be an experiment to show how likely LLMs are to lead to AGI, or at least intelligence well beyond our current level?
You'd have to be specific about what you mean by AGI: all three letters mean different things to different people, and sometimes the term as a whole is used to mean something not present in the letters at all.
> If you could only give it texts and info and concepts up to Year X, well before Discovery Y, could we then see if it could prompt its way to that discovery?
To a limited degree.
Some developments can come from combining existing ideas and seeing what they imply.
Other things, like everything to do with relativity and quantum mechanics, would have required experiments. I don't think any of the relevant experiments had been done prior to this cut-off date, but I'm not absolutely sure of that.
You might be able to get such an LLM to develop all the maths and geometry for general relativity, and yet find the AI still tells you that the perihelion shift of Mercury is a sign of the planet Vulcan rather than of a curved spacetime: https://en.wikipedia.org/wiki/Vulcan_(hypothetical_planet)
An example of why you need to explain what you mean by AGI is:
https://www.robinsloan.com/winter-garden/agi-is-here/
> You'd have to be specific what you mean by AGI
Well, they obviously can't. AGI is not science, it's religion. It has all the trappings of one: prophets, sacred texts, an origin myth, an end-of-days myth and, most importantly, a means to escape death. Science? Well, the only measure of "general intelligence" would be to compare it against the only one we know of, human intelligence, and we have absolutely no means of describing that. We do not know where to start. This is why, when you scratch the surface of any AGI definition, you find only circular definitions.
And no, the "brain is a computer" is not a scientific description, it's a metaphor.
> And no, the "brain is a computer" is not a scientific description, it's a metaphor.
Disagree. A brain is Turing complete, no? Isn't that the definition of a computer? Sure, it may be reductive to say "the brain is just a computer".
Not even close. Turing completeness does not apply to the brain, plain and simple. That's something to do with algorithms, and your brain is not a computer, as I have mentioned. It does not store information. It doesn't process information. It just doesn't work that way.
https://aeon.co/essays/your-brain-does-not-process-informati...
Probably not actually Turing complete, right? For one thing, it isn't infinite, so it can't be.
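(Just to pin down the term being argued over: a system is Turing complete if it can simulate any Turing machine, which assumes an unbounded tape. A toy simulator makes the definition concrete; this is a minimal Python sketch of my own, not anything from the thread, and the binary-increment machine is only an example. The unbounded-tape assumption is also exactly why any finite physical system, brain or silicon, is strictly a finite-state machine rather than Turing complete.)

    def run(transitions, tape_str, start="right", blank="_", max_steps=10_000):
        tape = dict(enumerate(tape_str))      # sparse tape: position -> symbol
        state, head = start, 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = tape.get(head, blank)
            state, write, move = transitions[(state, symbol)]
            tape[head] = write
            head += move
        lo, hi = min(tape), max(tape)
        return "".join(tape.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

    # Example machine: binary increment (adds 1 to the binary number on the tape).
    increment = {
        ("right", "0"): ("right", "0", +1),  # scan right over the digits
        ("right", "1"): ("right", "1", +1),
        ("right", "_"): ("carry", "_", -1),  # hit the end, start carrying left
        ("carry", "1"): ("carry", "0", -1),  # 1 plus carry -> 0, keep carrying
        ("carry", "0"): ("halt",  "1",  0),  # absorb the carry and stop
        ("carry", "_"): ("halt",  "1",  0),  # carried past the most significant bit
    }

    print(run(increment, "1011"))  # 11 in binary -> prints "1100" (12)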
> And no, the "brain is a computer" is not a scientific description, it's a metaphor.
I have trouble comprehending this. What is "computer" to you?
Cargo cults are a religion: the things they worship they do not understand, but the planes and the cargo themselves are real.
There's certainly plenty of cargo-culting right now on AI.
Sacred texts, I don't recognise. Yudkowsky's writings? He suggests wearing clown shoes to avoid acquiring a cult of personality disconnected from the quality of the arguments; if anyone finds his works sacred, they've fundamentally misunderstood him:
- https://en.wikiquote.org/wiki/Eliezer_Yudkowsky
Prophets forecasting the end of days, yes, but those come too from climate science, from everyone who was preparing for a pandemic before covid and is still trying to prepare for the next one because the wet markets are still around, from economists trying to forecast growth or collapse and what would turn any given prediction of the latter into the former, and from the military forces of the world saying which weapon systems they want to buy. That does not make a religion.
A means to escape death, you can have. But it's on a continuum with life extension and anti-aging medicine, which itself is on a continuum with all other medical interventions. To quote myself:
- https://benwheatley.github.io/blog/2025/06/22-13.21.36.html
Basically looking for emergent behavior.