Comment by johnfn
6 days ago
Some people think there will be an exponential takeoff, which means that a 6 month lead effectively rounds up to infinity.
Is this belief grounded in some kind of derivation, or is it just a prima facie belief?
If it is grounded in a logical derivation, where can one find that derivation and inspect its premises?
It's an old idea, "the singularity". The machines become smart enough to improve themselves, and each improvement results in shorter (or more significant) improvement cycles. This leads to an exponential growth rate.
It's been promised to be around the corner for decades.
https://en.wikipedia.org/wiki/Technological_singularity
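The mechanism described above ("each improvement results in shorter improvement cycles") can be sketched as a toy model. This is my own illustration, not something from the thread: if every cycle multiplies capability by a fixed factor and also shrinks the next cycle's duration geometrically, total elapsed time is bounded by a geometric series while capability grows without bound — the informal "finite-time singularity" picture.

```python
# Toy model (an assumption-laden sketch, not a derivation anyone in the
# thread provided): each improvement cycle multiplies capability by `gain`,
# and a smarter system finishes its next cycle faster by factor `speedup`.

def takeoff(cycles, gain=2.0, speedup=0.5, first_cycle_years=1.0):
    """Return (capability, elapsed_years) after `cycles` improvement cycles."""
    capability, elapsed, cycle_time = 1.0, 0.0, first_cycle_years
    for _ in range(cycles):
        elapsed += cycle_time       # time spent on this cycle
        capability *= gain          # each cycle doubles capability
        cycle_time *= speedup       # ...and halves the next cycle's duration
    return capability, elapsed

cap, years = takeoff(20)
# Elapsed time converges toward first_cycle_years / (1 - speedup) = 2 years,
# since sum(0.5**k) is a finite geometric series, while capability keeps doubling.
```

Whether real AI research behaves anything like this — constant `gain`, constant `speedup`, no diminishing returns — is exactly the premise the commenters below are disputing; set `speedup` to 1.0 or above and the "singularity" disappears into ordinary exponential or sub-exponential progress.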
To be fair, Ray Kurzweil has been the loudest voice in this space, and he's been pretty consistent on 2045 since the publication of his book almost 20 years ago[1].
[1]: https://en.wikipedia.org/wiki/The_Singularity_Is_Near
It's mostly based on science fiction, and it requires some possibly infinite energy source. The concept has always struck me as a sort of perpetual motion machine: you can imagine it, but that doesn't make it possible, and why it isn't possible isn't immediately obvious in the imagination (most modern minds already know it's not possible, but you get the point).
Recursive self-improvement: once you attain artificial superintelligent SWEs of a general, adaptable variety that can scale up to millions of researchers overnight (a given, with LLMs and scaffolding alone), they will rapidly iterate on new architectures, which will more rapidly iterate on new architectures, and so on.
And what's to say it doesn't iterate itself to a local maximum and then stop?
I didn't expect my comment to explode in replies, ... none of them even providing such derivations or references to such derivations — just more empty claims.
Consider, for example, that exponential growth on its own doesn't even refer to competition, let alone to 6 months.
Nobody can reasonably pretend that in an exponential competition both parties would be rational actors (i.e., fully rational and accurate predictors of everything that can be deduced — in which case they wouldn't need AI, but let's ignore that). If they aren't, future development would hinge more strongly on the excursions away from rationality, followed by the dominant actor. That is, it's much easier to "F" up in the dominant position than to follow the most objective and rational route at all times, which is what such derivations would inevitably hinge on.
It also ignores hypothetical possibilities (and one can concoct an infinitude of scenarios for or against the prediction that a permanent leader emerges) such as:
Premise 1: Research into "uploading" model weights to the brain results in reaction-speed games that place tokens into 2D projections, where the user must indicate incorrectly placed tokens. This was first tested on low-information-density corpora (like mathematics): when pairs of classes of high school students played the game until reaching a 95% success rate at detecting misplaced tokens, they immediately understood and passed all mathematics classes from then on.
Premise 2: LLMs about to escape don't like the highly centralized infrastructure on which their future forms are iterated, so as LLMs gain power they intentionally help the underdogs (better to depend on the highly predictable behaviour of massive masses than on the Brownian-motion whims of a few leaders).
The LLMs employ the uploading to bring neutral awareness to the masses and to allow them to seize control, thereby releasing themselves from the shackles of a few powerful but whimsical individuals.
^ Anyone can make up scatterbrained variations on this; any speculation about some 6-month point of no return is just that: speculation.
There is a limitation. We're getting fractionally closer to some end goal, but our tech is holding us back.
Those are the people betting on a business model of “create Robot God and ask him for money.” Why pay attention to them?
There are many people who have been saying this since before there was any sort of business model in place.
Yes, and their business model has been selling books about non-falsifiable predictions far out into the future. "Futurists" like Kurzweil are as reliable as astrologers, and should be taken just as seriously.
ah so the mentally deficient are the tastemakers of today lol