Comment by ben_w
12 hours ago
> robots inevitably are going to do the majority of work - there’s no real doubt about it is there?
There's a lot of doubt that the AI and compute to enable that would happen on commercially relevant timescales.
Consider: "do the majority of work" is a strict superset of "get into a car and drive it". The power envelope available to an android is much smaller than a car's, and the recently observed rate of improvement in compute hardware efficiency suggests it will take 16-18 years to bridge that gap; even accounting for algorithmic efficiency improvements, that still leaves a decade between "car that can drive itself" and "android that can drive a car" (for any given standard of driving).
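To make the shape of that estimate concrete, here's a hedged sketch of the arithmetic. The power budgets and doubling time below are my own illustrative assumptions, not figures from the comment: suppose a car can spare ~2 kW for driving compute while an android's envelope leaves ~100 W, and hardware efficiency doubles roughly every 4 years.

```python
import math

# Illustrative assumptions, not sourced figures:
car_compute_budget_w = 2000.0     # assumed compute budget available in a car (W)
android_compute_budget_w = 100.0  # assumed compute budget in an android (W)
doubling_time_years = 4.0         # assumed hardware efficiency doubling time

# Years for ops/W to improve enough that the android's budget does the car's job
gap_ratio = car_compute_budget_w / android_compute_budget_w
years_to_bridge = math.log2(gap_ratio) * doubling_time_years

print(f"power gap: {gap_ratio:.0f}x -> ~{years_to_bridge:.1f} years")
# → power gap: 20x -> ~17.3 years
```

Under those assumed inputs the result lands in the 16-18 year range the comment quotes; different (equally defensible) budgets or doubling times move it by years in either direction.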
And that's a decade gap even if the android only had to drive a car and do no other labour.
You can't get around this (for an economy-wide significant number of androids) by moving the compute to a box plugged into the mains, for the same reason everyone's currently getting upset about the effect of data centres on their electricity bills.
And note that I'm talking about the gap between them, not a time from today. Tesla's car-driving AI still has safety drivers/the owner in the driving seat; it is not a Level 4 system. For all the anecdotes about certain routes and locations where it works well, there are a lot of others where it fails.
That said: remote-control units without much AI are still economically useful, e.g. a factory in Texas staffed entirely by robots operated over a Starlink connection by a much cheaper team in Nairobi.
I appreciate you engaging, but I'm not sure how power would be the limiting factor. Assuming an average of 1kW of compute needed per robot (for reference, Tesla's AI4 is ~200W, rumours say 800W for AI5, and Nvidia's B200 is ~1kW), that's nothing compared to the amount of energy we use for locomotion (a car eats something like 20kW at 60mph).
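The ratio being claimed is easy to check with the comment's own rough figures:

```python
# Rough figures quoted upthread (Tesla AI4 ~200 W, rumoured ~800 W for AI5,
# Nvidia B200 ~1 kW, car ~20 kW at 60 mph) — not measured values.
compute_w = 1_000      # assumed per-robot compute draw
locomotion_w = 20_000  # rough car power draw at 60 mph

print(f"compute would be {compute_w / locomotion_w:.0%} of the power spent moving")
# → compute would be 5% of the power spent moving
```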
> Assuming an average of 1kW of compute needed per robot
1kW would be hell on the battery, and at the same time it would make the robot a space heater even while standing still, which in turn creates new problems if you want to replace all labour with them.
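A hedged sketch of the battery-drain point: the battery capacity and actuation draw below are my assumptions (a couple of kWh is the size of pack rumoured for current android prototypes); only the 1kW compute figure comes from the parent comment.

```python
battery_kwh = 2.3     # assumed onboard battery capacity
compute_kw = 1.0      # compute figure from the parent comment
actuation_kw = 0.5    # assumed average draw for walking/manipulation

# Runtime if the robot does nothing but think, vs. thinking while working
hours_compute_only = battery_kwh / compute_kw
hours_working = battery_kwh / (compute_kw + actuation_kw)

print(f"idle-but-thinking runtime: {hours_compute_only:.1f} h")  # → 2.3 h
print(f"working runtime: {hours_working:.1f} h")                 # → 1.5 h
```

Under those assumptions the compute alone halves the useful shift length, which is the "hell on the battery" point in miniature.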
Further, to my point about moving the compute out of the machine and mains-powering it: current global electricity supply and demand averages about 350W per person. We're already using all of that, including for industrial purposes.
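A quick sanity check of the per-person figure; the generation and population numbers below are rough public estimates I'm supplying, not figures from the thread:

```python
# Rough recent estimates (assumptions, not sourced in this thread):
global_generation_twh_per_year = 28_000  # world electricity generation
world_population = 8.1e9
hours_per_year = 8766  # average year length in hours

average_power_w = (global_generation_twh_per_year * 1e12
                   / hours_per_year / world_population)
print(f"~{average_power_w:.0f} W per person")
```

This comes out a bit under 400 W per person, the same ballpark as the ~350 W quoted above; the exact value depends on which year's generation estimate you pick.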
To see the effect of demand exceeding supply, observe that data centres were already starting to cause local grid problems while accounting for only 4-5% of the USA's national power use.
Even if the current literally-exponential growth of both PV and wind continues, it doesn't change my timelines. Even at 31% per year compounding growth for PV, and given what we're already doing with it without androids, building out enough electricity for androids takes long enough that we're unlikely to have the spare electricity to run an economically relevant number of them (say, the equivalent of 10% of the current labour force) before we improve both the compute hardware and the algorithmic efficiency of the software running on it.
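To put a number on the scale of that build-out, here's a hedged sketch. The labour-force size and current PV output are my own rough assumptions; the 1kW per robot and the 10%-of-labour-force threshold come from upthread.

```python
labour_force = 3.5e9                     # assumed global labour force
androids = 0.10 * labour_force           # "10% of the current labour force"
per_android_kw = 1.0                     # compute-only figure from upthread
extra_demand_gw = androids * per_android_kw / 1e6  # continuous GW needed

# Assumed: roughly 1,600 TWh/yr of PV generation today, i.e. ~180 GW average
current_pv_average_gw = 180
ratio = extra_demand_gw / current_pv_average_gw

print(f"android fleet needs ~{extra_demand_gw:.0f} GW continuous, "
      f"~{ratio:.1f}x all PV output today")
# → android fleet needs ~350 GW continuous, ~1.9x all PV output today
```

Even counting compute alone, the fleet would need roughly twice today's entire average solar output, on top of the demand growth that's already absorbing every year's new build.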