Comment by intended
3 days ago
It is extreme, and on the vendor side. The OpenAI nonprofit-vs-profit saga was framed as profit seeking versus the future of humanity. People are talking about "programming 3.0."
I can appreciate that it’s other users who are saying it’s wrong, but that doesn’t address the point about ignoring the context.
Moreover, it’s unhelpful communication. It gives up on acknowledging a mutually shared context: the natural confusion that arises from the ambiguous, high-level hype versus the actual down-to-earth reality.
Even if you have found a way to make it work, someone can’t understand your workflow unless you connect the dots between their frame of reference and yours.
It really is. For example, here is a quote from AI 2027:
> By early 2030, the robot economy has filled up the old SEZs, the new SEZs, and large parts of the ocean. The only place left to go is the human-controlled areas. [...]
> The new decade dawns with Consensus-1’s robot servitors spreading throughout the solar system. By 2035, trillions of tons of planetary material have been launched into space and turned into rings of satellites orbiting the sun. The surface of the Earth has been reshaped into Agent-4’s version of utopia: datacenters, laboratories, particle colliders, and many other wondrous constructions doing enormously successful and impressive research.
This scenario prediction, co-authored by a former OpenAI researcher (now at the AI Futures Project), received almost a thousand upvotes here on HN and drew the attention of the NYT and other large media outlets.
If you read that and still don't believe the AI hype is _extreme_, then I really don't know what else to tell you.
--
https://news.ycombinator.com/item?id=43571851