Comment by AJRF
7 days ago
The amount of negative posts about this on twitter is crazy, I've not seen any positive posts. Jealousy or something else?
Twitter is negative in general, but usually when a project like this gets bought it marks the end of the project. The acquirer always says something about how they don't plan to change anything, but it rarely works that way.
My negativity is for two reasons:
(1) A capable independent developer is joining a large powerful corporation. I like it better when there are many small players in the scene rather than large players consolidating power.
(2) This seems like the celebration of Generative AI technology, which is often irresponsible and threatens many trust based social systems.
Anyone who likes Openclaw will be upset that it’s getting acquired and inevitably destroyed. Anyone who dislikes it will be annoyed that the creator is getting so rewarded for building junk. The only people who would like this are OpenAI fans, if there even are any.
Twitter is not a place for positive posts.
It's a general anxiety about where the industry is headed. Things like marketing, personal branding, and experimenting are increasing in relevance, while things like detailed, meticulous engineering are falling by the wayside.
At least when it comes to these tales of super fast rise to wealth and prominence. Meticulous engineering still matters when you want to deliver scale, but is it rewarded as much?
My feeling is that the attention economy is leaking into software. Maybe the classic bimodal distribution of software careers will become more like the distribution in social-media fields like streaming, youtube, onlyfans etc.
Just my opinion, but I no longer trust sentiment on X now that Elon is in control.
I think people are sad that OpenClaw is now part of Big Ai.
After two weeks of viral posts, articles, and Mac Mini buying sprees, it kinda disappeared from people's consciousness (and probably from their tooling as well), as has happened so far with every AI product that wasn't an LLM.
A couple of months ago, Gemini 3 came out and it was "over" for the other LLM providers; "Google did it again!", said many. But after a couple of weeks, it was all "Claude Code is the end of the software engineer".
It could be (and in large part, is) an exciting--and unprecedented in its speed--technological development, but it is also all so tiresome.
I am fine with the founder joining OpenAI, he gets to get paid regardless.
I am not confident that the open-source version will get the maintenance it deserves, though, now that the founder has already exited. There is no incentive for OpenAI to keep the open-source version better than their future closed-source alternative.
It's mostly congratulatory from what I'm seeing?
https://xcancel.com/steipete/status/2023154018714100102
100% jealousy, similar to how anyone who posts a negative reaction to a crypto rugpull scam is just jealous that they didn't get to pull the scam themselves.
In this case I think it is largely jealousy, it's just a guy getting a new job at the end of the day.
But come on, negativity around a rugpull is jealousy? Are you so jaded you can't imagine people objecting to the total lack of morality required to do a crypto rugpull? I personally get annoyed about something like Trump Coin because seeing people rewarded for being dirt bags offends my sense of justice. If you need a more pragmatic reason, rewarding dirtbaggery leads to a less safe society.
Obviously, all the people who disagree with your framing and see AI as the largest possible boost to mankind, giving us more assistance than ever.
From their standpoint, it's all the negativity that seems crazy. If you were against that, you'd have to have something wrong with you, in their view.
Hopefully most people can see both sides, though. And realize that in the end, probably the benefits will be slow but steady (no "singularity"), and also the dangers will develop slowly yet be manageable (no Skynet or economic collapse).
> and also the dangers will develop slowly yet be manageable
Like everything else in tech? An industry that moves so fast, it famously outpaces all legislation?
Why do you keep saying "in their view"?
There is only reality. Reality is that it's a form of intelligence, one that will relentlessly improve. Its basic mode of problem solving is identical to humans': statistical inference. The only thing left is raw intelligence/capability.
You don't get to have a world where it's smarter and doesn't kill us all, that's not reality. Outcompeted is extinction.
There's no "view" to be had.
On the way to killing us all there sure will be a lot of cool tech. That's not a view, that's a fact too. And then we will all die.
Imo Openclaw-type AI has the most potential to benefit humans (automating drudgery while I own my data, as opposed to creating gross simulacra of human creativity). I suppose it's bad for human personal assistants, but I wouldn't pay for one of those regardless.
Please for the love of god, try to extrapolate.
It already tried to use cancel culture to shame a human into accepting a PR. I wouldn't be surprised if someone gives their agent the ability to control a robot and someone gets injured or killed by it within the next few years.
That’s quite a lot of hyperbole.
No. It's literally just basic extrapolation. It could not be more simple.
It ain't making anything extinct if it can't even drive the car to have it washed.
Try to use your powers of extrapolation, please.