Comment by TeMPOraL
2 days ago
> Like play out AI, it sucks for everybody except the ones holding the steering wheel
Not true. Models don't make their owners money sitting there doing nothing - they only get paid when people find value in what the AI produces for them. The business model of AI companies is actually almost uniquely honest compared to the rest of the software industry: they rent you a tool that produces value for you. No enshittification, no dark patterns, no taking your data hostage, no turning what should've been a product into a service. Just a straightforward exchange of money for value.
So no, it doesn't suck for everyone except them. It only sucks for existing businesses that find themselves in competition with LLMs. Which, true, is most of the software industry, but that's just what happens when a major technological breakthrough is achieved. Electricity, the Internet, and the internal combustion engine did the same thing to many past industries, too.
> they only get paid when people find value in what AI is producing for them
The people "finding value in them" are other people with money to throw at businesses: investors, capital firms, boards & c suites. I'm not sure anybody who has been laid off because their job got automated away is "finding value" in an LLM. There's a handful of scrappy people trying to pump out claude-driven startups but if one person can solo it, obviously a giant tech company can compete.
> No dark patterns
https://www.palantir.com/
> No enshittification
https://google.com
https://bing.com
> No taking your data hostage
*blank stare* They're not taking my data hostage, but they're sure as shit taking my data.
I think we just fundamentally disagree on all of this. You may be right, and I hope you are. I go back and forth on whether it's going to be a gentle transition or a miserable one. My money is on the latter.
> https://www.palantir.com/
What's the dark pattern here, exactly? "Selling military-adjacent stuff with an edgy vibe" isn't what's meant by a dark pattern.
In the sense that a dark pattern is anything designed to trick people into doing something they didn't necessarily consciously want to do, the entire AI industry is an oligarch's wet dream of a dark pattern: every day we're teeing them up with latent information on human-level patterns of control that I promise you LLM providers are foaming at the mouth to replicate. Like if you've got an effective "doing" system and an effective "orchestrating" system, that's AGI. Deployed at scale, at competitive cost, and with even a 1.1x improvement over a regular workforce, that's game over for anybody but billionaires. There will be a slow, dynamic deplatforming of regular people, followed by an extermination. Palantir is building the rat poison and the maid service.
> The people "finding value in them" are other people with money to throw at businesses: investors, capital firms, boards & c suites. I'm not sure anybody who has been laid off because their job got automated away is "finding value" in an LLM.
And the millions with ChatGPT (and other LLM) subscriptions, using them for anything from for-profit and non-profit work to hobby projects and all kinds of personal matters.
Contrary to a very popular belief in tech circles, AI is not only about investors. It's a real technology affecting real people in the real world.
In fact, I personally don't give a damn about investors here, and I laugh at the "AI bubble" complaints. Yes, it's a bubble, but that's totally irrelevant to whether the technology is useful. Investors may go bankrupt, but the technology will stay. See, e.g., the history of rail in the United States - everyone who fronted capital to lay down rail lines lost their shirt, but the hardware remained, and people (including subsequent generations of businesses) put it to good use.