
Comment by dgfitz

1 day ago

Nah, it’s not just you.

AI is really neat. I don’t understand how a business model that makes money pops out on the other end.

At least crypto cashed out on NFTs for a while.

> I don’t understand how a business model that makes money pops out on the other end

Tractors and farming.

By turning what is traditionally a labour intensive product into a capital intensive one.

For now, the farmers who own tractors will beat the farmers who need to hire, house and retain workers (or half a dozen children).

This goes well for quite some time, where you can have 3 people handle acres & acres.

I'll be around explaining how coffee beans can't be picked by a tractor or how vanilla can't be pollinated with it.

  • And I'll be around explaining why it's a bad idea to stockpile $X00,000,000 worth of equipment in Colombia, where coffee grows readily.

    Capital intensive industries require low crime and geopolitical stability. Strongman politics means that investors who buy such equipment will simply be robbed at literal gunpoint by local gangs.

  • I may be mistaken, but I was under the impression that, largely, farmers do not own their equipment. They lease it, and it costs a lot.

    Edit: Also, 3 people can handle 100 acres of land, given the crop. That happens today.

> I don’t understand how a business model that makes money pops out on the other end.

What issues do you see?

I pay for ChatGPT and for cursor and to me that's money very well spent.

I imagine tools like cursor will become common for other text intensive industries, like law, soon.

Agreed that the hype can be over the top, but these are valuable productivity tools, so I have some trouble understanding where you're coming from.

  • I feel like the raw numbers kind of indicate that the amount of money spent on training, salary, and overhead doesn't add up. "We'll beat them in volume" keeps jumping out at me.

  • Question is whether these companies are profitable off the services they're providing, or still being propped up by all the VC money pouring in.

  • What you're paying for ChatGPT is not likely covering their expenses, let alone making up their massive R&D investment. People paid for Sprig and Munchery too, but those companies went out of business. Obviously what they developed wasn't nearly as significant as what OpenAI has developed, but the question is: where will their pricing land once they need to turn a profit? It may well end up in a place where it's not worth paying ChatGPT to do most of the things it would be transformative for at its current price.

    [1]: https://www.fooddive.com/news/sprig-is-the-latest-meal-deliv...

    [2]: https://techcrunch.com/2019/01/21/munchery-shuts-down/?gucco...

    • Looking at history, anything in its first few iterations costs an insane amount and either stays a luxury or is sold at a massive loss. Once the research goes on for several years, the costs come down, first very slowly and then in an avalanche. The question is always who can keep "selling at a loss" long enough to reach the point where costs keep falling while people are used to paying the standard price (see smartphones), or where the product is so market-dominant that the competition lacks the resources to compete and prices can be raised (see Netflix).

      1 reply →

good point about the business model. AI probably has more of one, even if the ones reaping the rewards are only 4 or 5 big corps.

It seems with crypto the business "benefits" were mostly adversarial (the winners were those doing crimes on the darknet, or ransomware operators getting paid). The underlying blockchain tech itself failed to replace transactions in a database.

The main value of AI today seems to be generative tech for improving the quality of deepfakes, or for helping everyone in business write their communication in an even more "neutral", non-human voice, free of any emotion, almost psychopathic. Like the dudes writing about their achievements on LinkedIn in the third person... only now it's psychopathy enabled by the machine.

Also, I've seen people who, without AI, are barely literate now sending emails that look like they've been penned by a post-doc in English literature. The result is that it's becoming a lot harder to separate the morons and knuckle-draggers from those who are worth reaching out and talking to.

yes old man yelling at cloud.

  • +1. The other concern is that AI is potentially removing junior level jobs that are often a way for people to gain experience before stepping up into positions with more agency and autonomy. Which means in future we will be dealing with the next generation of "AI told me to do this", but "I have no experience to know whether this is good or not", so "let's do it".

    • On the contrary, junior positions will grow, because you no longer have to fear juniors being a liability for a while: with very little guidance they can get stuff done, while seniors keep an eye on bigger things and have free time instead of being spammed by n00b queries.

      Also, given the saturation of STEM graduates now, you have a proven group of juniors who can learn on their own, versus the expensive lottery of bootcampers who might bail the moment it's no longer surface-level React anymore.

      With AI, more tiny businesses can launch into the market and hire juniors plus part-time expertise to guide the product slowly, without the massive VC money requirement.

Crypto is coming back for another heist. Will probably die a bit once Trump finishes his term