Comment by KaiserPro

2 months ago

It's not really a glass house.

Pike's main point is that training AI at that scale requires huge amounts of resources. Markov chains did not.

At the risk of being pedantic, it's not AI as such that requires massive resources: GPT-3.x was trained for a few million dollars. The jump to trillions being table stakes happened because everyone started using free services and there was just too much money in the hands of these tech companies. Among other things.

There are so many chickens coming home to roost here; LLMs were just the catalyst.

  • > it's not AI that requires massive resources

    No, it really is. If you took away training costs, OpenAI would be profitable.

    When I was at Meta they were putting in something like 300k GPUs in a massive shared-memory cluster just for training. I think they are planning to triple that, if not more.

  • Yeah, for some reason AI energy use is wildly overreported. A single ChatGPT query uses roughly two orders of magnitude less energy than toasting a slice of bread [1]. And you can eat bread untoasted, too, if you care about energy use.

    [1]: https://epoch.ai/gradient-updates/how-much-energy-does-chatg...

    • How many slices of toast are you making a day?

      If you fly a plane a millimeter, you're using less energy than making a slice of toast; would you also say that it's accurate that all global plane travel is more efficient than making toast?

      1 reply →
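The per-query vs. per-slice comparison, and the "totals matter" objection above, can be sketched with rough numbers. The ~0.3 Wh per query figure is the estimate from the linked Epoch AI post; the toaster wattage, toasting time, and daily query volume are illustrative assumptions, not measurements:

```python
# Back-of-envelope energy comparison (assumed numbers, for illustration only).
query_wh = 0.3            # ~0.3 Wh/query, per the linked Epoch AI estimate
toaster_watts = 1000      # assumed: a typical ~1 kW toaster
toast_minutes = 2         # assumed: ~2 minutes per slice
toast_wh = toaster_watts * toast_minutes / 60  # ~33 Wh per slice

ratio = toast_wh / query_wh
print(f"one slice of toast ~= {ratio:.0f} ChatGPT queries")  # ~111x

# The sibling comment's point: per-use efficiency says nothing about totals.
# At scale, cheap individual queries still add up, just as a millimetre of
# flight being cheap doesn't make global aviation cheap.
queries_per_day = 1e9     # assumed scale, purely illustrative
daily_mwh = queries_per_day * query_wh / 1e6
print(f"{queries_per_day:.0e} queries/day ~= {daily_mwh:.0f} MWh/day")
```

Both things can be true at once: a single query is two orders of magnitude cheaper than a slice of toast, and the aggregate still lands in the hundreds of MWh per day under these assumptions.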