Comment by darioush
20 days ago
Yes, and the open source models + local inference are progressing rapidly. This whole API idea is kind of limited by the fact that you need to round-trip to a datacenter + trust someone with all your data.
Imagine when OpenAI has their 23&me moment in 2050 and a judge rules all your queries since 2023 are for sale to the highest bidder.
It doesn't need to wait until 2050. The queries would be for sale as soon as they stop providing a competitive advantage.
Even worse for these LLM-as-a-service companies is that the utility of open source LLMs largely comes down to customization: you can get a lot of mileage by restricting token output, varying temperature, and lightly retraining them for specific applications.
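To make the decoding-time part of that concrete, here's a minimal sketch using Hugging Face `transformers`. The model name and settings are purely illustrative (retraining/fine-tuning is out of scope for a comment):

```python
# Minimal sketch of decoding-time customization with a local model.
# "gpt2" is just a small stand-in for whatever model you actually run.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tok("Sentiment (positive/negative): great product!", return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=3,              # restrict token output: force a terse answer
    do_sample=True,
    temperature=0.2,               # low temperature: near-deterministic labels
    pad_token_id=tok.eos_token_id, # gpt2 has no pad token; reuse EOS
)
print(tok.decode(out[0][inputs["input_ids"].shape[1]:]))
```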
The use-cases for LLMs seem unexplored beyond basic chatbot stuff.
I'm surprised at how rarely their utility for turning unstructured data into structured data gets discussed, even allowing for some margin of error. It doesn't take an especially large model to accomplish it, either.
I would think entire industries could re-form around using an LLM as a first pass over data, with software and/or human error checking, at a significant cost reduction over previous strategies.
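A toy first-pass sketch of that idea; `llm` is a hypothetical stand-in for any local model call (not a real API), and the invoice fields are made up:

```python
import json

def llm(prompt: str) -> str:
    # Hypothetical stand-in for any local model call (llama.cpp,
    # transformers, ...); returns a canned response so the sketch runs.
    return '{"vendor": "Acme Corp", "total": 1234.50, "due_date": "2024-07-01"}'

PROMPT = ("Extract vendor, total, and due_date from the text below. "
          "Reply with JSON only.\n\n{text}")

text = "Invoice from Acme Corp for $1,234.50, due July 1, 2024."
record = json.loads(llm(PROMPT.format(text=text)))  # first pass: text -> record
print(record)
```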
The software-based second pass is where the most value lies (and the hard problems).
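A sketch of what that second pass might look like on the toy record above; the checks and field names are illustrative assumptions, not a real schema:

```python
# Software second pass over the extracted record: cheap, deterministic
# checks, with failures routed to human review.
from datetime import date

def validate(record: dict) -> list[str]:
    errors = []
    if not isinstance(record.get("vendor"), str) or not record["vendor"].strip():
        errors.append("missing vendor")
    total = record.get("total")
    if not isinstance(total, (int, float)) or total <= 0:
        errors.append(f"implausible total: {total!r}")
    try:
        date.fromisoformat(record.get("due_date") or "")
    except ValueError:
        errors.append(f"bad due_date: {record.get('due_date')!r}")
    return errors

record = {"vendor": "Acme Corp", "total": 1234.50, "due_date": "2024-07-01"}
errors = validate(record)
print("route to human review:" if errors else "accept", errors)
```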