Comment by neilv

16 hours ago

True, but I think there's a kitchen-table opportunity in applications that don't need big training runs and that have tractable inference requirements.

The challenges I see are: (1) there's a lot of competition in the gold rush; (2) there's a lot of noise from AI-slop implementations, including from anyone who sees your demo.

You can also fine-tune LLMs, and for that you don't need big money. Or you can pick up an already fine-tuned LLM and improve it from there for your use case.