Comment by chaos_emergent

20 hours ago

> Inference costs, not training costs.

Why does training cost matter if you have a general intelligence that can do the task for you, and that's getting cheaper to run?

> for quick mockups they’re serviceable

I know multiple startups that use LLMs as their core bread-and-butter intelligence platform instead of tuned, traditional NLP models.

> take the word of domain experts

I guess? I wouldn’t call myself an expert by any means, but I’ve been working on NLP problems for about 5 years. Most people I know in NLP-adjacent fields have converged around LLMs being good for most (but obviously not all) problems.

> kind of an insult

Depends on whether you think OP intended to offend, I guess.

> Why does training cost matter if you have a general intelligence that can do the task for you, and that's getting cheaper to run?

Assuming we didn’t need to train it ever again, it wouldn’t. But we don’t have that, so…

> I know multiple startups that use LLMs as their core bread-and-butter intelligence platform instead of tuned, traditional NLP models.

Okay? Did that system write itself entirely? Did it replace the programmers who actually made it?

If so, they should pivot into a Devin competitor.

> Most people I know in NLP-adjacent fields have converged around LLMs being good for most (but obviously not all) problems.

Yeah, LLMs are quite good at common NLP tasks, but AFAIK they're not SOTA at any specific one.

Either way, LLMs obviously don’t kill the need for the NLP field.