
Comment by blindriver

1 day ago

That’s sort of like asking a horse and buggy driver whether automobiles are going to put them out of business.

I think for the most part, casual NLP is dead because of LLMs. And LLM costs are going to plummet soon, so the large-scale NLP you’re talking about is probably dead within 5 years or less. The fact that you can replace programmers with prompts is huge in my opinion: no one needs to learn an NLP API anymore, just stuff the text into a prompt. Once the cost of running LLMs drops to the cost of programmers, it’s game over.

> LLM costs

Inference costs, not training costs.

> The fact that you can replace programmers

You can’t… not for any real project. For quick mockups they’re serviceable.

> That’s sort of like asking a horse and buggy driver whether automobiles

Kind of an insult to OP, no? Horse and buggy drivers were not highly educated experts in their field.

Maybe take the word of domain experts rather than AI company marketing teams.

  • > Inference costs, not training costs.

    Why does training cost matter if you have a general intelligence that can do the task for you, that’s getting cheaper to run the task on?

    > for quick mockups they’re serviceable

    I know multiple startups that use LLMs as their core bread-and-butter intelligence platform instead of tuned but traditional NLP models

    > take the word of domain experts

    I guess? I wouldn’t call myself an expert by any means but I’ve been working on NLP problems for about 5 years. Most people I know in NLP-adjacent fields have converged around LLMs being good for most (but obviously not all) problems.

    > kind of an insult

    Depends on whether you think OP intended to offend, ig

    • > Why does training cost matter if you have a general intelligence that can do the task for you, that’s getting cheaper to run the task on?

      Assuming we didn’t need to train it ever again, it wouldn’t. But we don’t have that, so…

      > I know multiple startups that use LLMs as their core bread-and-butter intelligence platform instead of tuned but traditional NLP models

      Okay? Did that system write itself entirely? Did it replace the programmers that actually made it?

      If so, they should pivot into a Devin competitor.

      > Most people I know in NLP-adjacent fields have converged around LLMs being good for most (but obviously not all) problems.

      Yeah, LLMs are quite good at common NLP tasks, but AFAIK are not SOTA at any specific task.

      Either way, LLMs obviously don’t kill the need for the NLP field.

  • The reply didn’t say that the expert is uneducated, just that their tool is obsolete. Better to look at facts the way they are; sugar-coating doesn’t serve anyone.

  • > Maybe take the word of domain experts rather than AI company marketing teams.

    Appeal to authority is a well known logical fallacy.

    I know how dead NLP is personally because I’ve never been able to get NLP working, but once ChatGPT came around, I was able to classify texts extremely easily. It’s transformational.

    I was able to get ChatGPT to classify posts based on how political they were on a scale of 1 to 10 and which political leaning they had, and then infer the person’s likely political affiliation.

    All of this without needing to learn any APIs or anything about NLPs. Sorry, but given my experience, NLPs are dead in the water right now, except in terms of cost. And costs will go down exponentially, as they always do. Right now I’m waiting for the RTX 5090 so I can just do it myself with an open-source LLM.
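
    Concretely, the workflow is roughly the following (a minimal sketch using the OpenAI Python SDK rather than the chat UI; the model name, prompt wording, and example post are illustrative, not my exact setup):

      # Minimal sketch: zero-shot "how political is this post?" classification
      # through a chat-completions API. Assumes OPENAI_API_KEY is set in the environment.
      from openai import OpenAI

      client = OpenAI()

      PROMPT = (
          "Rate how political the following post is on a scale of 1 to 10, "
          "then guess its political leaning (left, right, or center). "
          "Answer as: score=<n>, leaning=<label>.\n\nPost: {post}"
      )

      def classify(post: str) -> str:
          resp = client.chat.completions.create(
              model="gpt-4o-mini",  # illustrative model choice
              messages=[{"role": "user", "content": PROMPT.format(post=post)}],
              temperature=0,
          )
          return resp.choices[0].message.content

      print(classify("We need to raise taxes on the rich to fund our schools."))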

    • > NLPs are dead in the water right now, except in terms of cost.

      False.

      With all due respect, the fact that you’re referring to natural language processing as "NLPs" makes me question whether you have any experience with, or even modest knowledge of, this topic, so it’s rather bold of you to make such sweeping generalizations.

      It works for your use case because you’re just one person running it on your home computer with consumer hardware. Some of us have to run NLP-related processing (POS taggers, keyword extraction, etc.) in a professional environment at tremendous scale, and reaching for an LLM would absolutely kill our performance.
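
      To make that concrete, the kind of per-document work I mean looks roughly like this (a minimal sketch with spaCy; the model choice, batch size, and sample texts are illustrative):

        # Minimal sketch: lightweight POS tagging / keyword extraction with spaCy.
        # Requires: pip install spacy && python -m spacy download en_core_web_sm
        import spacy

        nlp = spacy.load("en_core_web_sm", disable=["ner"])  # keep the pipeline lean

        texts = ["The new tax bill passed the senate late last night."] * 10_000

        # nlp.pipe streams documents in batches, which is what keeps this cheap
        # at scale compared with one LLM call per document.
        for doc in nlp.pipe(texts, batch_size=1000):
            pos_tags = [(tok.text, tok.pos_) for tok in doc]
            keywords = [chunk.text for chunk in doc.noun_chunks]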


    • “I couldn’t be bothered learning something, and now I don’t have to! Checkmate!”

      While LLMs can have their uses, let’s not get carried away.


    • Performance and cost are trade-offs though. You could just as well say that LLMs are dead in the water, except in terms of performance.

      It does seem likely we’ll soon have cheap enough LLM inference to displace traditional NLP entirely, although not quite yet.

    • > Appeal to authority is a well known logical fallacy.

      I did not make an appeal to authority. I made an appeal to expertise.

      It’s why you’d trust a doctor’s medical opinion over a child’s.

      I’m not saying “listen to this guy because they’re the captain of NLP”; I’m saying listen because experts have years of hands-on experience with things like getting NLP working at all.

      > I know how dead NLP is personally because I’ve never been able to get NLP working

      So you’re not an expert in the field. You barely know anything about it, but you’re okay with hand-waving away expertise because you got a toy NLP demo working…

      That’s great, dude.

      > I was able to get ChatGPT to classify posts based on how political they were on a scale of 1 to 10

      And I know you didn’t compare the results against classic NLP to see if there was any improvement, because you don’t know how…
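
      A comparison doesn’t have to be fancy; even a quick check like this sketch would do (the label lists here are made up, just to show the shape of it):

        # Minimal sketch: score LLM labels and a classic baseline against the
        # same hand-labeled sample before declaring either approach "dead".
        from sklearn.metrics import accuracy_score, confusion_matrix

        gold = ["left", "right", "center", "left"]      # hand-labeled sample (made up)
        llm = ["left", "right", "left", "left"]         # labels from the prompt-based classifier
        baseline = ["left", "right", "center", "right"] # labels from a classic trained model

        print("LLM accuracy:     ", accuracy_score(gold, llm))
        print("Baseline accuracy:", accuracy_score(gold, baseline))
        print(confusion_matrix(gold, llm, labels=["left", "center", "right"]))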


> The fact that you can replace programmers with prompts

No, you can’t. The only thing LLMs replace is internet commentators.

  • As I explained below, I avoided having to learn anything about ML, PyTorch, or any other APIs when trying to classify posts based on how political they were and which affiliation they leaned toward. That was what was holding me back, and it was easily replaced by an LLM and a prompt. It literally took me minutes to do what would have taken days or weeks, and the results are more than good enough.

    • > what would have taken days or weeks

      Nah, searching Stack Overflow and GitHub doesn’t take "weeks".

      That said, due to how utterly broken internet search is nowadays, using an LLM as a search engine proxy is viable.

    • GPT-3.5 is more accurate at classifying tweets as liberal than it is at identifying posts that are conservative.

      If you’re going for a rough approximation, LLMs are great, and good enough. More care and conventional ML methods are appropriate as the stakes increase, though.
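
      For reference, the conventional route isn’t much code either; here’s a minimal sketch (the training texts and labels are made up, and TF-IDF plus logistic regression is just one reasonable default):

        # Minimal sketch: a classic supervised baseline for political-leaning
        # classification, trained on a small hand-labeled set.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        texts = [
            "Cut regulations and lower taxes to grow the economy.",
            "We need universal healthcare and a higher minimum wage.",
            "The city council meets on Tuesday to discuss parking permits.",
        ]
        labels = ["right", "left", "center"]  # hand-labeled (made-up examples)

        model = make_pipeline(
            TfidfVectorizer(ngram_range=(1, 2)),
            LogisticRegression(max_iter=1000),
        )
        model.fit(texts, labels)

        print(model.predict(["Abolish the estate tax."]))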


> The fact that you can replace programmers with prompts

This is how you end up with thousands of lines of slop with no idea how any of it works.