
Comment by blindriver

1 day ago

> Maybe take the word of domain experts rather than AI company marketing teams.

Appeal to authority is a well known logical fallacy.

I know how dead NLP is personally because I’ve never been able to get NLP working, but once ChatGPT came around I was able to classify texts extremely easily. It’s transformational.

I was able to get ChatGPT to classify posts by how political they were on a scale of 1 to 10 and which political leaning they had, and then infer each person’s likely political affiliation.

All of this without needing to learn any APIs or anything about NLPs. Sorry, but given my experience, NLPs are dead in the water right now, except in terms of cost. And cost will go down exponentially, as it always does. Right now I’m waiting for the RTX 5090 so I can just do it myself with an open-source LLM.
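
For concreteness, this kind of LLM-based classification boils down to one prompt per post. A minimal sketch, assuming the OpenAI Python client (openai>=1.0); the model name and prompt wording are illustrative:

```python
# Minimal sketch of LLM-based text classification, assuming the OpenAI
# Python client (openai>=1.0); the model name and prompt are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def rate_post(text: str) -> str:
    """Ask the model for a 1-10 'how political' score plus a leaning label."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": (
                    "Rate how political the following post is on a scale of 1 to 10, "
                    "then name its likely political leaning. Reply as 'score, leaning'."
                ),
            },
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

print(rate_post("Taxes should fund more public transit, not another stadium."))
```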

> NLPs are dead in the water right now, except in terms of cost.

False.

With all due respect, the fact that you're referring to natural language processing as "NLPs" makes me question whether you have any experience with or even modest knowledge of this topic, so it's rather bold of you to make such sweeping generalizations.

It works for your use case because you're just one person running it on your home computer with consumer hardware. Some of us have to run NLP-related processing (POS taggers, keyword extraction, etc.) in a professional environment at tremendous scale, and reaching for an LLM would absolutely kill our performance.
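
For a sense of the scale argument, a classic pipeline runs this kind of processing locally and in batches. A minimal sketch, assuming spaCy and its small English model (en_core_web_sm) are installed; the sample posts are made up:

```python
# Minimal sketch of classic NLP at batch scale, assuming spaCy and its
# small English model (en_core_web_sm) are installed; sample posts are made up.
import spacy

nlp = spacy.load("en_core_web_sm", disable=["ner"])  # load only what we need

posts = [
    "The senator proposed a new tax bill yesterday.",
    "I just adopted the cutest kitten.",
]

# nlp.pipe streams documents in batches, which is what keeps this cheap
# enough to run over millions of posts on ordinary CPUs.
for doc in nlp.pipe(posts, batch_size=1000):
    pos_tags = [(tok.text, tok.pos_) for tok in doc if not tok.is_punct]
    keywords = [chunk.text for chunk in doc.noun_chunks]  # crude keyword extraction
    print(pos_tags, keywords)
```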

  • My understanding is that inference models can absolutely scale down, we're only at the beginning of them being miniaturized, and they are trivial to parallelize. That's not a good combination to bet against: their price will quickly drop while their performance and efficiency grow.

Performance and cost are trade-offs though. You could just as well say that LLMs are dead in the water, except in terms of performance.

It does seem likely we’ll soon have cheap enough LLM inference to displace traditional NLP entirely, although not quite yet.

> Appeal to authority is a well known logical fallacy.

I did not make an appeal to authority. I made an appeal to expertise.

It’s why you’d trust a doctor’s medical opinion over a child’s.

I’m not saying “listen to this guy because they’re the captain of NLP.” I’m saying listen because experts have years of hands-on experience with things like getting NLP working at all.

> I know how dead NLP is personally because I’ve never been able to get NLP working

So you’re not an expert in the field. Barely know anything about it, but you’re okay hand-waving away expertise because you got a toy NLP demo working…

That’s great, dude.

> I was able to get ChatGPT to classify posts by how political they were on a scale of 1 to 10

And I know you didn’t compare the results against classic NLP to see if there were any improvements, because you don’t know how…

  • > I did not make an appeal to authority. I made an appeal to expertise.

    Lol

    > I’m saying listen because experts have years of hands-on experience with things like getting NLP working at all.

    “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

    Upton Sinclair

    > Barely know anything about it, but you’re okay hand-waving away expertise because you got a toy NLP demo working…

    Yes that’s my point. I don’t know anything about implementing an NLP but got something that works pretty well using an LLM extremely quickly and easily.

    > And I know you didn’t compare the results against classic NLP to see if there were any improvements, because you don’t know how…

    Do you cross reference all your Google searches to make sure they are giving you the best results vs Bing and DDG?

    Do you cross reference the results from your NLP with LLMs to see if there were any improvements?

    • > Lol

      Great argument

      > “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

      NLP professionals are also LLM professionals. LLMs are tools in an NLP toolkit. LLMs don’t make the NLP professional obsolete the way they make handwritten spam obsolete.

      I was going to explain this further but you literally wouldn’t understand.

      > Do you cross reference all your Google searches to make sure they are giving you the best results vs Bing and DDG?

      …Yes I do…

      That’s why I cancelled my Kagi subscription. It was just as good as DDG.

      > Do you cross reference the results from your NLP with LLMs to see if there were any improvements?

      Yes I do… because I want to use the best tool for the job. Not just the first one I was able to get working…

I’ve never understood these kinds of use cases. How do you validate the score that the LLM gives?

  • The same way you validate scores given by NLPs, I assume. You run various tests, look at the results, and see if they match what you would expect.
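
Concretely, "run various tests" usually means scoring a hand-labeled sample with each approach and measuring agreement. A minimal sketch using only the standard library; the labeled sample is made up and llm_score is a hypothetical stand-in for an LLM-backed scorer:

```python
# Minimal sketch of validating model-assigned scores against hand labels.
# The labeled sample is made up, and llm_score is a hypothetical stand-in
# for an LLM-backed scorer being compared against a baseline.
from statistics import correlation, mean

labeled_sample = [          # (post, human-assigned 1-10 "how political" score)
    ("Vote for the infrastructure bill!", 9),
    ("My cat knocked over a plant.", 1),
    ("Both parties are ignoring housing costs.", 7),
]

def keyword_score(text: str) -> int:
    """Toy baseline: count political keywords and clip to the 1-10 range."""
    keywords = {"vote", "bill", "parties", "senator", "tax"}
    hits = sum(word.strip("!.,").lower() in keywords for word in text.split())
    return min(10, max(1, 1 + 3 * hits))

def evaluate(score_fn) -> None:
    """Compare a scorer's output to the human labels on the same sample."""
    human = [label for _, label in labeled_sample]
    model = [score_fn(text) for text, _ in labeled_sample]
    mae = mean(abs(h - m) for h, m in zip(human, model))
    rho = correlation(human, model)  # Pearson correlation, Python 3.10+
    print(f"MAE={mae:.2f}  correlation={rho:.2f}")

evaluate(keyword_score)
# evaluate(llm_score)  # hypothetical LLM-backed scorer to compare against
```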