Comment by oliwary
1 day ago
This article seems to be paywalled unfortunately. While LLMs are very useful when the task is complex and/or there is little training data, I still think traditional NLP pipelines have an important role to play, for example when:
- The task and the required results are simple enough that an SVM or a fine-tuned BERT model is sufficient; these need far fewer resources, especially if a lot of training data is available. Training such models on LLM outputs could also be an interesting way to get there (see the sketch after this list).
- When resources are constrained or latency is important.
- The labeled classes have no semantic connection between them, so explaining the classes to an LLM in a prompt could be tricky.
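
To illustrate the distillation idea from the first point, here is a minimal sketch of training a cheap classifier on LLM-produced labels. It assumes a hypothetical file `llm_labeled.csv` with `text` and `label` columns, where the labels were generated by prompting an LLM over unlabeled documents; the exact file and column names are placeholders, not anything from the article.

```python
# Sketch: distill LLM-generated labels into a lightweight TF-IDF + linear SVM classifier.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.metrics import classification_report

# Hypothetical dataset: "text" column plus "label" column produced by an LLM.
df = pd.read_csv("llm_labeled.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42
)

# TF-IDF features + linear SVM: cheap to train and fast at inference time.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),
    LinearSVC(),
)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test)))
```

Once trained, the small model serves predictions with much lower latency and cost than calling an LLM per document, at the price of inheriting whatever noise is in the LLM-generated labels.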
> This article seems to be paywalled unfortunately.
I am no fan of Medium's paywalled articles, but if it helps, here's the article on archive: https://archive.is/J53CE