
Comment by alexpotato

4 hours ago

I was just chatting with a co-worker who wanted to run an LLM locally to classify a bunch of text. He was worried about spending too many tokens, though.

I asked him why he didn't just have the LLM build him a classifier with a Python ML library instead.

LLMs are great, but you can also build supporting tools so that:

- you use fewer tokens

- it's deterministic

- you as the human can also use the tools

- it's faster b/c the LLM isn't "shamboozling" every time you need to do the same task.
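To make the idea concrete, here's a minimal sketch of the kind of classifier an LLM could generate once and hand back to you. It assumes scikit-learn (any Python ML library would do), and the texts/labels are toy placeholders standing in for the real data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled examples -- in practice you'd label a small sample by hand
# (or have the LLM label it once) and train on that.
texts = [
    "refund my order", "where is my package",
    "great product, love it", "five stars, would buy again",
]
labels = ["support", "support", "review", "review"]

# TF-IDF features + logistic regression: deterministic, fast, and
# classifies the remaining texts without spending any more tokens.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["my package never arrived"])[0])
```

After the one-time build, every future run of the same task is a millisecond-scale function call instead of another LLM round trip.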