Comment by kkzz99
7 months ago
If the economics justify it, you can use a cheap or lower-end model to generate the meta information. Considering how cheap gpt-4o-mini is, that seems pretty plausible.
At my startup we also got pretty good results using 7B/8B models to generate meta information about chunks/parts of text.
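A minimal sketch of what this could look like, assuming the OpenAI Python SDK (v1+), gpt-4o-mini as the cheap model, and that "meta information" means a short summary plus a few keywords per chunk (the prompt and helper names here are illustrative, not from the comment):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_chunk_metadata(chunk: str) -> str:
    """Ask a cheap model for a one-line summary and keywords for a text chunk."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": "Summarize the user's text in one sentence, "
                           "then list 3-5 keywords, comma-separated.",
            },
            {"role": "user", "content": chunk},
        ],
    )
    return response.choices[0].message.content

# Example: annotate each chunk before indexing it
chunks = ["First chunk of a document...", "Second chunk of a document..."]
for chunk in chunks:
    print(generate_chunk_metadata(chunk))
```

The same loop would work with a locally hosted 7B/8B model behind an OpenAI-compatible endpoint; only the `model` name and the client's `base_url` would change.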