Comment by kkzz99

1 year ago

If the economics justify it, you can use a cheap or lower-end model to generate the meta information. Considering how cheap gpt-4o-mini is, that seems pretty plausible.

At my startup we also got pretty good results using 7B/8B models to generate meta information about chunks/parts of text.
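For what it's worth, a minimal sketch of what this can look like: building a chat-completion request that asks a cheap model for structured metadata about a chunk. The prompt wording, the metadata fields (`title`, `summary`, `keywords`), and the helper name are my own assumptions, not a known-good recipe; the same payload shape works against a local 7B/8B model served behind an OpenAI-compatible endpoint.

```python
import json

def build_metadata_request(chunk: str, model: str = "gpt-4o-mini") -> dict:
    """Build a chat-completion payload asking a cheap model to describe a chunk.

    Field names (title/summary/keywords) are illustrative assumptions.
    """
    prompt = (
        "Return a JSON object with keys 'title', 'summary', and 'keywords' "
        "describing the following text chunk:\n\n" + chunk
    )
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # Ask the API to constrain the reply to valid JSON.
        "response_format": {"type": "json_object"},
    }

# POST this payload to your chat-completions endpoint (OpenAI or a
# self-hosted 7B/8B model behind a compatible API) and store the parsed
# JSON alongside the chunk for retrieval.
payload = build_metadata_request("LLMs can enrich retrieval chunks cheaply.")
print(json.dumps(payload, indent=2))
```

The nice part is that the metadata generation is an offline batch job, so latency doesn't matter and you can route it to whatever model is cheapest.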