Comment by NitpickLawyer

12 hours ago

I don't think it's even a question. A 32B model will not compete with SotA for years to come (if ever). The idea behind this release is to fine-tune it on your codebase and compare against non-finetuned open models of the same class (or one class higher). So if you need local processing without access to SotA models (security, compliance, whatever), this is an interesting avenue for you, and the cost is fairly low. They are releasing the method so you can do this on your own codebase / docs / processes.