Comment by mejutoco
16 days ago
> and every single week the results were slightly different.
This is one of the reasons why open source offline models will always be part of the solution, if not the whole solution.
16 days ago
> This is one of the reasons why open source offline models will always be part of the solution, if not the whole solution.
Inconsistency comes from scaling: if you are optimizing your infra to be cost-effective, you will arrive at the same tradeoffs, and if you are not, you won't. Not saying it isn't nice to be able to make some of those decisions on your own, but if you're picking LLMs for simplicity, we are years away from running your own model being in the same league for most people.
You can decide whether or not to change your local setup; you cannot decide the same for a service. There is nothing inevitable about inconsistency in a local setup.
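For example, you can pin both the exact weights and the decoding strategy, so nothing changes unless you change it. A minimal sketch assuming a transformers-based local stack (the repo id and commit hash below are placeholders, not real models):

    # Minimal sketch: pin an exact model snapshot and decode greedily,
    # so the local setup only changes when you change it.
    from huggingface_hub import snapshot_download
    from transformers import AutoModelForCausalLM, AutoTokenizer

    path = snapshot_download(
        repo_id="example-org/example-7b-instruct",  # placeholder repo id
        revision="0123abcd",  # placeholder commit hash pinning the weights
    )

    tokenizer = AutoTokenizer.from_pretrained(path)
    model = AutoModelForCausalLM.from_pretrained(path)

    inputs = tokenizer("Summarize this changelog:", return_tensors="pt")
    # Greedy decoding (do_sample=False) removes sampling randomness.
    out = model.generate(**inputs, do_sample=False, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))

Outputs can still drift slightly across hardware or library versions (floating-point nondeterminism), but that drift is under your control, unlike a hosted API swapping models underneath you.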