True, but arguably better than "sorry, to use our website, you must have a ChatGPT subscription."
More like "you need to sign up for our website and pay for a subscription", and I'd much rather do that if it's actually providing value. I am absolutely not going to run a model locally that slowly churns out words at 5 tps while making the computer hot to the touch.
Also much better than every website wanting its own 22 GB rather than the 22 GB being a shared resource.
I would very much like not to have to download 22 GB for inference capability that is way worse than API calls in both quality and speed.
I would rather pay money than watch this thing run in my browser at only 5 tps on high-end consumer hardware.
That is ~9% of the total available disk space on baseline phones and laptops, for a model that is not that useful.