Comment by 2001zhaozhao

7 hours ago

I am seriously considering binge buying local AI inference hardware. The way this is going, there will be another big GPU crunch soon because everyone will need local models and/or open model inference capacity to do their programming tasks when the subsidized subscriptions are no longer flowing.

I just bought a 3090 off Amazon. It was expensive ($1,500ish), but it's covered under Prime with an included any-reason 90-day return policy.

That gives me a good while to determine whether it's worth it. I've heard good and bad, so here's hoping for good, or close to it.

I wasn't going to fork out $1,000 on the chance it might be enough with only a rough return strategy.