Comment by yowlingcat

2 years ago

Don't fall for the apocalyptic doomsday narrative. It is in the interests /of/ the 900-lb gorillas in the space to make it seem like it requires so much investment that you may as well not bother.

But there's a thriving community on HuggingFace and Reddit showing what you can do with the lo-fi versions. In particular, the evolution of lower-bit inference (and, I believe, training as well) has considerably reduced memory requirements, and with them hardware requirements. There is a lot you can do with your own local gen AI model running on your personal machine.
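
To give a rough sense of why lower-bit formats help so much, here's a back-of-the-envelope sketch in plain Python. The "7B" parameter count is an illustrative example, and this counts only the weights themselves; real quantized runtimes add some overhead for scale factors, activations, and the KV cache:

```python
# Rough memory-footprint estimate for model weights at various bit
# widths -- illustrative arithmetic only; ignores activation memory,
# KV cache, and quantization overhead (scales/zero-points).

def weight_memory_gb(n_params: float, bits: int) -> float:
    """Gigabytes needed to store n_params weights at the given
    bit width (1 GB = 2**30 bytes)."""
    return n_params * bits / 8 / 2**30

params_7b = 7e9  # a typical "7B" open model

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: {weight_memory_gb(params_7b, bits):5.1f} GB")
```

At 16-bit precision a 7B model's weights alone are around 13 GB, which rules out most consumer GPUs; at 4-bit they drop to roughly a quarter of that, which is why quantized models fit on ordinary personal machines.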