Comment by throwaway81998
3 months ago
Serious question, not a "what's the point of this" shitpost... My experience with local LLMs is limited.
Just installed LM Studio on a new machine today (2025 Asus ROG Flow Z13, 96GB VRAM, running Linux). Haven't had the time to test it out yet.
Is there a reason for me to choose Gerbil instead? Or something else entirely?
Holy, your machine is a beast. 96GB of VRAM is pretty insane. I've been running a single 16GB VRAM AMD GPU. At the bottom of Gerbil's readme I listed out my setup, where I use a 27B text-gen model (Gemma 3), but you'll be able to use much larger models and everything will run super fast.
Now as for your question: I started out with LM Studio too, but the problem is that you'll need to juggle multiple apps if you want to do text gen and image gen, or if you want to use a custom front-end. As an example, my favorite text-gen front-end is Open WebUI, which Gerbil can automatically set up for you (as long as you have Python's uv pre-installed). Gerbil will let you run text, image, and video gen, as well as set up (and keep updated) any of the front-ends I listed in my original post. I could be wrong, but I'm not sure LM Studio can legally integrate GPL-licensed software the way Gerbil can, since it's a closed-source app.
Thanks for the reply, I'll give Gerbil a try.
Not OP, but I am running Ollama as a testing ground for various projects (separately from the gpt sub).
> Is there a reason for me to choose Gerbil instead? Or something else entirely?
My initial reaction is positive, because it seems to integrate everything without sacrificing the ability to customize further if need be. That said, I haven't tested it yet, but now I will.