
Comment by lone-cloud

3 months ago

Holy, your machine is a beast. 96GB of VRAM is pretty insane. I've been running on a single AMD GPU with 16GB of VRAM. At the bottom of Gerbil's readme I listed my setup, where I use a 27B text gen model (Gemma 3), but you'll be able to use much larger models and everything will run super fast.

Now, as for your question: I started out with LM Studio too, but the problem is that you end up juggling multiple apps if you want to do text gen and image gen, or if you want to use a custom front-end. As an example, my favorite text gen front-end is Open WebUI, which Gerbil can automatically set up for you (as long as you have Python's uv pre-installed). Gerbil lets you run text, image and video gen, and it can set up (and keep updated) any of the front-ends I listed in my original post. I could be wrong, but I'm not sure LM Studio can legally integrate GPL-licensed software the same way Gerbil can, since it's a closed-source app.
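
For reference, here's roughly what the manual Open WebUI setup looks like if you don't let Gerbil handle it. This is just a sketch based on the standard uv installer and the Open WebUI quick-start command, not necessarily the exact steps Gerbil runs under the hood:

```bash
# Install uv (Astral's Python package manager) if it isn't already on your system
curl -LsSf https://astral.sh/uv/install.sh | sh

# Launch Open WebUI in an isolated environment via uvx (no manual venv needed).
# By default the UI is served on http://localhost:8080
uvx --python 3.11 open-webui@latest serve
```

Gerbil basically automates that for you and also keeps the front-end updated between releases.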