Comment by pzo
4 days ago
This is actually positive even for devs. The more users have Ollama installed, the easier it is to release a desktop AI app for them without having to bundle additional models in your own app. It also becomes easier to offer such users a free or cheaper subscription, since you don't carry the extra inference costs yourself. The latest Qwen 30B models are really powerful.
It would be even better if there were an installation template that checks whether Ollama is installed and, if not, downloads it as a sub-installation, first checking the user's computer specs for enough RAM and a fast enough CPU/GPU. An API to prompt the user (ask for permission) to install a specific model if it hasn't been installed yet would help too.
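Something along these lines is what I have in mind (rough Python sketch only; the RAM threshold and the psutil dependency are my own assumptions, not part of any existing installer template):

```python
import shutil

# Hypothetical minimum RAM for running a large quantized model locally;
# tune this per the model your app actually ships with.
MIN_RAM_GB = 16

def ollama_installed() -> bool:
    """Check whether the `ollama` binary is on the user's PATH."""
    return shutil.which("ollama") is not None

def total_ram_gb() -> float:
    """Rough cross-platform RAM check (psutil is a third-party package)."""
    try:
        import psutil
        return psutil.virtual_memory().total / 1024**3
    except ImportError:
        return 0.0  # unknown -- treat as "cannot verify"

if __name__ == "__main__":
    if ollama_installed():
        print("Ollama found, skipping sub-installation.")
    elif total_ram_gb() >= MIN_RAM_GB:
        print("Ollama missing but specs look OK -- run the Ollama installer here.")
    else:
        print("Machine below minimum specs; fall back to a hosted API instead.")
```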
> It would be even better if there were an installation template that checks whether Ollama is installed and, if not, downloads it as a sub-installation..... An API to prompt the user (ask for permission) to install a specific model if it hasn't been installed yet would help too.
That's actually what we've done for our own app [1]. It checks whether Ollama and other dependencies are installed. No model is bundled with it. We prompt the user to install a model (you pick a model, click a button, and we download it; similarly if you wish to remove a model). The aim is to make it simple enough for non-technical folks to use.
[1] https://ai.nocommandline.com/
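For anyone curious, the install/remove flow can be driven through Ollama's local HTTP API along these lines (a simplified sketch, not our actual app code; the model tag is just an example):

```python
import json
import urllib.request

OLLAMA = "http://localhost:11434"  # default local Ollama endpoint

def installed_models() -> list[str]:
    """List models already pulled into the local Ollama instance."""
    with urllib.request.urlopen(f"{OLLAMA}/api/tags") as resp:
        return [m["name"] for m in json.load(resp)["models"]]

def pull_model(name: str) -> None:
    """Download a model; Ollama streams progress as JSON lines."""
    req = urllib.request.Request(
        f"{OLLAMA}/api/pull",
        data=json.dumps({"name": name}).encode(),
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:
            if line.strip():
                print(json.loads(line).get("status", ""))

def remove_model(name: str) -> None:
    """Delete a previously pulled model."""
    req = urllib.request.Request(
        f"{OLLAMA}/api/delete",
        data=json.dumps({"name": name}).encode(),
        method="DELETE",
    )
    urllib.request.urlopen(req).close()

if __name__ == "__main__":
    model = "qwen2.5:7b"  # example tag; pick whatever your app needs
    if model not in installed_models():
        # In a GUI app this would be a confirmation dialog instead.
        if input(f"Download {model}? [y/N] ").lower() == "y":
            pull_model(model)
```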