Comment by ivape (7 months ago):
What is the default BrowserOS model? Is it local, and if so, what inferencing server are you using?
Reply by felarof (7 months ago):
The default model is Gemini. You can bring your own API keys and change the default to any model you like. Or better yet, run a model locally with Ollama and use that!
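For the local option, here is a minimal sketch of talking to a locally running Ollama server from a client, assuming Ollama's default port (11434) and its `/api/chat` endpoint; the model name `llama3` is illustrative, and BrowserOS's actual configuration keys are not shown.

```python
# Minimal sketch: querying a local Ollama instance directly.
# Assumes Ollama's default host/port and /api/chat endpoint;
# the model name "llama3" is just an example.
import json
import urllib.request


def build_chat_request(prompt, model="llama3", host="http://localhost:11434"):
    """Build an Ollama chat request (URL + JSON body) without sending it."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single complete response, not a stream
    }
    return f"{host}/api/chat", json.dumps(body).encode("utf-8")


def ask_local_model(prompt, model="llama3"):
    """Send the request to a local Ollama server and return the reply text."""
    url, payload = build_chat_request(prompt, model)
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Since everything stays on localhost, no API key ever leaves the machine.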
Reply by ivape (7 months ago):
So the default is a remote Gemini call?
Reply by felarof (7 months ago):
Yes, right now we're using the Gemini API through our LiteLLM server (we can't expose the API key on the client side). We're also working on a smaller, fine-tuned model, which will become the default soon. It should be much faster and more precise at navigation tasks.
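The reason for routing through a server is the pattern sketched below: the browser client never holds the provider key; a server-side proxy injects it before forwarding the request upstream. The endpoint and environment-variable name here are illustrative, not BrowserOS's actual setup.

```python
# Sketch of the server-side key-injection pattern described above.
# The provider key lives only in the server's environment, so it can
# never be extracted from the client. Env var name is hypothetical.
import os


def make_upstream_headers(client_headers):
    """Build headers for the upstream model-API call.

    Copies through harmless client metadata but never trusts any
    client-supplied auth; the real key is injected server-side.
    """
    headers = {"Content-Type": "application/json"}
    # Pass through a request id for tracing, if the client sent one.
    if "X-Request-Id" in client_headers:
        headers["X-Request-Id"] = client_headers["X-Request-Id"]
    # Inject the server-held key (illustrative env var name).
    headers["Authorization"] = f"Bearer {os.environ.get('GEMINI_API_KEY', '')}"
    return headers
```

Whatever `Authorization` header the client sends is simply dropped, so a malicious or curious client can neither read nor override the server's key.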