ivape, 2 days ago:
What is the default BrowserOS model? Is it local, and if so, what inferencing server are you using?

felarof, 2 days ago:
The default model is Gemini. You can bring your own API keys and change the default to any model you like. Or, better, run a model locally using Ollama and use that!

ivape, 2 days ago:
The default is a remote Gemini call?

felarof, 2 days ago:
Yes, right now we're using the Gemini API through our LiteLLM server (we can't expose the API key on the client side). We're also working on a smaller, fine-tuned model, which will be the default soon! It should be much faster and more precise at navigation tasks.
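A minimal sketch of the local-Ollama route mentioned above, assuming an Ollama server is running on its default port (11434) and a model such as `llama3` has already been pulled; the endpoint and payload shape follow Ollama's documented `/api/generate` API, but the model name here is just an illustrative choice:

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: stock install, no custom port)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3"):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response instead of a token stream
    }
    return json.dumps(payload).encode("utf-8")

def generate(prompt, model="llama3"):
    """Send the prompt to the local Ollama server and return its response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the server runs on localhost, no API key ever leaves the machine — which is exactly the client-side exposure problem the hosted LiteLLM proxy works around.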