Comment by ivape

2 days ago

What is the default BrowserOS model? Is it local, and if so, what inferencing server are you using?

The default model is Gemini.

You can bring your own API keys and change the default to any model you like.

Or better yet, run a model locally using Ollama and use that!
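
For anyone curious what that looks like in practice, here is a minimal sketch of talking to a locally running Ollama server through its OpenAI-compatible endpoint. It assumes `ollama serve` is running on the default port and that a model (the name "llama3" below is just an example) has already been pulled; BrowserOS's own settings UI may wire this up differently.

```python
# Minimal sketch: call a local Ollama server via its OpenAI-compatible API.
# Assumes `ollama serve` is running and `ollama pull llama3` has been done.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # placeholder; Ollama ignores the key
)

response = client.chat.completions.create(
    model="llama3",  # example model name
    messages=[{"role": "user", "content": "Summarize this page in one sentence."}],
)
print(response.choices[0].message.content)
```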

  • The default is a remote Gemini call?

    • Yes, right now we're using the Gemini API through our LiteLLM server (we can't expose the API key on the client side). A rough sketch of that pattern is below the thread.

      We are working on a smaller, fine-tuned model too, which will be the default soon! It should be much faster and more precise at navigation tasks.
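
As a rough sketch of the key-isolation pattern described above: the client only knows the proxy URL and a proxy-issued token, while the real Gemini key stays on the LiteLLM server. The URL, token, and model alias below are hypothetical placeholders, not BrowserOS's actual endpoints.

```python
# Minimal sketch: client talks to an OpenAI-compatible LiteLLM proxy.
# The Gemini API key lives only on the proxy server, never on the client.
from openai import OpenAI

client = OpenAI(
    base_url="https://litellm.example.com/v1",  # hypothetical proxy URL
    api_key="proxy-issued-token",               # not the Gemini key
)

response = client.chat.completions.create(
    model="gemini-flash",  # example alias routed to Gemini on the proxy
    messages=[{"role": "user", "content": "Click the 'Sign in' button."}],
)
print(response.choices[0].message.content)
```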