Comment by kaveh_h

7 days ago

Have you tried providing multiple pages at a time to the model? It might transcribe better since it has a bigger context to work with.
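A minimal sketch of the batching idea: group consecutive pages so each request carries several pages of surrounding context instead of one. The `transcribe_batch` call is hypothetical; substitute your provider's multimodal API.

```python
def batch_pages(pages, batch_size=4):
    """Group pages into consecutive batches of up to batch_size."""
    return [pages[i:i + batch_size] for i in range(0, len(pages), batch_size)]

def transcribe_batch(batch):
    # Hypothetical: send all pages in one multimodal request so the
    # model sees neighboring pages while transcribing each one.
    raise NotImplementedError("replace with your provider's API call")

pages = [f"page_{n}.png" for n in range(1, 11)]  # 10 scanned pages
batches = batch_pages(pages, batch_size=4)
# 3 batches: pages 1-4, 5-8, 9-10
```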

Gemini 3's long-context performance is not as good as Gemini 2.5's.

  • I'm 100% sure that all providers are playing with quantization, KV cache, and other model parameters to keep up with demand. One of the biggest advantages of running a local model is that you get predictable behavior.