Comment by omneity
13 days ago
This is great, congrats for launching!
A couple of ideas: I would like to benchmark a remote headless server, as well as different methods of running the LLM (vLLM vs. TGI vs. llama.cpp ...) on my local machine, and in this case llamafile is quite limiting. Connecting over an OpenAI-compatible API instead would be great!
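For concreteness, here is a rough sketch of what benchmarking over an OpenAI-compatible completions endpoint could look like. This is not LocalScore's actual implementation; the URL, model name, and local server are assumptions (vLLM and llama.cpp's `llama-server` both expose this style of API):

```python
import json
import time
import urllib.request

def tokens_per_second(n_tokens, elapsed_s):
    """Throughput metric, as generated tokens divided by wall-clock time."""
    return n_tokens / elapsed_s if elapsed_s > 0 else 0.0

def benchmark_openai_endpoint(base_url, model, prompt, max_tokens=128):
    """Time a single non-streaming completion against any server that
    exposes an OpenAI-compatible /v1/completions endpoint."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/v1/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    elapsed = time.monotonic() - start
    # Most OpenAI-compatible servers report token counts in "usage".
    n_tokens = result["usage"]["completion_tokens"]
    return tokens_per_second(n_tokens, elapsed)

if __name__ == "__main__":
    # Hypothetical local server; adjust URL and model to your own setup.
    tps = benchmark_openai_endpoint("http://localhost:8000", "my-model", "Hello")
    print(f"{tps:.1f} tok/s")
```

Because the endpoint shape is the same regardless of backend, the same loop could compare vLLM, TGI, or llama.cpp on local hardware, or point at a remote headless box.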
LocalScore dev here
Thank you! I think this is quite possible! If you don't mind starting a discussion on this, I would love to think aloud there:
https://github.com/cjpais/LocalScore/discussions