Slacker News

Comment by dangoodmanUT

8 days ago

Yes everyone should just write cpp to call local LLMs obviously


otabdeveloper4  7 days ago

Yes, but llama.cpp already comes with a ready-made OpenAI-compatible inference server.

  • reverius42  7 days ago

    I think people are getting hung up on the "llama.cpp" name and thinking they need to write C++ code to use it.

    llama.cpp isn't (just) a C++ library/codebase -- it's a CLI application, server application (llama-server), etc.
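A minimal sketch of what that looks like in practice: llama.cpp's bundled `llama-server` binary exposes an OpenAI-compatible HTTP API, so no C++ is needed. The model path and port below are placeholders.

```shell
# Start the server bundled with llama.cpp (model path is a placeholder).
llama-server -m ./models/model.gguf --port 8080

# Query the OpenAI-compatible chat completions endpoint.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```

Any OpenAI client library can be pointed at the same endpoint by overriding its base URL.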
