Comment by oceanplexian

5 days ago

The Anthropic API was already supported by llama.cpp (the project Ollama ripped off, and which Ollama typically lags behind in features by 3-6 months), and it already works fine with Claude Code by setting a single environment variable.
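A minimal sketch of the setup being described, assuming a llama.cpp build whose `llama-server` exposes the Anthropic-compatible endpoint; the model path and port below are placeholders, and `ANTHROPIC_BASE_URL` is the variable Claude Code reads to redirect API traffic:

```shell
# Serve a local GGUF model with llama.cpp's HTTP server
# (model path and port are placeholders -- adjust to your setup)
llama-server -m ./models/your-model.gguf --port 8080

# In another shell: point Claude Code at the local server
# instead of Anthropic's hosted API, then launch it
export ANTHROPIC_BASE_URL="http://127.0.0.1:8080"
claude
```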

Point of clarification: llama.cpp is MIT-licensed. Using it downstream (commercially or otherwise) is exactly what that license allows, so calling it a rip-off is misleading.

And they reference that announcement and the related information in the second line.

  • Which announcement are you looking at? I see no reference to llama.cpp in either Ollama's blog post or this project's GitHub page.