Comment by antves
6 days ago
Yes, it works out-of-the-box with any OpenAI-compatible API, including Ollama.
You can check out our example at https://github.com/circlemind-ai/fast-graphrag/blob/main/exa...
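For context, here's roughly what "any OpenAI-compatible API" means for Ollama: you point a standard OpenAI client at Ollama's /v1 endpoint. Untested sketch, the model names are just examples, and whether /v1/embeddings is available depends on your Ollama version:

    # Rough sketch: Ollama exposes an OpenAI-compatible API under /v1,
    # so a standard OpenAI client can talk to it directly.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",                      # required by the client, ignored by Ollama
    )

    # chat completion against a locally pulled model (example model name)
    resp = client.chat.completions.create(
        model="llama3",
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(resp.choices[0].message.content)

    # embeddings through the same OpenAI-compatible surface (example model name)
    emb = client.embeddings.create(model="nomic-embed-text", input=["hello world"])
    print(len(emb.data[0].embedding))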
It would be nice to see an example that uses Ollama - given that Ollama's embeddings endpoint is a bit... different, I can't quite figure this out.
Hey! Our todo list is a bit swamped right now, but we'll try to have a look at that as soon as possible. On the Ollama GitHub I found conflicting information: https://github.com/ollama/ollama/issues/2416 and https://github.com/ollama/ollama/pull/2925. They also suggest looking at this: https://github.com/severian42/GraphRAG-Local-UI/blob/main/em...
Hope this helps!
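If you want to experiment in the meantime, something along these lines might work, assuming your Ollama version serves OpenAI-style /v1 chat and embeddings. It's an untested sketch: the service classes and parameters follow the custom-model pattern in our repo, so double-check the exact names against the example linked above and swap in whatever models and embedding dimension you actually use.

    # Untested sketch: point fast-graphrag's OpenAI-style services at a local
    # Ollama server. Class and parameter names follow the repo's custom-model
    # example and should be double-checked; model names are placeholders.
    from fast_graphrag import GraphRAG
    from fast_graphrag._llm import OpenAIEmbeddingService, OpenAILLMService

    grag = GraphRAG(
        working_dir="./my_graph",
        domain="General knowledge about the ingested documents.",
        example_queries="What is this text about?",
        entity_types=["Person", "Place", "Event"],
        config=GraphRAG.Config(
            llm_service=OpenAILLMService(
                model="llama3",                        # any chat model pulled in Ollama
                base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
                api_key="ollama",                      # ignored by Ollama, must be non-empty
            ),
            embedding_service=OpenAIEmbeddingService(
                model="nomic-embed-text",
                base_url="http://localhost:11434/v1",
                api_key="ollama",
                embedding_dim=768,                     # must match the embedding model
            ),
        ),
    )

    grag.insert("Some text to index.")
    print(grag.query("What is this text about?").response)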
LiteLLM has Ollama support; you could go through that.
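Rough idea of what that looks like on the Python side (untested; model names are just examples):

    # Untested sketch: LiteLLM translates OpenAI-style calls into Ollama's
    # native API, including embeddings. Model names are placeholders.
    import litellm

    # chat completion routed to a local Ollama server
    resp = litellm.completion(
        model="ollama/llama3",
        messages=[{"role": "user", "content": "Say hello"}],
        api_base="http://localhost:11434",
    )
    print(resp.choices[0].message.content)

    # embeddings routed the same way
    emb = litellm.embedding(
        model="ollama/nomic-embed-text",
        input=["hello world"],
        api_base="http://localhost:11434",
    )
    vector = emb.data[0]["embedding"]  # OpenAI-style response shape
    print(len(vector))

The LiteLLM proxy can also expose an OpenAI-compatible endpoint, which you could then point fast-graphrag at like any other OpenAI-style server.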