Comment by maxsieg
1 year ago
Hey, cofounder at Mutable.ai here.
I want to encourage you all to ask the chat some tough questions. You can ask very complex and general questions. Some examples:

- Ask ollama (https://wiki.mutable.ai/ollama/ollama) how to add a new model
- Ask langchain (https://wiki.mutable.ai/langchain-ai/langchain) "How can I build a simple personal assistant using this repo?"
- Ask flash attention (https://wiki.mutable.ai/Dao-AILab/flash-attention) "What are the main benefits of using this code?"
It is also useful for search. For example, if you ask langchain "Where is the code that connects to vector databases?", it will surface all the relevant information.
Very curious to hear what you ask (and whether you find the response helpful)!
I’ve been building LLM-driven systems for customers for quite some time. We got tired of hallucinations from vector-based and hybrid RAGs last year and eventually arrived at an approach similar to yours.
It is even called Knowledge Mapping [1]. It works really well, and customers can understand it.
Probably the only difference from your approach is that we use different architectural patterns to map domain knowledge into bits of knowledge for the LLM to reason over (Router, Knowledge Base, Search Scope, Workflows, Assistant, etc.).
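To give a rough idea of what I mean, here is a minimal Python sketch, not our actual code: the component names (Router, Knowledge Base, Search Scope, Assistant) are the ones mentioned above, while the class shapes, method names, and the keyword-matching router are purely illustrative assumptions.

    # Hypothetical sketch of the knowledge-mapping pattern described above.
    # Component names come from the comment; everything else is illustrative.
    from dataclasses import dataclass, field


    @dataclass
    class KnowledgeBase:
        """Curated bits of domain knowledge, keyed by topic."""
        entries: dict[str, str] = field(default_factory=dict)

        def lookup(self, topics: list[str]) -> list[str]:
            return [self.entries[t] for t in topics if t in self.entries]


    @dataclass
    class SearchScope:
        """Restricts which topics a given question may draw on."""
        allowed_topics: set[str]

        def filter(self, topics: list[str]) -> list[str]:
            return [t for t in topics if t in self.allowed_topics]


    class Router:
        """Maps a user question to candidate topics (keyword matching here,
        standing in for whatever classification a real system would use)."""
        def __init__(self, topic_keywords: dict[str, list[str]]):
            self.topic_keywords = topic_keywords

        def route(self, question: str) -> list[str]:
            q = question.lower()
            return [topic for topic, words in self.topic_keywords.items()
                    if any(w in q for w in words)]


    class Assistant:
        """Assembles the mapped knowledge into a prompt for the LLM."""
        def __init__(self, router: Router, kb: KnowledgeBase, scope: SearchScope):
            self.router, self.kb, self.scope = router, kb, scope

        def build_prompt(self, question: str) -> str:
            topics = self.scope.filter(self.router.route(question))
            context = "\n".join(self.kb.lookup(topics))
            return f"Context:\n{context}\n\nQuestion: {question}"


    # Example wiring (all data hypothetical):
    kb = KnowledgeBase({"billing": "Invoices are issued monthly on the 1st."})
    router = Router({"billing": ["invoice", "payment"]})
    assistant = Assistant(router, kb, SearchScope({"billing"}))
    print(assistant.build_prompt("When is my next invoice due?"))

The point of structuring it this way is that each mapped bit of knowledge stays small, explicit, and inspectable, which is what makes the behaviour explainable to customers.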
My contact details are in my profile if you want to bounce ideas!
[1] English article: https://www.trustbit.tech/en/wie-wir-mit-knowledge-maps-bess...