Comment by byandrev
15 days ago
Unlike general-purpose models such as Gemini or ChatGPT, which answer from knowledge absorbed from countless web sources and can "hallucinate," NotebookLM relies 100% on the sources you provide: PDFs, audio files, YouTube videos, Google Docs, or even articles. Because it works exclusively with your sources, the room for hallucination is much smaller.
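For anyone curious what "relies 100% on your sources" looks like in practice, here's a minimal sketch of the source-grounded prompting pattern such tools use. This is illustrative only, not NotebookLM's actual implementation; the function name and prompt wording are my own invention.

```python
def build_grounded_prompt(question: str, sources: list[str]) -> str:
    """Assemble a prompt that restricts the model to the supplied sources.

    Hypothetical example: the real product likely adds retrieval, chunking,
    and citation logic on top of this basic idea.
    """
    context = "\n\n".join(
        f"[Source {i + 1}]\n{text}" for i, text in enumerate(sources)
    )
    return (
        "Answer the question using ONLY the sources below. "
        "If the sources do not contain the answer, say you don't know.\n\n"
        f"{context}\n\nQuestion: {question}\nAnswer:"
    )


if __name__ == "__main__":
    docs = ["NotebookLM grounds its answers in user-provided documents."]
    print(build_grounded_prompt("What does NotebookLM ground answers in?", docs))
```

The key design point is that the model is instructed to refuse when the sources don't cover the question, instead of falling back on its internal knowledge.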
Huh, don't most hallucinations come from the model's internal knowledge rather than from the RAG step?