Comment by dartos
3 days ago
Please tell me I’m not the only one that sees the irony in AI relying on classic search.
Obviously LLMs are good at some semantic understanding of the prompt context and are useful, but the irony is hilarious
I think the main bit here is that the knowledge graph is entirely constructed by LLMs. It's not just using a pre-existing knowledge graph. It's creating a knowledge graph on the fly based on your data.
Navigating the graph, on the other hand, is the perfect task for PageRank.
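If you want to picture what "constructed on the fly" means, here's a rough sketch of the idea (my own toy version, not the post's actual pipeline): ask the model for (subject, relation, object) triples per chunk and load them into a graph. I'm assuming networkx, and extract_triples is a hypothetical stand-in for the LLM call.

    import networkx as nx

    def extract_triples(chunk):
        # Hypothetical: a real system would prompt an LLM to return
        # (subject, relation, object) triples found in this chunk.
        # Hard-coded here so the sketch runs without an API call.
        return [("LLM", "builds", "knowledge graph"),
                ("PageRank", "navigates", "knowledge graph")]

    def build_graph(chunks):
        g = nx.DiGraph()
        for chunk in chunks:
            for subj, rel, obj in extract_triples(chunk):
                # Keep the relation and source chunk as edge attributes
                # so retrieved paths can be traced back to the documents.
                g.add_edge(subj, obj, relation=rel, source=chunk)
        return g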
Exactly! PageRank is also used to navigate the graph and find "missing links" between the concepts that semantic search (via LLMs) selects from the query, so the system can answer questions that require multi-hop or complex reasoning in one go.
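One common way to implement the "missing links" part is personalized PageRank: put the restart mass on the concepts matched from the query, and the highest-scoring other nodes are the ones that bridge them. No idea whether that's exactly what the post does, but a minimal sketch with networkx (function and parameter names are mine) looks like:

    import networkx as nx

    def bridge_concepts(graph, query_concepts, top_k=5):
        # Concentrate the restart probability on the query's concepts.
        seeds = {c: 1.0 for c in query_concepts if c in graph}
        if not seeds:
            return []
        scores = nx.pagerank(graph.to_undirected(), alpha=0.85,
                             personalization=seeds)
        # The seeds themselves are already known; the interesting nodes
        # are the connectors that sit between them.
        ranked = sorted((n for n in scores if n not in seeds),
                        key=scores.get, reverse=True)
        return ranked[:top_k]

Nodes surfaced this way are the intermediate hops that a plain vector lookup on the query would miss.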
Makes perfect sense.
The semantic understanding capabilities fit well for creating knowledge graphs.
What you should note as quaint is probably more the integration of "symbolic" strategies with neural networks in AI.
Past the initial sensation, it is quite natural that "something good at language" (an interface) be integrated with "something good at information retrieval" (the data). (What comes next is still being sought: "something that gives reliability to processing".)
It’s not AI, it’s a collection of technologies and practices within the domain of AI, symbolic and sub-symbolic. Arguably, classic search is another technology/approach/algorithm within the domain of AI.
I don't get what the irony is here.
Not who you're replying to, but from my vantage point, marketing folks seem to be pushing LLM products as replacements for traditional search products. I think what the post is proposing makes perfect sense from a technical perspective, though. The utility of LLMs will come down to good old-fashioned product design, leveraging existing concepts, and novel technical innovation rather than just dumping quintillions of dollars into increasingly large models and hardware until it does everything for us.
Exactly this.
I work in the LLM-augmented search space, so I might be a little too tuned in on this subject.