Comment by input_sh

3 months ago

The same company that slopifies news stories in their previous big "feature"? The irony.

I think you're referencing https://kite.kagi.com/

In my view, asking an AI to do something for me (summarize the news) is different from being served something that someone else generated with AI. By using Kite, an AI tool for summarizing news, the user is doing exactly that: asking the service to summarize the news.

(I'm a Kagi customer but I don't use Kite.)

  • I'm just realizing that while I understand (and think it's obvious) that this tool uses AI to summarize the news, they don't actually mention it anywhere on the page. Unless I'm missing it? I think they used to, but maybe I'm misremembering.

    They do mention "Summaries may contain errors. Please verify important information." on the loading screen but I don't think that's good enough.

  • https://news.kagi.com/world/latest

    Where's the part where you ask them to do this? Is this not something they do automatically? Are they not contributing to the slop by republishing slopified versions of articles without as much as an acknowledgement of the journalists whose stories they've decided to slopify?

    If they were big enough to matter they would 100% get sued over this (and rightfully so).

    • > Where's the part where you ask them to do this? Is this not something they do automatically?

      It's a tool. Summarizing the news using AI is the only thing that tool does. Using a tool that does one thing is the same as asking the tool to do that thing.

      > Are they not contributing to the slop by republishing slopified versions of articles without as much as an acknowledgement of the journalists whose stories they've decided to slopify?

      They provide attribution to the sources. They're listed under the headline "Sources" right below the short summary/intro.

Been using Kagi for two years now. Their consistent approach to AI is to offer it only when explicitly requested. With that in mind, this isn't surprising.

Not all "AI"-generated content can be categorized as "slop". "Slop" has a specific meaning, usually associated with spam and low-effort content. What Kagi News is doing is summarizing news articles from different sources, and applying a custom structure and format. It is a branded product supported by a reputable company, not a low-effort spam site.

I'm a firm skeptic of the current hype around this technology, but I think it is foolish to think that it doesn't have good applications. Summarizing text content is one such use case, and IME the chances for the LLM to produce wrong content or hallucinate are very small. I've used Kagi News a number of times over the past few months, and I haven't spotted any content issues, aside from the tone and structure not quite matching my personal preferences.

Kagi is one of the few companies that is pragmatic about the positive and negative aspects of "AI", and this new feature is well aligned with their vision. It is unfair to criticize them for this specifically.

  • > "Slop" has a specific meaning, usually associated with spam and low-effort content.

    Slop means different things to different people. And in my view, anything that isn't human-reviewed is low effort.