
Comment by fouronnes3

7 days ago

Great example of the power of vibe coding. The first item is literally "Kiro: A new agentic IDE".

There is literally an input box to put terms you want to exclude...

The prompt asks for "filters out specific search terms", not "intelligently filter out any AI-related keywords." So yes, a good example of the power of vibe coding: the LLM built a tool according to the prompt.

  • So I have to stay up to date on AI stories just to know what buzzwords I should filter so I don't see AI stories?

    • Add the buzzword when you see a story you don't like. Or settle for it filtering 90% of the AI content and just don't click on whatever remains; I doubt you expect the top story to be interesting to you 100% of the time.

    • Our brains decode info based on context and extrapolation.

      This submission we're commenting on could be about filtering out any content, not just AI stuff: politics, crypto, AI, etc. Or something more granular, like "Trump", "fracking", "bitcoin", etc.

      In any of these scenarios, with a tool designed to filter out content based on limited context, when would you ever be perfectly satisfied?

      Would you like AI to help you build the perfect context-filter model?


    • Isn't it enough to bury yourself under the rock? Do you want the fact of your having done so concealed from you as well? But then what about the fact of wanting that?

    • ...Yes? This is how this tool is coded. Machines do what one codes them to do, not what one wants them to do. If you're interested in making a more intelligent tool you can do it. This tool does exactly what @simonw says it does.

    • How about a version with LLM integration that detects "AI" related stories in a more clever way? /s

    • A tool was offered that can accomplish what you want, with a very small amount of added effort on your part.

      No, you do not have to "stay up to date on AI stories": if you see one, add the keyword to the list and move on. There are not as many buzzwords as you seem to be implying, anyway.

      If you are dissatisfied, you are welcome to build your own intelligent version (but I am not sure this will be straightforward without the use of AI).

I like this because things can stay permanently filtered. Just not across devices. But that wasn't one of the original requirements.

Also a great example of how software can be perfectly to spec and also completely broken.

llm, ai, cuda, agent, gpt.

Wish it returned more unfiltered items, though.

  • Isn’t knocking out CUDA going to take out a significant chunk of GPGPU stuff with it? I can see wanting to avoid AI stuff, for sure, but I can’t imagine not wanting to hear anything about the high-bandwidth half of your computer…

    • Sure, if that suits you, it makes sense. It just happened to take out one piece I felt was AI-related at the time. I suppose mlx would have worked better.
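
The keyword filtering the thread describes can be sketched in a few lines. This is a minimal illustration, not the actual tool's code: the blocked set mirrors the keyword list quoted above, but the matching logic and sample titles are assumptions. It also shows the brittleness discussed at the top: whole-word matching lets "agentic" slip past an "agent" filter.

```python
import re

# Illustrative keyword list, taken from the comment above.
BLOCKED = {"llm", "ai", "cuda", "agent", "gpt"}

def is_filtered(title: str, blocked: set[str] = BLOCKED) -> bool:
    """True if any blocked keyword appears as a whole word in the title."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return any(w in blocked for w in words)

# Hypothetical story titles for demonstration.
stories = [
    "Kiro: A new agentic IDE",
    "Show HN: Filter Hacker News by keyword",
    "GPT-5 rumors",
]
visible = [t for t in stories if not is_filtered(t)]
```

Here "GPT-5 rumors" is filtered, but "Kiro: A new agentic IDE" survives because "agentic" is not literally "agent", which is exactly why such stories still slip through a plain keyword filter.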