Comment by tempfile

6 days ago

> It proposes a new standard for AI-web interaction that moves beyond fragile screen scraping and DOM manipulation towards a robust, secure, and efficient machine-readable layer for the internet.

This is nothing like robots.txt, it is much more like a sitemap. In fact, this design goal is almost word for word the point of the semantic web in general. You may find that there are existing working groups for similar resource description frameworks. Given how poor adoption of semantic tagging has been, I somewhat doubt sites start doing it just for LLMs.

Incidentally, I thought the whole point of an AI agent was that it could read and understand things by itself. I welcome any improvement in the semantic content of the web, but isn't scraping kind of the point?

You make good points. Aura is more like a sitemap for actions than robots.txt. I wanted Aura to be much simpler than the Semantic Web, using only plain JSON and HTTP; I think the Semantic Web was too complex for most people to adopt.

So why is now a good time for this idea? I think it's because of AI agents. Today, big companies spend a lot of money on web scraping, and their tools break all the time. That gives them a real incentive to want a better way.

As for your last question: can't AI just read the page? Yes, it can, but it's slow and it breaks easily. It's the difference between using a clean API and guessing where to click on a page. Aura is just a way for a website to offer that clean, API-like path. It's faster for everyone and doesn't break when something small like a button's color changes. Thanks for the feedback.
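To make "a sitemap for actions in plain JSON" concrete, here is a minimal, purely illustrative sketch. Nothing below is taken from the actual Aura spec; the field names, endpoint paths, and parameters are all assumptions invented for this example.

```json
{
  "aura_version": "0.1",
  "site": "https://example.com",
  "actions": [
    {
      "name": "search_products",
      "description": "Search the product catalog",
      "method": "GET",
      "endpoint": "/api/search",
      "parameters": {
        "q": { "type": "string", "required": true }
      }
    },
    {
      "name": "add_to_cart",
      "description": "Add a product to the shopping cart",
      "method": "POST",
      "endpoint": "/api/cart",
      "parameters": {
        "product_id": { "type": "string", "required": true },
        "quantity": { "type": "integer", "required": false }
      }
    }
  ]
}
```

The idea is that an agent could fetch a manifest like this from a known URL and call the listed endpoints directly, instead of rendering the page and guessing at the DOM.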