Comment by bobbiechen

7 months ago

I do think the answer is two-pronged: roll out the red carpet for "good bots", add friction for "bad bots".

I work for Stytch and for us, that looks like:

1) make it easy to provide Connected Apps experiences, like OAuth-style consent screens: "Do you want to grant MyAgent access to your Google Drive files?"

2) make it easy to detect bots and shift them toward the happy path. For example: "Looks like you're scraping my website for AI training. If you want the content easily, just grab it all at /llms.txt instead."

As other comments mention, bot traffic is overwhelmingly malicious. Being able to cheaply distinguish bots from humans and add friction makes life much easier for a defending team.

IMO, if it looks like a bot and doesn't follow robots.txt, you should just start feeding it noise. Ignoring robots.txt makes you a bad netizen.