Bots now browse the web like humans do. We're proposing that bots use cryptographic signatures so that website owners can verify their identity. Explanations and demonstration code are in the post.
For those building bots, we propose signing the authority of the target URI (e.g. www.example.com) and, when present, including a signature-agent value that tells the site where to retrieve the bot's public key (e.g. crawler.search.google.com for Google Search, operator.openai.com for OpenAI Operator, workers.dev for Cloudflare Workers).
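This resembles HTTP Message Signatures (RFC 9421), where the signer builds a "signature base" over selected components and signs it. A minimal sketch of covering the `@authority` component and a `signature-agent` header is below; the exact component names, parameters, and `build_signature_base` helper are illustrative assumptions, and a real bot would then sign the base with its Ed25519 private key:

```python
import time

def build_signature_base(authority: str, signature_agent: str,
                         keyid: str = "example-key") -> str:
    """Construct an RFC 9421-style signature base covering the request
    authority and the signature-agent value (format is illustrative)."""
    created = int(time.time())
    params = ('("@authority" "signature-agent")'
              f';created={created};keyid="{keyid}";alg="ed25519"')
    lines = [
        f'"@authority": {authority}',
        f'"signature-agent": {signature_agent}',
        f'"@signature-params": {params}',
    ]
    return "\n".join(lines)

base = build_signature_base("www.example.com", "crawler.search.google.com")
# The bot signs `base` with its private key and sends the result in a
# Signature header alongside the Signature-Agent header.
```

Binding the signature to the authority means a captured signature can't be replayed against a different site.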
They’re proposing that the request include the request target plus information about where the bot's public key is hosted. From that key source, you can verify the origin via its domain name.
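On the verifying side, the key-source domain maps to a URL the site fetches public keys from. A sketch of that mapping is below; the well-known path follows my reading of the related draft and should be treated as an assumption, as should the domain-validation regex:

```python
import re

# Assumed well-known path for the bot's public key directory.
KEY_DIRECTORY_PATH = "/.well-known/http-message-signatures-directory"

# Rough hostname check: dot-separated labels, letters/digits/hyphens.
_DOMAIN_RE = re.compile(
    r"^(?=.{1,253}$)([a-z0-9]([a-z0-9-]{0,61}[a-z0-9])?\.)+[a-z]{2,}$")

def key_directory_url(signature_agent: str) -> str:
    """Map a signature-agent value to the URL its public keys would be
    fetched from, rejecting malformed domains."""
    agent = signature_agent.strip('"').lower()
    if not _DOMAIN_RE.match(agent):
        raise ValueError(f"invalid signature-agent domain: {signature_agent}")
    return f"https://{agent}{KEY_DIRECTORY_PATH}"
```

Because the keys are served over HTTPS from the stated domain, trust in the signature reduces to trust in that domain, which is what makes domain-based blocking meaningful.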
So when that gets blocked, they can just generate a new key. I don’t see how this really stops anyone who wants to keep going.
The point is that it makes them identifiable. If you block anything that can't authenticate, plus everything that authenticates via *.google.com, you are effectively blocking everything from Google.
If you fear they will switch to other domains to evade blocking, you’ll have to use an allow-list instead.