From your own wiki link:

> robots.txt is the filename used for implementing the Robots Exclusion Protocol, a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the website they are allowed to visit.
How is fedidb not an “other web robot”?
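For reference, here's a minimal sketch of how a crawler that honors the Robots Exclusion Protocol checks robots.txt before fetching, using Python's standard-library `urllib.robotparser`. The `FediDB` user-agent token, the directives, and the example URL are assumptions for illustration, not FediDB's actual behavior or any real site's policy:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt; the "FediDB" token and these directives
# are assumptions for the example, not a real site's policy.
ROBOTS_TXT = """\
User-agent: FediDB
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A compliant robot checks permission before each fetch:
print(parser.can_fetch("FediDB", "https://example.social/nodeinfo/2.0"))    # False: disallowed
print(parser.can_fetch("OtherBot", "https://example.social/nodeinfo/2.0"))  # True: allowed
```

Note that when no rule covers a path (or robots.txt is absent entirely), the fetch is treated as allowed by default; that opt-out default is exactly the implied-consent model at issue below.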
Okay,
So why reinvent a standard when one already exists that serves functionally the same purpose, albeit one of implied consent?
Edit: my problem isn’t robots.txt. It’s implied consent.
If you are ever thinking, "I wonder if I should ask," the answer is always yes. It doesn't matter what the situation is. If you are not 1000% sure you have consent, you don't have it. That's just my ethics.
If you want to propose a new standard, go nuts. But implied consent is not it.