We have paused all crawling as of Feb 6th, 2025 until we implement robots.txt support. Stats will not update during this period.
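For context, "robots.txt support" typically means fetching a site's robots.txt and checking each URL against its rules before crawling it. A minimal sketch using Python's standard urllib.robotparser; the user-agent string and URLs below are hypothetical placeholders, not the site's actual crawler identity:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical crawler identity; a real crawler would use its own user-agent string.
USER_AGENT = "ExampleStatsBot"

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()  # fetch and parse the site's robots.txt

url = "https://example.com/some/page"
if robots.can_fetch(USER_AGENT, url):
    print(f"Allowed to crawl {url}")
else:
    print(f"robots.txt disallows {USER_AGENT} from crawling {url}")
```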
Okay, so why reinvent a standard, replacing one that serves functionally the same purpose with one based on implied consent?
Edit: my problem isn’t robots.txt. It’s implied consent.
If you are ever thinking, "I wonder if I should ask," the answer is always yes. Doesn't matter the situation. If you are not 1000% sure you have consent, you don't. That's just my ethics.
If you want to propose a new standard, go nuts. But implied consent is not it.