We have paused all crawling as of Feb 6th, 2025 until we implement robots.txt support. Stats will not update during this period.
It is not possible to reliably detect bots. Attempting to do so will invariably lead to false positives, denying access to your content for the people who are usually the most at-risk & marginalized folks.
Just implement a cache and forget about it. If read-only content is causing you too much load, you’re doing something terribly wrong.
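For illustration only, here is a minimal sketch of the kind of cache meant here: a short-lived in-memory cache in front of an expensive page render. The names `render_page`, `get_page`, and the 5-minute TTL are hypothetical; a real deployment would more likely put a CDN or a reverse-proxy cache (Varnish, nginx) in front of the app instead.

```python
import time

# Minimal in-memory TTL cache sketch for read-only pages (assumption:
# every request for the same path can safely see a copy up to 5 minutes old).
_CACHE: dict[str, tuple[float, str]] = {}
TTL_SECONDS = 300

def render_page(path: str) -> str:
    # Hypothetical stand-in for the expensive work a real site does
    # per request (database queries, templating, etc.).
    return f"<html><body>content for {path}</body></html>"

def get_page(path: str) -> str:
    now = time.monotonic()
    cached = _CACHE.get(path)
    if cached and now - cached[0] < TTL_SECONDS:
        return cached[1]  # cache hit: no rendering cost, bots or not
    body = render_page(path)
    _CACHE[path] = (now, body)
    return body

if __name__ == "__main__":
    print(get_page("/stats"))  # first request renders and fills the cache
    print(get_page("/stats"))  # second request is served from the cache
```

The point of the sketch is that once responses are cached, it no longer matters much whether a given request came from a human or a crawler, so there is no need to guess.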