We have paused all crawling as of Feb 6th, 2025, until we implement robots.txt support. Stats will not update during this period.

  • Semi-Hemi-Lemmygod@lemmy.world · 24 hours ago

    Robots.txt is a lot like email in that it was built for a far simpler time.

    It would be better if the server could detect bots and send them down a rabbit hole rather than trusting randos to abide by the rules.
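    A minimal sketch of that rabbit-hole idea (not FediDB's code): a tiny Python server that drip-feeds an endless maze of links to anything it suspects is a bot. The User-Agent substring check, port, and marker list are all hypothetical stand-ins for real detection.

    ```python
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BOT_MARKERS = ("bot", "crawler", "spider")  # hypothetical heuristic

    class TarpitHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            ua = self.headers.get("User-Agent", "").lower()
            if any(marker in ua for marker in BOT_MARKERS):
                # Suspected bot: stream an infinite, slow page of links.
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.end_headers()
                n = 0
                try:
                    while True:
                        self.wfile.write(f'<a href="/maze/{n}">page {n}</a>\n'.encode())
                        self.wfile.flush()
                        n += 1
                        time.sleep(5)  # waste the crawler's time, not your CPU
                except (BrokenPipeError, ConnectionResetError):
                    pass  # the crawler gave up
            else:
                # Anything else gets the normal page.
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(b"Hello, human.\n")

    if __name__ == "__main__":
        HTTPServer(("", 8080), TarpitHandler).serve_forever()
    ```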

    • jagged_circle@feddit.nl · edited 53 minutes ago

      It is not possible to reliably detect bots. Attempting to do so will invariably produce false positives, denying access to your content for what is usually the most at-risk and marginalized folks.

      Just implement a cache and forget about it. If read-only content is causing you too much load, you’re doing something terribly wrong.
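      A minimal read-through cache sketch along those lines; the names (render_stats_page, CACHE_TTL) are hypothetical. The point is that read-only pages get rendered at most once per TTL, so crawler traffic mostly hits memory instead of the database.

      ```python
      import time

      CACHE_TTL = 300  # seconds; tune to how stale the stats may be
      _cache: dict[str, tuple[float, str]] = {}

      def render_stats_page(instance: str) -> str:
          # stand-in for an expensive database query + template render
          return f"<html>stats for {instance}</html>"

      def get_stats_page(instance: str) -> str:
          now = time.monotonic()
          hit = _cache.get(instance)
          if hit is not None and now - hit[0] < CACHE_TTL:
              return hit[1]  # cache hit: no database work at all
          page = render_stats_page(instance)
          _cache[instance] = (now, page)
          return page
      ```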

    • poVoq@slrpnk.net · 24 hours ago

      Because AI bots ignore robots.txt (especially when you don’t explicitly list their user agent and instead use a * wildcard), more and more people are implementing exactly that, and I wouldn’t be surprised if that is what triggered the need to implement robots.txt support for FediDB.
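      A short sketch of the wildcard point, using Python’s stdlib robots.txt parser: a spec-compliant client applies a bare * group to every crawler, so a bot that honors only rules naming its exact user agent is simply not following the spec. GPTBot appears here only as an example agent string.

      ```python
      from urllib.robotparser import RobotFileParser

      WILDCARD_ONLY = ["User-agent: *", "Disallow: /"]
      EXPLICIT = ["User-agent: GPTBot", "Disallow: /", "",
                  "User-agent: *", "Disallow: /"]

      for name, lines in (("wildcard only", WILDCARD_ONLY), ("explicit", EXPLICIT)):
          rp = RobotFileParser()
          rp.parse(lines)
          allowed = rp.can_fetch("GPTBot", "https://example.com/stats")
          print(f"{name}: GPTBot allowed = {allowed}")

      # Both print False: a compliant parser applies the * group to everyone.
      # The complaint above is about crawlers that ignore the * group unless
      # their own name is spelled out.
      ```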