• Voyajer · 13 points · 1 year ago

    “Please label all of your interesting text so we can flag it with our webcrawler to train on later.”

  • Eager Eagle · 12 points · 1 year ago

    Or don’t do anything. There are plenty of crawlers out there and disallowing won’t stop the unethical ones.

    • @[email protected]
      link
      fedilink
      English
      221 year ago

      Just because some people might break into my house doesn’t mean I’ll stop locking my doors.

      • Eager Eagle · 6 points · 1 year ago

        That doesn’t lock anything; it’s not a security feature.

        • Arakwar · 4 points · 1 year ago

          A house door lock isn’t that much about security either.

          • @[email protected]
            link
            fedilink
            English
            41 year ago

            It’s a deterrent. Which is a pretty apt comparison for robots.txt and user agent blocking.

        • @5BC2E7 · 0 points · 1 year ago

          That is the point. They don’t need to be secure to work as a deterrent.

  • HousePanther · 7 points · 1 year ago

    I’m going to do that tomorrow for my blog site. There’s no way I am letting ChatGPT crawl my shit.
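
    For anyone doing the same, a minimal robots.txt (served at the site root as /robots.txt) along these lines targets GPTBot, the user agent OpenAI documents for its crawler. The blanket Disallow and the extra ChatGPT-User entry are just one possible setup, and, as noted above, it only deters crawlers that choose to honor it:

        # Ask OpenAI's training crawler to stay out of everything
        User-agent: GPTBot
        Disallow: /

        # Optionally block the user-initiated ChatGPT browsing agent too
        User-agent: ChatGPT-User
        Disallow: /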

  • WasPentalive · 6 points · 1 year ago

    Is there some way you could have your web server log who scrapes the site? If you disallow ChatGPT and still find that it has scraped your site, would you have cause to sue? @legaleagle (or anyone else too). A rough log-filtering sketch follows at the end of this thread.

    • Cyclohexane · 5 points · 1 year ago

      It’s gotta be pretty difficult to differentiate human users from bots. If it was easy, you could prevent bots from loading the page altogether.
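
    On WasPentalive's logging question: most web servers already record the User-Agent header in their access logs, so a rough, stdlib-only sketch like the one below can tally hits from self-identified AI crawlers. The log path and the agent list are illustrative assumptions, and, as Cyclohexane points out, it only catches bots that identify themselves honestly:

        #!/usr/bin/env python3
        # Count requests from self-identified AI crawlers in a combined-format access log.
        from collections import Counter

        LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust for your server
        AI_AGENTS = ("GPTBot", "ChatGPT-User", "CCBot", "Google-Extended")  # illustrative list

        hits = Counter()
        with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
            for line in log:
                # In the common "combined" log format, the user agent is the last quoted field.
                agent = line.rsplit('"', 2)[-2] if line.count('"') >= 2 else ""
                for bot in AI_AGENTS:
                    if bot in agent:
                        hits[bot] += 1

        for bot, count in hits.most_common():
            print(f"{bot}: {count} requests")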

  • ExpensiveConstant · 5 points · 1 year ago

    I mean, you can add their user agent to the robots file, but the crawler could just change its user agent, or even ignore the robots file entirely if the server isn’t filtering requests by user agent.
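
    For what it's worth, that kind of server-side filtering is easy to sketch; here is a stdlib-only WSGI example (the blocked-agent list is purely illustrative, and a crawler that spoofs its User-Agent sails straight through, which is exactly the point being made):

        # Minimal WSGI app that refuses requests from known AI-crawler user agents.
        # Real deployments usually do this in nginx/Apache config or at a CDN instead.
        from wsgiref.simple_server import make_server

        BLOCKED_AGENTS = ("GPTBot", "ChatGPT-User")  # illustrative list

        def app(environ, start_response):
            agent = environ.get("HTTP_USER_AGENT", "")
            if any(bot in agent for bot in BLOCKED_AGENTS):
                # Refuse outright instead of merely asking nicely via robots.txt.
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"Crawling not permitted.\n"]
            start_response("200 OK", [("Content-Type", "text/plain")])
            return [b"Hello, human (or well-disguised bot).\n"]

        if __name__ == "__main__":
            make_server("", 8000, app).serve_forever()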