• Eager Eagle · 16 points · 11 months ago

    Or don’t do anything. There are plenty of crawlers out there and disallowing won’t stop the unethical ones.
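    For context, the "disallowing" being discussed is a robots.txt entry. A minimal sketch that asks OpenAI's crawler not to index anything (GPTBot is the user-agent token OpenAI documents; CCBot is Common Crawl's) might look like:

    ```
    # robots.txt — advisory only; well-behaved crawlers honor it,
    # unethical ones simply ignore it.
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /
    ```

    Note this is purely a request, which is exactly the point being made: nothing in the protocol enforces it.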

  • Voyajer · 13 points · 11 months ago

    “Please label all of your interesting text so we can flag it with our webcrawler to train on later.”

  • WasPentalive · 9 points · 11 months ago

    Is there some way you could have your web server log who scrapes the site? If you disallow ChatGPT and still find that it has scraped your site would you have cause to sue? @legaleagle (or anyone else too)
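    Yes, in principle: any crawler that identifies itself shows up in the server's access log. A rough sketch of scanning a log in the common Apache/nginx "combined" format for self-identified AI crawlers (the sample log line is made up; the user-agent tokens are the ones OpenAI and Common Crawl publish):

    ```python
    # Sketch: scan web server access-log lines (combined log format) for
    # requests from known AI-crawler user agents.
    import re

    # Hypothetical sample line; a real log would have many of these.
    SAMPLE_LOG = (
        '20.15.240.64 - - [12/Aug/2023:10:15:32 +0000] "GET /post/42 HTTP/1.1" '
        '200 5123 "-" "Mozilla/5.0; compatible; GPTBot/1.0; '
        '+https://openai.com/gptbot"'
    )

    AI_CRAWLERS = ("GPTBot", "CCBot", "Google-Extended")

    def crawler_hits(log_lines):
        """Yield (client_ip, user_agent) for lines matching a known crawler."""
        # In combined format the user agent is the last quoted field.
        pattern = re.compile(r'^(\S+) .* "([^"]*)"$')
        for line in log_lines:
            m = pattern.match(line.strip())
            if m and any(bot in m.group(2) for bot in AI_CRAWLERS):
                yield m.group(1), m.group(2)

    hits = list(crawler_hits([SAMPLE_LOG]))
    print(hits)
    ```

    The catch, as noted below, is that this only catches crawlers honest enough to announce themselves; whether a logged violation of robots.txt gives you standing to sue is the open legal question being asked.
    
    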

    • Cyclohexane · 8 points · 11 months ago

      It’s gotta be pretty difficult to differentiate human users from bots. If it was easy, you could prevent bots from loading the page altogether.

  • HousePanther · 7 points · 11 months ago

    I’m going to do that tomorrow for my blog site. There’s no way I am letting ChatGPT crawl my shit.

  • ExpensiveConstant · 6 points · 11 months ago

    I mean, you can add their user agent to the robots file, but the crawler could just change its user agent, or even ignore the robots file entirely if the server isn’t filtering requests by user agent.
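    The server-side filtering mentioned here could be sketched in nginx roughly like this (assuming you want to reject GPTBot outright; `$http_user_agent` is nginx's variable for the request's User-Agent header):

    ```nginx
    # Inside a server { } block: return 403 to any request whose
    # User-Agent mentions GPTBot (case-insensitive match).
    if ($http_user_agent ~* "gptbot") {
        return 403;
    }
    ```

    Of course this has the same weakness as robots.txt: it only blocks crawlers that keep identifying themselves honestly.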