• @ColinHayhurst

    You should put these entries into your robots.txt file.

    To block the Google search crawler (Googlebot) across your entire site, use:

    User-agent: Googlebot
    Disallow: /

    To block Google's AI crawler (the token Google uses for AI training), use:

    User-agent: Google-Extended
    Disallow: /
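
    If you want to block both, the two groups can live in the same robots.txt at your site root. A minimal combined sketch (the directives are the same as above; the # lines are just comments):

    # Block Google's search crawler from the whole site
    User-agent: Googlebot
    Disallow: /

    # Block Google's AI-training crawler token
    User-agent: Google-Extended
    Disallow: /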

    • Jamyang

      What if I made a static site hosted on GitHub Pages? Will having a robots.txt in my root folder ward off the Google bhoots (ghosts)?