• I am denied read-only access to some websites because I use a VPN. This makes no sense at all, but it happens anyway.
  • I am not allowed to register in some forums because I use a VPN. Because everyone knows that anyone who uses a VPN is a serious criminal. There is no other option.
  • I am subsequently banned from forums because the moderators realise that my IP address is not unique, since I use a VPN. My posts don’t matter at all; IP addresses obviously identify every person on this planet unambiguously.
  • I’m supposed to confirm that I’m not a robot because I use a VPN. The fact that the company asking for these confirmations (usually Google) is itself sending robots marauding through the internet doesn’t matter, because Google is Google and I’m just a bloke with a VPN.

Guys, a VPN is self-defence. A website banning VPNs is like a brothel banning condoms. I mean, of course the house rules apply, but I’d like to see a bit more judgement. What’s happening right now is ridiculous and hardly does justice to the security aspect of these “tests”. If you find yourself contributing to this list, I urge you to stop. I am not a bad guy. All I do is use a VPN.

Thank you.

  • rhabarbaOP · 3 points · 10 months ago

    How does denying read access to static content defend a website?

    • @Rossphorus · 10 points · 10 months ago

      Topical answer: bots going around scraping content to feed into some LLM dataset without consent. If the website is anything like Reddit, they’ll be trying to monetise bot access to their content without affecting regular users.

      • rhabarbaOP · -2 points · 10 months ago

        It should be easy to distinguish a bot from a real user though, shouldn’t it?

        • @damnthefilibuster · 10 points · 10 months ago

          Nope. It gets more difficult every single day. It used to be easy - just check the user agent string. Real users would have a long one that says what browser they’re using. Bots wouldn’t have one, or would have one that mentions the underlying scraping library they’re using.

          But then bot makers wised up. Now they just copy the latest browser user agent string (a rough sketch of both sides of that game is at the end of this comment).

          Used to be that you could use mouse cursor movement to create heat maps and figure out if it’s a real user. Then some smart Alec went and created a basic script to copy his cursor movement and broke that.

          Oh, and then someone created a machine learning model to learn that behavior too and broke that even more.
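
          Roughly, both sides of that user-agent game look like the sketch below (plain Python; the header check and the copied Chrome string are made up for illustration, not any real site’s rules):

              # Naive server-side heuristic: flag missing user agents or ones that
              # name a well-known scraping library.
              SCRAPER_HINTS = ("python-requests", "curl", "scrapy", "wget")

              def looks_like_bot(user_agent):
                  if not user_agent:
                      return True
                  return any(hint in user_agent.lower() for hint in SCRAPER_HINTS)

              # An "honest" bot announces itself and gets flagged...
              print(looks_like_bot("python-requests/2.31.0"))  # True

              # ...while one that copies a current browser string sails straight through.
              BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
                            "(KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36")
              print(looks_like_bot(BROWSER_UA))  # False

              # The bot just sends that string with every request, e.g. with the
              # requests library (placeholder URL):
              # requests.get("https://example.com/page", headers={"User-Agent": BROWSER_UA})

          Anything smarter than a string check on the server side is where the arms race described above starts.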

        • @Rossphorus · 3 points · 10 months ago

          Unfortunately not. The major difference between an honest bot and a regular user is a single text string (the user agent). There’s no reason that bots have to be honest, though, and anyone can modify their user agent. You can go further and use something like Selenium to make your bot appear even more like a regular user, including random human-like mouse movements (a rough sketch is below). There’s also a plethora of tools to fool captchas now. It’s getting harder by the day to differentiate.
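
          A minimal sketch of that kind of evasion, assuming Python and Selenium driving Chrome (the URL, the copied user agent string, and the movement pattern are placeholders, not anything a particular site checks for):

              import random
              import time

              from selenium import webdriver
              from selenium.webdriver.chrome.options import Options
              from selenium.webdriver.common.action_chains import ActionChains

              # Override the user agent with a copied, current browser string.
              options = Options()
              options.add_argument(
                  "--user-agent=Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
                  "(KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36"
              )
              driver = webdriver.Chrome(options=options)
              driver.get("https://example.com/")  # placeholder URL

              # Wander the cursor in small, randomly timed steps so a movement heat map
              # looks more like a person than a script jumping straight to its target.
              actions = ActionChains(driver)
              actions.move_by_offset(200, 200)  # start away from the viewport edge
              for _ in range(20):
                  actions.move_by_offset(random.randint(-5, 10), random.randint(-5, 10))
                  actions.pause(random.uniform(0.05, 0.3))
              actions.perform()

              time.sleep(random.uniform(1.0, 3.0))  # linger the way a reader would
              driver.quit()

          It’s the same tooling that legitimate browser test automation uses, which is a big part of why blocking it outright is so hard.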