Andy Reid to Technology (English) • 10 months ago
AI companies are violating a basic social contract of the web and ignoring robots.txt (www.theverge.com)
1.09K points • 199 comments • cross-posted to: [email protected], [email protected], [email protected]
@BrianTheeBiscuiteer • 55 points • 10 months ago
If it doesn’t get queried, that’s the fault of the web scraper. You don’t need JS built into the robots.txt file either. Just add some line like: here-there-be-dragons.html. Any client that hits that page (and maybe doesn’t pass a captcha check) gets banned. Or even better, they get a long stream of nonsense.
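A minimal sketch of that trap idea in Python: robots.txt disallows a URL no well-behaved crawler would ever fetch, and any client that requests it anyway gets its IP banned. The path, the in-memory ban set, and the function names here are all illustrative, not from any real server.

```python
# Honeypot sketch: a path listed as disallowed in robots.txt acts as a
# tripwire. Only a scraper that ignores robots.txt will ever request it,
# so requesting it is grounds for a ban. All names are hypothetical.

banned_ips: set[str] = set()

TRAP_PATH = "/here-there-be-dragons.html"

def handle_request(ip: str, path: str) -> int:
    """Return an HTTP status code for a request from `ip` for `path`."""
    if ip in banned_ips:
        return 403  # this client already fell into the trap
    if path == TRAP_PATH:
        banned_ips.add(ip)  # tripwire hit: ban this IP from now on
        return 403
    return 200  # normal traffic
```

A polite crawler never sees anything but 200s; a scraper that fetches the trap page gets 403 on that request and every request after it.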
@[email protected] • 24 points • 10 months ago
server {
    server_name herebedragons.example.com;
    root /dev/random;
}
@[email protected] • 16 points • 10 months ago
Nice idea! Better use /dev/urandom though, as that is non-blocking. See here.
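The difference being pointed out: /dev/random could historically block when the kernel's entropy estimate ran low, while /dev/urandom always returns immediately. Python's os.urandom draws from the same non-blocking kernel CSPRNG, which makes the behavior easy to demonstrate:

```python
import os

# os.urandom reads from the kernel's non-blocking CSPRNG (the same
# source as /dev/urandom), so it returns the requested number of bytes
# immediately rather than ever blocking on an entropy estimate.
chunk = os.urandom(16)
assert len(chunk) == 16
```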
Aniki 🌱🌿 • 0 points • 10 months ago
That was really interesting. I always used urandom out of habit and wondered what the difference was.
Aniki 🌱🌿 • 2 points • 10 months ago (edited)
I wonder if Nginx would just load random into memory until the kernel OOM kills it.
@[email protected] • 8 points • 10 months ago
I actually love the data-poisoning approach. I think that sort of strategy is going to be an unfortunately necessary part of the future of the web.