this post was submitted on 27 Dec 2024
84 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Evidence for the DDoS attack that bigtech LLM scrapers actually are.

[–] Jayjader@jlai.lu 19 points 2 days ago (2 children)

How feasible is it to configure my server to essentially perform a reverse slowloris attack on these LLM bots?

If they won't play nice, then we need to reflect their behavior back onto themselves.

Or perhaps serve a 404, 304, or some other legitimate-looking static response that minimizes load on my server whilst giving them the least amount of data to train on.
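A minimal sketch of that idea in Python, assuming you match suspected scrapers by User-Agent substring (the list below is illustrative guesswork, not a vetted blocklist):

```python
# Answer suspected scraper user agents with a cheap, static 404
# instead of rendering a real page. The UA substrings here are
# hypothetical examples, not an authoritative list.
SUSPECT_UA_SUBSTRINGS = ("GPTBot", "CCBot", "Bytespider", "ClaudeBot")

def response_for(user_agent: str) -> tuple[int, bytes]:
    """Return (status_code, body) for a request with this User-Agent."""
    ua = user_agent.lower()
    if any(s.lower() in ua for s in SUSPECT_UA_SUBSTRINGS):
        return 404, b""  # minimal server load, nothing to train on
    return 200, b"<html>real page</html>"
```

As the article points out, this only works until the bot switches to a browser-like UA string.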

[–] raoul@lemmy.sdf.org 17 points 2 days ago* (last edited 1 day ago) (1 children)

The only simple possible ways are:

  • robots.txt
  • rate limiting by IP
  • blocking by user agent
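The rate-limiting option can be sketched as a sliding-window counter keyed by IP (parameters are hypothetical; Python stdlib only):

```python
import time
from collections import defaultdict, deque
from typing import Optional

# Per-IP rate limiting: allow at most `limit` requests per `window`
# seconds from each address. Scrapers that rotate IPs, as described
# in the article, will sail right past this.
class RateLimiter:
    def __init__(self, limit: int = 10, window: float = 60.0):
        self.limit = limit
        self.window = window
        self.hits: dict[str, deque] = defaultdict(deque)

    def allow(self, ip: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        while q and now - q[0] > self.window:  # drop stale timestamps
            q.popleft()
        if len(q) >= self.limit:
            return False
        q.append(now)
        return True
```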

From the article, they try to bypass all of them:

They also don't give a single flying fuck about robots.txt ...

If you try to rate-limit them, they'll just switch to other IPs all the time. If you try to block them by User Agent string, they'll just switch to a non-bot UA string (no, really). This is literally a DDoS on the entire internet.

It then becomes a game of whack-a-mole with big tech 😓

~~The most infuriating part for me is that it's being done by the big names, not some random startup.~~ Edit: Now that I think about it, this doesn't prove it's done by Google or Amazon: it could be someone spoofing random popular user agents

[–] jherazob@fedia.io 5 points 1 day ago

I do believe there are blocklists for their IPs out there; that should mitigate things a little
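A minimal sketch of checking requests against such a blocklist, using Python's `ipaddress` module (the CIDR ranges below are documentation placeholders, not real bot netblocks):

```python
from ipaddress import ip_address, ip_network

# Hypothetical blocklist of crawler netblocks; a real one would be
# loaded from a published list and refreshed periodically.
BLOCKLIST = [ip_network(n) for n in ("192.0.2.0/24", "198.51.100.0/24")]

def is_blocked(ip: str) -> bool:
    """True if the client IP falls inside any blocklisted network."""
    addr = ip_address(ip)
    return any(addr in net for net in BLOCKLIST)
```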

[–] raoul@lemmy.sdf.org 15 points 2 days ago* (last edited 1 day ago) (1 children)

A possible way to game these kinds of bots is to add a hidden link to a randomly generated page, which itself contains a link to another random page, and so on. The bots will still consume resources but will be stuck parsing random garbage indefinitely.

~~I know there is a website that is doing that, but I forget its name.~~

Edit: This is not the one I had in mind, but I find https://www.fleiner.com/bots/ to be a good honeypot.
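The link-maze trap described above can be sketched like this: each page's content is seeded from its path, so every URL is stable across requests but the graph of links never ends (route and word list are hypothetical):

```python
import hashlib
import random

# Infinite link maze: every path yields a page of deterministic
# filler text plus links to further random paths. Seeding the RNG
# from the path keeps each page stable while the maze never ends.
WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur"]

def maze_page(path: str, n_links: int = 5) -> str:
    rng = random.Random(hashlib.sha256(path.encode()).digest())
    body = " ".join(rng.choice(WORDS) for _ in range(200))
    links = "".join(
        f'<a href="/maze/{rng.getrandbits(64):016x}">more</a>'
        for _ in range(n_links)
    )
    return f"<html><body><p>{body}</p>{links}</body></html>"
```

Served behind a hidden link that robots.txt forbids, only crawlers that ignore the rules ever wander in.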