this post was submitted on 27 Dec 2024

TechTakes




Evidence for the DDoS attack that bigtech LLM scrapers actually are.

Monument@lemmy.sdf.org 12 points 1 day ago (last edited 1 day ago)

I’m caught on the other side of the whack-a-mole game. The tools I use at work to check the health of my site - specifically, that links on my site aren’t broken - now report an extremely high false-positive rate, as other sites serve up a whole slew of error messages to the bot that just wants to make sure the link points to a working page.

froztbyte@awful.systems 7 points 1 day ago

I'm not sure I understand your comment - mind elaborating on the details?

Monument@lemmy.sdf.org 13 points 1 day ago (last edited 1 day ago)

Sure!

One of the things I do is monitor my organization’s website to ensure that it’s functional for our visitors.
We have a few hundred web pages, so we use a service to monitor and track how we’re doing. The service is called SiteImprove. They track a number of metrics, such as SEO, accessibility, and of course, broken links. (I couldn’t tell you if the service is ‘good’ - I don’t have a basis for comparison.)

So, SiteImprove uses robots to crawl our website and analyze it for the above stuff. When their robots find a link on our site, they try to follow it. If the destination reports back an error, the error gets logged and put into a report that I review.
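To make that concrete, the core of what a link-checking crawler does can be sketched in a few lines of Python. (This is just a rough illustration of the idea, not SiteImprove’s actual code, and the example URL is made up.)

```python
import requests
from urllib.parse import urljoin
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    """Fetch a page, follow every link on it, and report the failures."""
    page = requests.get(page_url, timeout=10)
    parser = LinkExtractor()
    parser.feed(page.text)

    broken = []
    for href in parser.links:
        url = urljoin(page_url, href)  # resolve relative links
        try:
            resp = requests.head(url, timeout=10, allow_redirects=True)
            if resp.status_code >= 400:
                broken.append((url, resp.status_code))
        except requests.RequestException as exc:
            broken.append((url, f"no response: {exc}"))
    return broken

# Hypothetical page, not one of our real URLs:
for url, status in check_links("https://example.org/some-page"):
    print(f"{url} -> {status}")
```

Everything that lands in `broken` ends up as a line in the report I review.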

Basically, in the last six-ish months, we went from having fewer than 5 false positives a month to having over a hundred every month.
Before, a lot of those false positives were ‘server took too long to respond’ without a corresponding error code - which happens. Sometimes a server goes down, then comes back up by the time I’m looking at the reports. Now, though, a lot of these reports are coming back with HTTP status codes, such as 400: Bad Request, 403: Forbidden, 502: Bad Gateway, or 503: Service Unavailable. I even got a 418 (I’m a teapot) a few months ago, which tickled me pink. It’s my favorite HTTP status (and probably the most appropriate one to roll bots with). Which is to say that instead of a server being down or whatever, a server saw the request and decided to respond in one of the above ways.
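I obviously can’t see what the blocking sites run server-side, but the effect is as if they’re doing something like this - a toy Flask sketch assuming simple User-Agent filtering. (The agent list is invented, and real blockers layer on IP reputation, rate limiting, and so on.)

```python
from flask import Flask, request, abort

app = Flask(__name__)

# Invented signature list -- real blockers use much larger sets.
SUSPECT_AGENTS = ("bot", "crawler", "spider", "scraper")

@app.before_request
def block_bots():
    ua = (request.headers.get("User-Agent") or "").lower()
    if any(token in ua for token in SUSPECT_AGENTS):
        # 403 Forbidden is the common choice; 418 is the cheeky one.
        abort(418)

@app.route("/")
def index():
    return "Hello, human!"
```

A checker bot hitting that server gets a 418 (or 403, or whatever the operator picked) on every request, even though the page itself is perfectly healthy.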

And while I can visit the URL in a browser, the service will repeatedly get these errors when it sends its bots to double-check the link destinations, so I’m reasonably confident it’s the bots getting blocked more aggressively than they were in the past.
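You can reproduce the discrepancy with two requests that differ only in their User-Agent header. (A quick sketch - the URL is a stand-in and both UA strings are just examples, not what SiteImprove actually sends.)

```python
import requests

url = "https://example.org/linked-page"  # stand-in for a real link destination

browser_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/120.0 Safari/537.36")
bot_ua = "SiteCheckBot/1.0"  # hypothetical checker user agent

for label, ua in (("browser", browser_ua), ("bot", bot_ua)):
    resp = requests.get(url, headers={"User-Agent": ua}, timeout=10)
    print(f"{label}: {resp.status_code}")

# On an aggressively filtered site you'd see something like:
#   browser: 200
#   bot: 403
```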

Edit: Approximately 10 minutes after I posted this comment, our CDN blocked the bot, too. Now the service is reporting all internal links as broken as well. So… every link on every page. I guess I’m taking it easy today!