submitted 6 months ago (last edited 6 months ago) by Deckweiss@lemmy.world to c/linux@lemmy.ml
 

I was reading the Reddit thread about Claude AI crawlers effectively DDoSing the Linux Mint forums (https://libreddit.lunar.icu/r/linux/comments/1ceco4f/claude_ai_name_and_shame/), and I wanted to block all AI crawlers from my self-hosted stuff.

I don't trust crawlers to respect a robots.txt, but you can generate one here: https://darkvisitors.com/
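
A generated robots.txt for these bots looks roughly like this (an abbreviated sketch; GPTBot, ClaudeBot and CCBot are the published user-agent tokens of OpenAI, Anthropic and Common Crawl, and the full list from darkvisitors is much longer):

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```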

Since I use Caddy as my server, I wrote a snippet that blocks them based on their user agent. The contents of the regex basically come from darkvisitors.

Sidenote: there is also a Caddy module for blocking crawlers, but it seemed like overkill for me: https://github.com/Xumeiquer/nobots

For anybody who is interested, here is the block_ai_crawlers.conf I wrote.

(blockAiCrawlers) {
  # Named matcher: any request whose User-Agent contains one of the
  # known AI crawler names (case-insensitive).
  @blockAiCrawlers {
    header_regexp User-Agent "(?i)(Bytespider|CCBot|Diffbot|FacebookBot|Google-Extended|GPTBot|omgili|anthropic-ai|Claude-Web|ClaudeBot|cohere-ai)"
  }
  # Close the connection without sending a response.
  handle @blockAiCrawlers {
    abort
  }
}

# Usage:
# 1. Place this file next to your Caddyfile
# 2. Edit your Caddyfile as in the example below
#
# ```
# import block_ai_crawlers.conf
#
# www.mywebsite.com {
#   import blockAiCrawlers
#   reverse_proxy * localhost:3000
# }
# ```
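
To check that the block is active, you can send a request with one of the matched user agents; since the snippet uses abort, Caddy should close the connection without sending a response, while a normal request still goes through (www.mywebsite.com is just the placeholder host from the usage example above):

```
# Should be blocked: connection closed, no HTTP response
curl -I -A "GPTBot" https://www.mywebsite.com

# Should behave normally
curl -I https://www.mywebsite.com
```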
top 7 comments
[–] arisunz@lemmy.blahaj.zone 8 points 6 months ago (5 children)
[–] winnie@lemmy.ml 3 points 6 months ago

Suggestion at the end:

  <a class="boom" href="https://boom.arielaw.ar">hehe</a>

Wouldn't it also destroy GoogleBot (and other search engines' crawlers), getting your site delisted from Search?

[–] JustARegularNerd@aussie.zone 2 points 6 months ago

I just want you to know that was an amazing read. I was actually thinking, "It gets worse? Oh, it does. Oh, IT GETS EVEN WORSE?"

[–] pvq@lemmy.ml 1 points 6 months ago

I really like your site's color scheme, fonts, and overall aesthetics. Very nice!

[–] jkrtn@lemmy.ml 1 points 6 months ago

This is one of the best things I've ever read.

I'd love to see a robots.txt do a couple of safe listings, then a zip bomb, then a safe listing. It would be fun to see how many log entries from an IP look like: get a, get b, get zip bomb... no more requests.

[–] ABasilPlant@lemmy.world 1 points 6 months ago* (last edited 6 months ago)

In dark mode, the anchor tags are difficult to read. They're dark blue on a dark background. Perhaps consider something with a much higher contrast?

[Screenshot: a website with a dark purple background and dark blue links.]

Apart from that, nice idea - I'm going to deploy the zip bomb today!

[–] hollyberries@programming.dev 1 points 6 months ago

I'm a fan of hellpotting them.