this post was submitted on 29 Oct 2025
Sorry for the alarming title, but admins, for real: go set up Anubis.

For context, Anubis is essentially a gatekeeper/rate limiter for small services. From their description:

(Anubis) is designed to help protect the small internet from the endless storm of requests that flood in from AI companies. Anubis is as lightweight as possible to ensure that everyone can afford to protect the communities closest to them.

It puts forward a challenge that must be solved before access is granted, and it judges how trustworthy a connection is. The vast majority of real users will never notice it, or will notice only a small delay the first time they access your site. Even smaller scrapers may get by relatively easily.

Big scrapers, though, like AI crawlers and model trainers, get hit with computational problems that waste their compute before they are let in. (Trust me, I worked for a company that did "scrape the internet"; compute is expensive and a constant worry for them, so it's a win-win for us!)
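To see why this wastes scraper compute, here is a minimal proof-of-work sketch in the spirit of what Anubis does (it uses a SHA-256 challenge in the browser; the difficulty value and function names here are illustrative, not Anubis's actual implementation):

```python
import hashlib
import secrets

DIFFICULTY = 4  # leading hex zeroes required; real deployments tune this


def make_challenge():
    """Server hands the client a random challenge string."""
    return secrets.token_hex(16)


def solve(challenge):
    """Client burns CPU finding a nonce whose hash meets the difficulty.
    Cheap for one browser, expensive when multiplied across millions of
    scraped pages."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce
        nonce += 1


def verify(challenge, nonce):
    """Server checks the answer with a single hash."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * DIFFICULTY)
```

The asymmetry is the whole trick: solving takes thousands of hash attempts on average, verifying takes one, so the cost lands almost entirely on the client.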

Anubis ended up taking maybe 10 minutes to set up. For Lemmy hosts, you literally just point your UI proxy at Anubis and point Anubis at the Lemmy UI. Very easy; it slots right in with minimal configuration.
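As a rough sketch of what that proxy change looks like (the port numbers and upstream names here are examples, not Anubis defaults; check the Anubis docs for the real values):

```nginx
# Before: nginx sent browser traffic straight to lemmy-ui.
# After: nginx sends it to Anubis, and Anubis (configured with the
# Lemmy UI as its target) forwards passing clients onward.
server {
    listen 443 ssl;
    server_name lemmy.example.org;  # example hostname

    location / {
        # was: proxy_pass http://lemmy-ui:1234;
        proxy_pass http://anubis:8923;   # hypothetical Anubis port
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

API and federation traffic keep their own proxy rules, which is why (as noted below) only web scrapers are affected.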

These graphs cover the period since I turned it on, less than an hour ago. I have a small instance with only a few people, and my CPU usage and requests per minute have already gone down. Thousands of requests have already been challenged; I had no idea I was being scraped this much! You can see the scrapers backing off in the charts.

(FYI, this only covers web requests, so it does nothing to the API or federation. Those are proxied elsewhere, so it really does target only web scrapers.)

[–] mesamunefire@piefed.social 0 points 1 month ago* (last edited 1 month ago) (1 children)

I created a honeypot that is only reachable by clicking a link labeled "don't click this unless you are a bot". If an IP clicks it three times, poof, that IP gets banned for a day. It's worked well.

It's a simple little Flask app. I have a robots.txt as well, but only Google seems to actually read and respect it.
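The ban-tracking logic behind a honeypot like this could look something like the following (a sketch, not the commenter's actual app; the function names, in-memory storage, and thresholds are my assumptions, with the Flask route wiring omitted):

```python
import time
from collections import defaultdict

MAX_STRIKES = 3                 # honeypot hits before a ban
BAN_SECONDS = 24 * 60 * 60      # ban lasts one day

strikes = defaultdict(int)      # ip -> honeypot hits so far
banned_until = {}               # ip -> unix time the ban expires


def record_honeypot_hit(ip, now=None):
    """Called when an IP requests the hidden 'don't click this' link."""
    now = time.time() if now is None else now
    strikes[ip] += 1
    if strikes[ip] >= MAX_STRIKES:
        banned_until[ip] = now + BAN_SECONDS
        strikes[ip] = 0


def is_banned(ip, now=None):
    """Checked on every request; expired bans are cleaned up lazily."""
    now = time.time() if now is None else now
    expiry = banned_until.get(ip)
    if expiry is None:
        return False
    if now >= expiry:
        del banned_until[ip]
        return False
    return True
```

In a Flask app, `is_banned` would run in a `before_request` hook and `record_honeypot_hit` in the honeypot route; a real deployment would also want persistent storage so bans survive restarts.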

[–] ryannathans@aussie.zone 0 points 1 month ago (1 children)

Lmao genius, until crawlers are LLMs or similar

LLMs are extremely compute-expensive. They will never be used for large-scale web scraping.