this post was submitted on 14 Dec 2025
57 points (95.2% liked)

Selfhosted


cross-posted from: https://discuss.online/post/32165111

I realize my options are limited, but what about any robots.txt style steps? Thanks for any suggestions.
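For context on the robots.txt angle: it is purely advisory, so it only deters crawlers that choose to honor it. A minimal sketch, using a few publicly documented AI crawler user-agent tokens as examples (the list is illustrative, not exhaustive):

```
# robots.txt — advisory only; compliance is voluntary
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```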

all 19 comments
[–] potatopotato@sh.itjust.works 26 points 1 week ago (1 children)

Currently Anubis seems to be the standard for slowing down scrapers

https://github.com/TecharoHQ/anubis

There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they're training. Basically you can be as aggressive as you want. Your site will get scraped and incorporated into someone's model at the end of the day, but you can slow them down and make it hurt.
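A blunter complement to proof-of-work and tarpit tools (this is not Anubis itself, just a plain user-agent filter) is to refuse declared AI crawlers at the reverse proxy. A minimal nginx sketch, assuming the bots identify themselves honestly in the User-Agent header; the file path, backend address, and bot list are examples:

```nginx
# Included in the http context, e.g. /etc/nginx/conf.d/ai-bots.conf
map $http_user_agent $is_ai_bot {
    default      0;
    ~*GPTBot     1;
    ~*ClaudeBot  1;
    ~*CCBot      1;
    ~*Bytespider 1;
}

server {
    listen 80;
    server_name example.org;

    location / {
        # Declared AI crawlers get a 403; bots faking their UA slip through
        if ($is_ai_bot) {
            return 403;
        }
        proxy_pass http://127.0.0.1:8080;  # your actual backend
    }
}
```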

[–] meltedcheese@c.im 9 points 1 week ago

@potatopotato @selfhosted Black Ice exists. Software is hand-to-hand combat. The most #cyberpunk sentence I’ve read today:

“There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they’re training.”

[–] lambalicious 25 points 1 week ago (2 children)

0.- Take it out of the public.

[–] DragonBard@ttrpg.network 3 points 1 week ago

I shut down one of my sites once I realized the massive traffic spike was all AI.

[–] kiol@discuss.online 2 points 1 week ago (1 children)

Right, but if it has to stay public would you simply do nothing?

[–] lambalicious 1 points 1 week ago

Nah. But if what you want is to prevent rather than palliate or delay (AIs will get through Anubis; in fact, from what I read, some of them already do), then pretty much your only option is real-person authentication, so that if stuff does get leaked, you have a discrete list of people to hold accountable.

[–] Auth@lemmy.world 10 points 1 week ago

You could put your website behind a Cloudflare anti-bot check. But realistically, your website is public-facing and these bots are scraping the public web. They will eventually get the data from your website.

[–] TrippyHippyDan@lemmy.world 8 points 1 week ago (1 children)
[–] DaGeek247@fedia.io 2 points 1 week ago

Another one: https://iocaine.madhouse-project.org/

But yeah, OP. You can't reliably stop web scrapers from stealing your data. You can only make it more difficult and costly to do so, at the expense of your own server, and in the case of Anubis, at the expense of your real users.

I plan on switching to an RPi-hosted website at some point, so I can add either iocaine or Nepenthes to my website. Might as well make most of the data from my website poison to all the scrapers when I get the chance.

[–] talkingpumpkin@lemmy.world 6 points 1 week ago
[–] brewery@feddit.uk 4 points 1 week ago (1 children)

Another option to reduce (but not eliminate) this traffic is a country limit. In Cloudflare you can set a manual security rule to do this. There are self-hosted options too, but they're harder to set up. It depends what country you are in and where your users are based. My website is a business one, so I only allow my own country (and if I'm on holiday I might open that country if I need to check it's working, although usually I just use a paid VPN back to my country, so there's no need). You can also block specific countries. So many of my blocked requests are from the USA, China, Russia, etc.
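For the Cloudflare route, the custom rule comes down to a one-line expression plus an action. A sketch (the country code is an example, and the exact field name can vary between versions of Cloudflare's rules language, so check it against their docs):

```
Expression: (ip.geoip.country ne "GB")
Action:     Block   (or Managed Challenge, which is gentler on real visitors)
```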

[–] moftasa@lemmy.ml 1 points 1 week ago* (last edited 1 week ago) (1 children)

Please don't do this. It is incredibly hostile to other people around the world. It is the recreation of borders, but on the internet. Even if your website is a local business, it is still useful for other people around the world to browse, compare things, and get ideas. You never know: maybe people from around the world want to plan ahead before they travel or move, or import your product, or collaborate. Perhaps others from your country want to browse your site while travelling.

[–] brewery@feddit.uk 3 points 1 week ago

In an ideal world this should be the case, but I can't afford to do this practically, and my business is a service, based on UK laws and requirements, available to UK residents only. The website is for information only, and nothing on it is new or interesting to anybody but a few potential clients, and if they're looking at it on holiday, there's something wrong with them! Nobody is going to reach out based on my website from abroad, and if they did, I would not trust them at all. They would reach out through personal contacts or LinkedIn. If the bots stop spamming my site or server, I can stop limiting it.

[–] irmadlad@lemmy.world 3 points 1 week ago

I'm wondering if you could run CrowdSec on the server and manually block the offenders if they are not already in the community blocklists.
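If you go the CrowdSec route, manual bans are single `cscli` calls; the IP, range, and duration below are placeholders:

```bash
# Ban one offending IP for 48 hours
sudo cscli decisions add --ip 203.0.113.42 --duration 48h --reason "AI scraper"

# Or ban a whole range if the traffic comes from one cloud provider's block
sudo cscli decisions add --range 203.0.113.0/24 --duration 48h --reason "AI scraper"

# Review current bans
sudo cscli decisions list
```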

[–] Nephalis@discuss.tchncs.de 2 points 1 week ago (1 children)

Isn't fail2ban a possibility too? I created a filter for ChatGPT and some others, and it feels like it's working. My Radicale server is my only freely accessible service, but it comes with a small web GUI, and so the bots showed up. I have no clue whether the bot gets a fraction of your site each time it shows up, but seemingly the ban happens within 300 ms, if I remember correctly. So it wouldn't be that much information...

When setting the retry limit to 1, it will ban on first sight.
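For anyone wanting to try the same thing, a rough sketch of such a fail2ban filter and jail follows; the file names, log path, and bot list are assumptions to adapt to your own setup:

```ini
# /etc/fail2ban/filter.d/ai-bots.conf
[Definition]
# Match combined-format access-log lines from <HOST> whose User-Agent
# contains one of these crawler tokens
failregex = ^<HOST> .*(?:GPTBot|ClaudeBot|CCBot|Bytespider)
ignoreregex =
```

```ini
# /etc/fail2ban/jail.d/ai-bots.local
[ai-bots]
enabled  = true
port     = http,https
filter   = ai-bots
logpath  = /var/log/nginx/access.log
# maxretry = 1 bans on first sight, as described above
maxretry = 1
bantime  = 1d
```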

[–] JustTesting@lemmy.hogru.ch 2 points 1 week ago (1 children)

A big issue is that this works for bots that announce themselves as such, but there are lots that pretend to be regular users, with fake user agents and IPs selected from a random pool, with each IP only sending like 1-3 requests/day but overall many thousands of requests. In my experience a lot of them are from Huawei and Tencent cloud ASNs.

[–] Nephalis@discuss.tchncs.de 2 points 1 week ago

Yes, if that is true (and I am not that surprised about it), it is nearly impossible to block them this way.

[–] SmokeyDope@piefed.social 0 points 1 week ago

Anubis is your friend