#anubis


Looking at the most recent logs, there are still some scrapers that fall through the cracks and reach my backend. They use residential IPs and user agents I can't filter. Luckily, they don't put much pressure on my infra, but... I don't like them there.

The vast majority of them are interested in my forge, so I'm thinking about deploying #Anubis there, on some URLs that humans don't usually visit.

I don't like proof-of-work things, because they penalize the legit visitor too. But there's a point where passive defenses no longer scale. I can still limit the damage, though: I've got to do some serious log digging to figure out which patterns I can shove behind Anubis.
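For context, the kind of proof-of-work challenge Anubis poses can be sketched roughly like this: the client must find a nonce whose SHA-256 hash of (challenge + nonce) falls below a difficulty target, and the server verifies it with a single hash. This is a minimal illustrative sketch of the general idea, not Anubis's actual code; the function names and difficulty value are made up:

```python
import hashlib
import itertools

def solve_pow(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce),
    read as a 256-bit integer, has difficulty_bits leading zero bits.
    This is the work the visitor's browser has to burn CPU on."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify_pow(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server side: one hash to check the submitted nonce."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# At 8 bits of difficulty the solver needs ~256 attempts on average;
# real deployments use higher difficulties, which is exactly the
# cost that also lands on legitimate visitors.
nonce = solve_pow("example-challenge", 8)
assert verify_pow("example-challenge", nonce, 8)
```

The asymmetry is the point: solving costs many hashes, verifying costs one, and a scraper hitting thousands of URLs pays the solving cost over and over.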

I will also have to figure out how often the same IP address is used. As in... can I set things up in a way that if the Anubis check fails, I temporarily route that IP into a maze?
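One way to approach that "temporarily route into a maze" idea is a small TTL map of failing IPs that the reverse proxy (or a middleware) consults on each request. A hypothetical sketch of the bookkeeping, not an Anubis feature; the class name and default TTL are invented for illustration:

```python
import time

class FailedCheckTracker:
    """Remember IPs that recently failed the challenge, for ttl_seconds.
    A proxy could consult should_maze(ip) to decide whether to route
    the request to the maze instead of the real backend."""

    def __init__(self, ttl_seconds: float = 3600.0):
        self.ttl = ttl_seconds
        self._failed: dict[str, float] = {}  # ip -> expiry timestamp

    def record_failure(self, ip: str) -> None:
        # (Re)start the penalty clock for this IP.
        self._failed[ip] = time.monotonic() + self.ttl

    def should_maze(self, ip: str) -> bool:
        expiry = self._failed.get(ip)
        if expiry is None:
            return False
        if time.monotonic() >= expiry:
            # Penalty expired; forget the IP so the dict doesn't grow forever.
            del self._failed[ip]
            return False
        return True
```

Whether this helps depends on exactly the question above: if each residential IP shows up only once, the TTL map never gets a second request to act on, so the log digging comes first.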


My initial use case for #swad was to stop #AI #bots clogging my DSL upstream, and the #FreeBSD / #poudriere build logs they were downloading by the gigabyte aren't secret at all; on the contrary, it can sometimes be helpful to share them when hunting down build issues in the community. So I wonder whether I should add a module somewhat similar to #anubis[1] for "guest logins"? 🤔 Might be a lot of work, though...

[1] github.com/TecharoHQ/anubis

GitHub - TecharoHQ/anubis: Weighs the soul of incoming HTTP requests using proof-of-work to stop AI crawlers

👀 This morning, while discussing Wikimedia's problems with scraping, a programmer friend told me about the Anubis project github.com/TecharoHQ/anubis/
"It's quite simple and easy to deploy on any halfway serious website; it automatically takes out any scraper (AI-based or otherwise). On top of that, there's nothing they can invent that makes scraping profitable with this in place." #aiscraping #aiscrapers #wikimedia #anubis #iahastaenlaputasopa

Replied to Xe

@cadey My thoughts on #Anubis after encountering it multiple times as a user:
* the mascot is nice, creative, and intuitive to understand
* as a Tor user, it works! Cloudflare and others reject me as a bot, but Anubis let me through, thank you
* onion services don't require Anubis protection, though, right? Since they have their own proof-of-work system integrated by default …
blog.torproject.org/introducin

… Equi-X function based on what Tor uses?
pony.social/@cadey/11423626384

blog.torproject.org - Introducing Proof-of-Work Defense for Onion Services | Tor Project: "Today, we are officially introducing a proof-of-work (PoW) defense for onion services designed to prioritize verified network traffic as a deterrent against denial of service (DoS) attacks with the release of Tor 0.4.8."