submitted 2 weeks ago by dessalines@lemmy.ml to c/lemmy@lemmy.ml

I've recently added anubis to lemmy.ml, and it seems to be working well.

I have a PR to add anubis to lemmy-ansible (our main installation method), and I could use some help tweaking / optimizing its botPolicy.yaml config, for federated services.

Anyone with experience running anubis, this would be much appreciated.
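
For context, a federation-friendly policy mostly comes down to ALLOW rules that let ActivityPub server-to-server traffic skip the challenge. A minimal sketch of what that could look like in botPolicy.yaml (rule names and exact regexes are illustrative, not tested against lemmy.ml):

```yaml
bots:
  # sketch: let ActivityPub s2s requests through based on their Accept header
  - name: allow-activitypub
    headers_regex:
      Accept: application/(activity|ld)\+json
    action: ALLOW

  # sketch: discovery endpoints that remote servers fetch without AP headers
  - name: allow-wellknown
    path_regex: ^/(nodeinfo|\.well-known/)
    action: ALLOW
```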

[-] gandalf_der_12te@discuss.tchncs.de 3 points 16 hours ago

i think feddit.org and lemmy.dbzer0.com both use it

in feddit.org's case, the anubis loading screen displays for waaay too long. i've told the feddit.org admins repeatedly but got no response.

i'm not sure whether lemmy.dbzer0.com still uses it, but i think i remember seeing the loading screen there too. maybe they just reduced the loading time so much that i can't see it anymore.
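
For admins tuning that: the visible delay is mostly the proof-of-work difficulty, which is set per rule in botPolicy.yaml. A sketch of the relevant knob (values illustrative, not feddit.org's actual config):

```yaml
    action: CHALLENGE
    challenge:
      difficulty: 2   # lower difficulty = the interstitial solves faster
      algorithm: fast # "fast" is cheaper for clients than "slow"
```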

[-] otter@lemmy.ca 11 points 2 weeks ago

https://lemmy.nz/ and https://quokk.au/ are running Anubis, and so those admins may be able to offer insight :)

[-] bjoern_tantau@swg-empire.de 10 points 2 weeks ago* (last edited 2 weeks ago)

I regularly encounter images not loading from quokk.au. No idea if they've got that under control now, but that is the most visible issue every instance fights with. Gonna be great when we have a recommended configuration for Lemmy.

[-] dessalines@lemmy.ml 7 points 2 weeks ago

Yep, essentially the botPolicy.yaml there could be a collectively developed anubis config, based on what works best.

[-] Alabaster_Mango@lemmy.ca 3 points 2 weeks ago

Do we have an equivalent service on lemmy.ca? (I don't know anything about net security and am just curious)

[-] otter@lemmy.ca 4 points 2 weeks ago

We are not running Anubis, although we do block a large number of AI/LLM companies through IP addresses. Each time we block a new one, it makes a noticeable difference in the performance graphs.
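
At the reverse-proxy level that kind of IP blocking can be a simple deny list; a sketch for nginx, with placeholder ranges (real crawler CIDRs have to come from the operators' published lists):

```nginx
# inside the http or server block; example ranges only, not real crawler IPs
deny 203.0.113.0/24;
deny 198.51.100.0/24;
```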

[-] julian@activitypub.space 5 points 2 weeks ago* (last edited 2 weeks ago)

Sure. I have found that the default botPolicy works fine for blocking the AI bots, but blocks federation.

At the reverse proxy level:

if ($request_method = POST) {
    proxy_pass http://nodebb; 
}

Because Anubis can't filter by HTTP method, unless I am mistaken. This just broadly allows all incoming activities. If you want to get specific, limit it to your shared inbox or individual user inboxes via regular expression or something. I didn't find that it was necessary.
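
A sketch of that narrower variant, with hypothetical inbox paths (adjust the regex to whatever inbox routes your software actually exposes):

```nginx
# only bypass Anubis for POSTs to inbox endpoints
location ~ ^/(inbox|users/[^/]+/inbox)$ {
    if ($request_method = POST) {
        proxy_pass http://nodebb;
    }
}
```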

As for botPolicies.yaml

  # Allow /inbox
  - name: allow-ap-headers
    headers_regex:
      # duplicate YAML keys are invalid, so both Accept variants go in one regex
      Accept: application/(ld\+json; profile="https://www\.w3\.org/ns/activitystreams"|activity\+json)
    action: ALLOW

  - name: allow-assets
    path_regex: /assets
    action: ALLOW

The former allows requests carrying those specific AP headers (it is naive; some AP implementations send slight variations of those two headers).

The latter allows our uploads.

[-] dessalines@lemmy.ml 4 points 2 weeks ago

Lemmy has a separate UI and backend hosted on different ports, so it's trivial for us to put anubis in front of only the front end. We couldn't put it in front of everything because of apps, too.
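
A sketch of that split at the nginx level, with the ports and Anubis bind address as assumptions (Lemmy's defaults vary per install):

```nginx
# UI traffic goes through Anubis, whose TARGET points at lemmy-ui
location / {
    proxy_pass http://127.0.0.1:8923;  # assumed Anubis bind
}

# API and federation traffic skips Anubis entirely
location ~ ^/(api|inbox|nodeinfo|\.well-known) {
    proxy_pass http://127.0.0.1:8536;  # assumed lemmy backend port
}
```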

[-] adam@doomscroll.n8e.dev 5 points 2 weeks ago

It would be nice to have a much more aggressive anti-bot stance for communities/content that aren't local. If google or any other crawler wants to crawl c/lemmy@lemmy.ml then it should do it on the source instance. Doing it on mine makes no sense.

[-] bjoern_tantau@swg-empire.de 1 points 2 weeks ago

Well behaved bots should see the origin in the headers and only crawl those.

[-] ex_06@slrpnk.net 3 points 2 weeks ago
[-] poVoq@slrpnk.net 6 points 2 weeks ago

This is the botPolicy.yaml that we use on slrpnk.net :

bots:
  - name: known-crawler
    action: CHALLENGE
    expression:
      # https://anubis.techaro.lol/docs/admin/configuration/expressions
      all:
        # Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36
        - userAgent.contains("Macintosh; Intel Mac") && userAgent.contains("Chrome/125.0.0.0") # very old chrome?
        - missingHeader(headers, "Sec-Ch-Ua") # a valid chrome has this header
    challenge:
      difficulty: 6
      algorithm: slow

    # Assert behaviour that only genuine browsers display.
    # This catches Chrome or Firefox user agents that don't send the headers real browsers do.
  - name: realistic-browser-catchall
    expression:
      all:
        - '"User-Agent" in headers'
        - '( userAgent.contains("Firefox") ) || ( userAgent.contains("Chrome") ) || ( userAgent.contains("Safari") )'
        - '"Accept" in headers'
        - '"Sec-Fetch-Dest" in headers'
        - '"Sec-Fetch-Mode" in headers'
        - '"Sec-Fetch-Site" in headers'
        - '"Accept-Encoding" in headers'
        - '( headers["Accept-Encoding"].contains("zstd") || headers["Accept-Encoding"].contains("br") )'
        - '"Accept-Language" in headers'
    action: CHALLENGE
    challenge:
      difficulty: 2
      algorithm: fast

  - name: generic-browser
    user_agent_regex: (?i:mozilla|opera)
    action: CHALLENGE
    challenge:
      difficulty: 4
      algorithm: fast

status_codes:
  CHALLENGE: 202
  DENY: 406

dnsbl: false

#store:
#  backend: valkey
#  parameters:
#    url: redis://valkey-primary:6379/0

I think I just adopted it from Codeberg.org, back when they still used Anubis. Nothing really relevant to Lemmy specifically, and it is only in front of the frontends, not the s2s federation API.

It seems though like there are some crawlers that use 3rd party hosted alternative frontends to crawl (unintentionally?) through the federation API, so something in front of that would be useful I guess.

[-] olof@lemmy.ml 2 points 2 weeks ago

Not Lemmy specific, but I wanted to set up Anubis where I have one reverse proxy (nginx) handling many different domains. Last time I looked, it seemed to need one Anubis instance per domain. Is that still the case? The goal was to have a single Anubis instance and route everything through it.

[-] dessalines@lemmy.ml 2 points 2 weeks ago

I'm not an expert, but I think the fact that you need to set a TARGET in anubis (i.e. where anubis sends you after you pass the challenge) means that you do need a separate anubis for each site.
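
If that's right, the multi-site setup ends up as one Anubis process per upstream; a docker-compose sketch, with the image tag, bind ports, and upstream names all being assumptions:

```yaml
services:
  anubis-site-a:
    image: ghcr.io/techarohq/anubis:latest
    environment:
      BIND: ":8923"
      TARGET: "http://site-a-upstream:80"
  anubis-site-b:
    image: ghcr.io/techarohq/anubis:latest
    environment:
      BIND: ":8924"
      TARGET: "http://site-b-upstream:80"
```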

[-] poVoq@slrpnk.net 2 points 2 weeks ago* (last edited 2 weeks ago)

You could probably put Anubis in front of your reverse-proxy, but then you need something else in front of it that handles TLS certificates. So maybe something like this: HAProxy->Anubis->Nginx.
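
A rough sketch of that chain in HAProxy, with the cert path, hostnames, and backend addresses as placeholders:

```haproxy
frontend https-in
    bind :443 ssl crt /etc/haproxy/certs/
    mode http
    # pick an Anubis instance per hostname
    use_backend anubis_a if { hdr(host) -i site-a.example }
    default_backend anubis_b

backend anubis_a
    mode http
    server a1 127.0.0.1:8923

backend anubis_b
    mode http
    server b1 127.0.0.1:8924
```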

this post was submitted on 09 Apr 2026
34 points (100.0% liked)

Lemmy


Everything about Lemmy; bugs, gripes, praises, and advocacy.

For discussion about the lemmy.ml instance, go to !meta@lemmy.ml.

founded 6 years ago