
Servarr Stack

[Stack diagram: Usenet providers (Newshosting, NewsDemon, DrunkenSlug, NZBGeek, NZBFinder, NZBPlanet), indexers/trackers (Nyaa), media managers (Radarr, Sonarr, Sonarr Anime), download clients (Sabnzbd, qBittorrent), utilities (Prowlarr, Flaresolverr, JellyfinScanManager, ForceManualImport, IndexerForceTest). Flow: a user requests media in Seerr, Flaresolverr solves any captcha, the item downloads with *arr progress force-updated every 2s, the import is forced if manual import is needed, a targeted Jellyfin library scan runs, and Seerr sees the item added within ~10s.]

The secret speed sauce here is adding webhooks to everything to construct "callbacks". This lets the media manager -> Jellyfin -> Seerr chain run event-driven instead of waiting on polling timers.

VM Notes

Configure the VMs to restart regularly. The Servarr stack and the download clients all seem to need this, either because of memory leaks or other build-up issues.

VPN Routing

Only route the download clients behind a VPN. Otherwise you're going to have constant dropout issues from the VPN IPs' poor reputation, and possibly extra issues between WireGuard's UDP and the C# web controllers.

I have Sabnzbd and qBittorrent running on the same machine, double-hopped and locked down with the Mullvad app in lockdown mode, and additionally pinned behind an interface-level Mullvad connection in pfSense.

Requests Speedhacking

Note: there are some limiting factors to how "responsive" this can get. Here they are:

  • It takes Servarr a small amount of time to query indexers and sift results before anything begins to download.
  • Sabnzbd caches file pieces locally to speed up stitching, then copies the result to the network share.
  • Jellyfin takes a small amount of time to scan things in.

Force Radarr/Sonarr Progress Refresh

Small note to remember: Radarr and Sonarr both have an API endpoint you can hit to manually refresh downloads. You can abuse this to get finer progress granularity. Seerr has this built in already, so you shouldn't need to do anything there.

curl -X POST "http://RADARR_ADDRESS:7878/api/v3/command/" -H "Content-Type: application/json" -H "X-Api-Key: API_KEY_HERE" -d "{\"name\": \"RefreshMonitoredDownloads\"}"

Servarr -> Jellyfin

Set up webhooks in Servarr apps to tell Jellyfin when to rescan. The configs for this are all in the sections below.

There are two VERY important settings I've recently discovered in Jellyfin's API:

  • LibraryMonitorDelay
  • LibraryUpdateDuration

afaik these can only be updated via the API. I have them saved in Jellarr so they're applied there.
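For reference, a read-modify-write sketch of setting the two values, assuming Jellyfin's /System/Configuration endpoint and the X-Emby-Token auth header (verify both against your Jellyfin version's API docs; the values used below are placeholders to tune, not recommendations):

```python
# Hedged sketch: read-modify-write Jellyfin's server configuration.
# ASSUMPTIONS: the /System/Configuration endpoint, the X-Emby-Token
# header, and the placeholder values below all need verifying/tuning.
import json
import urllib.request

def patch_config(cfg: dict, monitor_delay: int, update_duration: int) -> dict:
    """Return a copy of the server config with the two scan-timing knobs set."""
    patched = dict(cfg)
    patched["LibraryMonitorDelay"] = monitor_delay
    patched["LibraryUpdateDuration"] = update_duration
    return patched

def apply_settings(base_url: str, api_key: str) -> None:
    headers = {"X-Emby-Token": api_key, "Content-Type": "application/json"}
    # GET the full config first so the POST doesn't clobber unrelated fields.
    req = urllib.request.Request(base_url + "/System/Configuration", headers=headers)
    with urllib.request.urlopen(req) as resp:
        cfg = json.load(resp)
    body = json.dumps(patch_config(cfg, 1, 5)).encode()  # placeholder values
    post = urllib.request.Request(base_url + "/System/Configuration",
                                  data=body, headers=headers, method="POST")
    urllib.request.urlopen(post).close()

if __name__ == "__main__":
    apply_settings("http://JELLYFIN_ADDRESS:8096", "API_KEY_HERE")
```

The round trip matters: posting a partial object instead of the patched full config risks resetting everything else to defaults.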

Seerr Fork

I have a fork of Seerr here: https://github.com/alchemyyy/seerr

  • fixed the hardcoded download progress refresh interval
  • fixed the download sync cron job not being second-configurable
  • added a Jellyfin webhook endpoint to receive specific item-added events

The webhook is awesome because it removes the need for Seerr to constantly rescan Jellyfin just to eventually pull in one new request. I still run passive scans for reliability.

Servarr Pipeline Uncloggers

Project: https://github.com/alchemyyy/ServarrPipelineUncloggers

ForceManualImport

99.999% of the time, when something fails to auto-import after download because Radarr/Sonarr can't be absolutely sure it's the right file, it ends up being correct after all. Furthermore, the goal here is total automation so that other users can request things.
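A sketch of the detection half of this idea, assuming Radarr's /api/v3/queue listing and that stuck items report a trackedDownloadState of "importPending" (both the field and the value name are assumptions; check your Radarr/Sonarr version's API docs before relying on them):

```python
# Hedged sketch of ForceManualImport's detection half. ASSUMPTIONS: the
# /api/v3/queue response shape and the "importPending" state value.
import json
import urllib.request

def stuck_records(queue: dict) -> list:
    """Filter queue records that are waiting on a manual import decision."""
    return [r for r in queue.get("records", [])
            if r.get("trackedDownloadState") == "importPending"]

def fetch_queue(base_url: str, api_key: str) -> dict:
    req = urllib.request.Request(base_url + "/api/v3/queue",
                                 headers={"X-Api-Key": api_key})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    queue = fetch_queue("http://RADARR_ADDRESS:7878", "API_KEY_HERE")
    for rec in stuck_records(queue):
        print(rec.get("title"))  # then force-import each of these via the API
```

Each stuck record would then be pushed through the manual-import endpoint; see the project repo for the full version.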

IndexerForceTest

Servarr apps have extreme backoff policies for retesting failed indexers, which leads to long downtime.

JellyfinLibraryScanManager

Instead of waiting for Jellyfin to run a passive scan, we can manually trigger one via webhooks in Servarr apps.

If Jellyfin gets another scan request before it's finished its current one, it cancels the current one and starts over. If a ton of media is being added, this can effectively block anything from getting scanned until all of the media is done downloading.

My solution is an intermediary endpoint that makes all scan requests raise dirty flags on libraries (this is managed within the script). Libraries currently being scanned are not rescanned until they finish.

I also have the Servarr apps emit data about what got added, so we can send a targeted library scan rather than scanning the whole server. Out of laziness to configure stuff, my script detects and maps the "root library folders" from the Servarr apps to Jellyfin. The caveat: if someone has a fucked up nested library structure where two libraries share the same name, the mapping will break.
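The dirty-flag bookkeeping itself is tiny. A minimal sketch of the idea (pure logic, with the actual Jellyfin scan call abstracted into a callback; names here are illustrative, not the script's real API):

```python
# Minimal sketch of the dirty-flag scan manager: incoming requests only
# mark a library dirty, and a library mid-scan is never restarted, just
# re-queued once its current scan finishes.
class ScanManager:
    def __init__(self, trigger_scan):
        self.trigger_scan = trigger_scan  # callback that actually starts a scan
        self.scanning = set()             # libraries with a scan in flight
        self.dirty = set()                # libraries needing another scan

    def request_scan(self, library: str) -> None:
        if library in self.scanning:
            self.dirty.add(library)       # don't cancel; rescan afterwards
        else:
            self.scanning.add(library)
            self.trigger_scan(library)

    def scan_finished(self, library: str) -> None:
        self.scanning.discard(library)
        if library in self.dirty:         # new media landed mid-scan
            self.dirty.discard(library)
            self.request_scan(library)
```

This is what prevents the cancel-and-restart livelock: a burst of webhook hits during a scan collapses into exactly one follow-up scan.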

Servarr Apps

atow I use Radarr, Sonarr, Prowlarr, and Flaresolverr.

https://github.com/alchemyyy/Phlegmarr

REDACTED

Download Clients

Sabnzbd

REDACTED

Simple enough to set up that I won't cover it. Here are some tweaks under "Special" to make stuff faster:

  • downloader_sleep_time 1
  • receive_threads 16
  • direct_unpack_threads 6 (or more)

Providers

I use Newshosting and NewsDemon atow. I have an active NewsgroupNinja account, but I'm not going to renew it since Newshosting is the backend for it. Look up the latest usenet provider map for ideas on who to go with. I picked NewsDemon as a secondary since they're not on the Omicron backend. The Black Friday sale window is the absolute best time to purchase a provider; the sales are very good. Make sure you punch the max number of connections your provider gives you into Sabnzbd so you're running at full speed.

qBittorrent

Browser Extension

Much easier when doing things manually. https://github.com/jgkme/Add-Remote-Torrent is a Chrome extension that connects remotely to qBittorrent, so clicking magnet or torrent links on the web sends them straight off to the client. You must disable CSRF protection under Options -> WebUI -> Security in qBittorrent for this to work.

Download it and get it going. Here's how to set it up properly.

REDACTED

Some manual settings (outdated)

Connection:

  • Uncheck use UPnP
  • Global max conn 999
  • Max conn per torr 999

BitTorrent

  • Disable Local Peer Discovery
  • Enable anonymous mode

Advanced

  • Memory priority: normal
  • Refresh interval: 250
  • Async I/O threads: 20
  • File pool: 1000
  • Outstanding memory: 1024MB
  • Coalesce reads & writes: checked
  • Set Network interface to THE VPN ADAPTER INTERFACE. You can leave the optional IP bind at "all".

WebUI

  • Enable it and set authentication to whatever you want.

Qobuz/Tidal downloaders

tidal-dl-ng, grabbed from Radicle since it got banned from GitHub. Done.