Self-Hosting Bitmagnet with Docker Compose
What Is Bitmagnet?
Bitmagnet is a self-hosted BitTorrent DHT crawler and search engine. It continuously discovers torrents from the distributed hash table network, classifies them by content type (movies, TV, music, software), and provides a Torznab-compatible API that integrates directly with Sonarr, Radarr, and other *arr stack applications. Think of it as your own private torrent index — no trackers, no accounts, no third-party dependency.
Updated March 2026: Verified with latest Docker images and configurations.
- Official site: bitmagnet.io
- Source code: github.com/bitmagnet-io/bitmagnet
- License: MIT
Prerequisites
- A Linux server (Ubuntu 22.04+ recommended)
- Docker and Docker Compose installed
- 2 GB of free RAM (minimum — 4 GB recommended for large indexes)
- 20 GB of free disk space (DHT metadata grows continuously)
- Port 3334 open for DHT communication (TCP and UDP)
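A quick preflight sketch to confirm the host meets the RAM and disk minimums above (assumes a typical Linux host where free and df are available):

```shell
# Preflight check: report available RAM and free disk space on this host.
free -h | awk '/^Mem:/ {print "RAM available:", $7}'
df -h . | awk 'NR==2 {print "Disk free on this volume:", $4}'
```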
Docker Compose Configuration
Create a docker-compose.yml file:
services:
  bitmagnet:
    image: ghcr.io/bitmagnet-io/bitmagnet:v0.10.0
    container_name: bitmagnet
    ports:
      - "3333:3333" # Web UI and API
      - "3334:3334/tcp" # BitTorrent DHT (TCP)
      - "3334:3334/udp" # BitTorrent DHT (UDP)
    environment:
      - POSTGRES_HOST=db
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_DB=bitmagnet
    depends_on:
      db:
        condition: service_healthy
    command:
      - worker
      - run
      - --keys=http_server
      - --keys=queue_server
      - --keys=dht_crawler
    restart: unless-stopped

  db:
    image: postgres:16-alpine
    container_name: bitmagnet-db
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: bitmagnet
    shm_size: 1g
    volumes:
      - bitmagnet-db:/var/lib/postgresql/data
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  bitmagnet-db:
Create a .env file alongside:
# PostgreSQL password — generate with: openssl rand -hex 16
POSTGRES_PASSWORD=change_me_to_a_strong_password
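You can generate the .env file in one step; this sketch uses the openssl command from the comment above:

```shell
# Write a .env file with a random 32-character hex password.
echo "POSTGRES_PASSWORD=$(openssl rand -hex 16)" > .env
cat .env
```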
Start the stack:
docker compose up -d
Initial Setup
- Wait 30-60 seconds for the DHT crawler to initialize
- Open http://your-server-ip:3333 to access the web interface
- The DHT crawler begins automatically — you’ll see torrents appearing within minutes
- The first hour typically discovers 10,000-50,000 torrents; after 24 hours, expect 500,000+
- No manual configuration is needed — Bitmagnet crawls the DHT network autonomously
Configuration
TMDB Integration (Recommended)
For automatic movie and TV show classification, add a TMDB API key to the bitmagnet service's environment. Get a free key at themoviedb.org:
environment:
  - TMDB_API_KEY=your_tmdb_api_key_here
With TMDB enabled, Bitmagnet identifies movies and TV shows by name matching and enriches them with metadata (poster, year, rating, genre).
*arr Stack Integration
Bitmagnet exposes a Torznab-compatible API. Add it to Sonarr or Radarr:
- In Sonarr/Radarr, go to Settings → Indexers → Add → Torznab
- Set the URL to http://bitmagnet:3333/torznab
- No API key is required (unless you’ve configured authentication)
- Set categories as needed (Sonarr: 5000-5999 for TV, Radarr: 2000-2999 for Movies)
- Test and save
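Once the stack is up you can sanity-check the indexer endpoint. A sketch: the hostname bitmagnet only resolves inside the compose network, so substitute your server IP when testing from the host, and the Torznab capabilities query (part of the Torznab spec) is commented so the sketch runs anywhere:

```shell
# Indexer URL as configured in Sonarr/Radarr (hostname from the compose file above).
TORZNAB_URL="http://bitmagnet:3333/torznab"
echo "Indexer URL: $TORZNAB_URL"
# Torznab's standard capabilities query -- run this against a live stack:
# curl -s "${TORZNAB_URL}?t=caps"
```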
Crawler Tuning
Control how aggressively Bitmagnet crawls the DHT network. By default, it uses moderate settings. For faster discovery on powerful hardware:
environment:
  - DHT_CRAWLER_SCALING_FACTOR=10
  - DHT_CRAWLER_SAVE_FILES_THRESHOLD=200
Warning: Higher scaling factors increase CPU, memory, and database write load significantly. Start with defaults and increase gradually.
GraphQL API
Bitmagnet provides a GraphQL API at http://your-server:3333/graphql for programmatic access. Use the built-in GraphQL playground to explore available queries:
{
  torrentContent(query: { queryString: "ubuntu" }) {
    totalCount
    items {
      torrent {
        name
        size
        filesCount
      }
    }
  }
}
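The same query can be sent programmatically. A minimal curl sketch; the live request is commented out so it only runs against a running instance, and the hostname is illustrative:

```shell
# JSON payload wrapping the GraphQL query shown above.
PAYLOAD='{"query":"{ torrentContent(query: { queryString: \"ubuntu\" }) { totalCount } }"}'
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload is valid JSON"
# Send it to the GraphQL endpoint once the stack is up:
# curl -s http://your-server:3333/graphql -H 'Content-Type: application/json' -d "$PAYLOAD"
```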
Reverse Proxy
If exposing Bitmagnet externally, place it behind a reverse proxy. With Nginx Proxy Manager, point your domain to bitmagnet:3333. Note that port 3334 (DHT) should NOT go through the reverse proxy — it needs direct UDP access.
See Reverse Proxy Setup for full configuration guides.
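If you manage Nginx yourself rather than using Nginx Proxy Manager, a minimal server-block sketch might look like this (the domain is an assumption, TLS setup is omitted, and the upstream name matches the container name from the compose file):

```nginx
server {
    listen 80;
    server_name bitmagnet.example.com;  # assumed domain -- use your own

    location / {
        proxy_pass http://bitmagnet:3333;  # container name from the compose file
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```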
Backup
The PostgreSQL database contains all discovered torrent metadata:
# Create a database dump
docker compose exec db pg_dump -U postgres bitmagnet > bitmagnet-backup-$(date +%Y%m%d).sql
# Restore from backup
cat bitmagnet-backup.sql | docker compose exec -T db psql -U postgres bitmagnet
Note: the database can grow to tens of gigabytes after extended crawling. Compressed backups (pg_dump | gzip) are recommended. See Backup Strategy.
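Putting the compression advice together, a cron-ready sketch; the pg_dump line assumes the running stack from this guide, so it is commented here:

```shell
# Date-stamped, compressed backup filename.
BACKUP="bitmagnet-backup-$(date +%Y%m%d).sql.gz"
echo "Backup target: $BACKUP"
# Run on the server with the stack up:
# docker compose exec db pg_dump -U postgres bitmagnet | gzip > "$BACKUP"
```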
Troubleshooting
No torrents appearing after startup
Symptom: Web interface shows 0 torrents after 10+ minutes.
Fix: Ensure port 3334 is open for both TCP and UDP traffic. The DHT crawler needs bidirectional communication. Check your firewall: sudo ufw allow 3334/tcp && sudo ufw allow 3334/udp. Also verify the crawler is running in logs: docker compose logs bitmagnet | grep "dht".
Database growing too fast
Symptom: PostgreSQL data volume consuming 50+ GB within a week.
Fix: Bitmagnet stores metadata for every discovered torrent. If disk space is a concern, lower the scaling factor or run periodic cleanup. The project is actively adding retention policies — check the latest release notes.
High CPU usage from DHT crawler
Symptom: Container using 100%+ CPU continuously.
Fix: Reduce the scaling factor in your environment: DHT_CRAWLER_SCALING_FACTOR=1. The default is moderate, but on low-powered hardware (2 cores or less), even the default can be demanding.
PostgreSQL shared memory errors
Symptom: PostgreSQL crashes with could not resize shared memory segment errors.
Fix: Ensure shm_size: 1g is set on the PostgreSQL container. The default Docker shared memory (64 MB) is insufficient for Bitmagnet’s query patterns.
Resource Requirements
- RAM: 1 GB idle (bitmagnet) + 1 GB (PostgreSQL with 1 GB shared memory) = 2 GB minimum
- CPU: Medium-High — DHT crawling is CPU-intensive, especially at higher scaling factors
- Disk: 10 GB initially, growing 1-5 GB/day depending on crawl rate. Plan for 50+ GB long-term.
Verdict
Bitmagnet is the best self-hosted torrent indexer for *arr stack users who want to eliminate dependency on public tracker sites. The DHT crawler works autonomously — no accounts, no APIs, no tracker whitelists. The Torznab integration with Sonarr and Radarr is seamless. The main drawback is resource consumption: it needs a dedicated PostgreSQL instance with generous shared memory, and the database grows continuously. Run it on a machine with at least 4 GB RAM and 100+ GB disk if you plan to keep it running long-term. Still in alpha, so expect occasional breaking changes between versions.