How to Self-Host Baserow with Docker Compose
What Is Baserow?
Baserow is an open-source no-code database platform — the closest self-hosted equivalent to Airtable. It combines the familiarity of a spreadsheet with the power of a database: create tables with typed fields (text, number, date, file, link, formula, lookup), build views (grid, gallery, kanban, form, calendar), and collaborate in real-time.
Unlike NocoDB, which overlays a UI on existing databases, Baserow is a standalone application with its own storage. This makes it simpler to set up but means you can’t point it at an existing PostgreSQL database and get a spreadsheet interface. Baserow’s all-in-one Docker image bundles everything (PostgreSQL, Redis, Celery workers, Caddy) into a single container.
Prerequisites
- A Linux server (Ubuntu 22.04+ recommended)
- Docker and Docker Compose installed (guide)
- 2 GB of free RAM (minimum)
- 10 GB of free disk space
- A domain name (optional, for HTTPS access)
Docker Compose Configuration
Simple Setup (All-in-One)
The all-in-one image is the fastest way to get Baserow running. It bundles PostgreSQL, Redis, Celery, and Caddy into a single container:
```yaml
services:
  baserow:
    image: baserow/baserow:2.1.1
    container_name: baserow
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    environment:
      # Your instance's public URL (include protocol)
      BASEROW_PUBLIC_URL: "http://localhost"
      # Uncomment for production with a domain:
      # BASEROW_PUBLIC_URL: "https://baserow.yourdomain.com"
      # BASEROW_CADDY_ADDRESSES: ":443"
    volumes:
      - baserow_data:/baserow/data

volumes:
  baserow_data:
```
Start the stack:
```bash
docker compose up -d
```
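On first start, Baserow runs database migrations, so the UI can take a minute or two to come up. A small generic helper like the one below (a sketch, not part of Baserow's tooling) can poll until the instance responds; the `/api/_health/` path shown in the usage comment is an assumption, so check the health endpoint your Baserow version actually exposes.

```bash
# Generic wait-until-up helper: retries a probe command until it
# succeeds or the attempt budget is exhausted.
wait_for() {
    attempts=$1; shift
    i=0
    while [ "$i" -lt "$attempts" ]; do
        if "$@" >/dev/null 2>&1; then
            return 0
        fi
        i=$((i + 1))
        sleep 1
    done
    return 1
}

# Usage (health path is an assumption; verify against your version's docs):
# wait_for 60 curl -fsS http://localhost/api/_health/
```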
Production Setup (Multi-Service)
For production with better performance and scalability, separate the services:
```yaml
services:
  baserow-backend:
    image: baserow/backend:2.1.1
    container_name: baserow-backend
    restart: unless-stopped
    environment:
      DATABASE_HOST: baserow-db
      DATABASE_NAME: baserow
      DATABASE_USER: baserow
      DATABASE_PASSWORD: CHANGE_THIS_DB_PASSWORD
      DATABASE_PORT: "5432"
      REDIS_HOST: baserow-redis
      REDIS_PORT: "6379"
      BASEROW_PUBLIC_URL: "https://baserow.yourdomain.com"
      SECRET_KEY: CHANGE_THIS_SECRET_KEY
      BASEROW_JWT_SIGNING_KEY: CHANGE_THIS_JWT_KEY
      # Email configuration (optional)
      # EMAIL_SMTP: "true"
      # EMAIL_SMTP_HOST: "smtp.example.com"
      # EMAIL_SMTP_PORT: "587"
      # EMAIL_SMTP_USE_TLS: "true"
      # EMAIL_SMTP_USER: "user@example.com"
      # EMAIL_SMTP_PASSWORD: "smtp_password"
      # FROM_EMAIL: "baserow@example.com"
    volumes:
      - baserow_media:/baserow/media
    depends_on:
      baserow-db:
        condition: service_healthy
      baserow-redis:
        condition: service_started
    ports:
      - "8000:8000"

  baserow-web-frontend:
    image: baserow/web-frontend:2.1.1
    container_name: baserow-frontend
    restart: unless-stopped
    environment:
      BASEROW_PUBLIC_URL: "https://baserow.yourdomain.com"
      PRIVATE_BACKEND_URL: "http://baserow-backend:8000"
    ports:
      - "3000:3000"
    depends_on:
      - baserow-backend

  baserow-celery-worker:
    image: baserow/backend:2.1.1
    container_name: baserow-celery-worker
    restart: unless-stopped
    command: celery-worker
    environment:
      DATABASE_HOST: baserow-db
      DATABASE_NAME: baserow
      DATABASE_USER: baserow
      DATABASE_PASSWORD: CHANGE_THIS_DB_PASSWORD
      DATABASE_PORT: "5432"
      REDIS_HOST: baserow-redis
      REDIS_PORT: "6379"
      SECRET_KEY: CHANGE_THIS_SECRET_KEY
    volumes:
      - baserow_media:/baserow/media
    depends_on:
      - baserow-backend

  baserow-celery-export-worker:
    image: baserow/backend:2.1.1
    container_name: baserow-celery-export
    restart: unless-stopped
    command: celery-exportworker
    environment:
      DATABASE_HOST: baserow-db
      DATABASE_NAME: baserow
      DATABASE_USER: baserow
      DATABASE_PASSWORD: CHANGE_THIS_DB_PASSWORD
      DATABASE_PORT: "5432"
      REDIS_HOST: baserow-redis
      REDIS_PORT: "6379"
      SECRET_KEY: CHANGE_THIS_SECRET_KEY
    volumes:
      - baserow_media:/baserow/media
    depends_on:
      - baserow-backend

  baserow-celery-beat:
    image: baserow/backend:2.1.1
    container_name: baserow-celery-beat
    restart: unless-stopped
    command: celery-beat
    environment:
      DATABASE_HOST: baserow-db
      DATABASE_NAME: baserow
      DATABASE_USER: baserow
      DATABASE_PASSWORD: CHANGE_THIS_DB_PASSWORD
      DATABASE_PORT: "5432"
      REDIS_HOST: baserow-redis
      REDIS_PORT: "6379"
      SECRET_KEY: CHANGE_THIS_SECRET_KEY
    depends_on:
      - baserow-backend

  baserow-db:
    image: postgres:16
    container_name: baserow-db
    restart: unless-stopped
    environment:
      POSTGRES_DB: baserow
      POSTGRES_USER: baserow
      POSTGRES_PASSWORD: CHANGE_THIS_DB_PASSWORD
    volumes:
      - baserow_pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U baserow"]
      interval: 10s
      timeout: 5s
      retries: 5

  baserow-redis:
    image: redis:7.4
    container_name: baserow-redis
    restart: unless-stopped
    volumes:
      - baserow_redis:/data

volumes:
  baserow_media:
  baserow_pgdata:
  baserow_redis:
```
Generate secure keys:
```bash
# Secret key for Django
openssl rand -hex 32

# JWT signing key
openssl rand -hex 32

# Database password
openssl rand -hex 24
```
Replace all CHANGE_THIS_* values with your generated secrets.
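Rather than pasting the secrets into the compose file by hand, one convenient pattern (a sketch, not from the Baserow docs) is to write them to a `.env` file, which Docker Compose reads automatically; the compose file can then reference them with substitutions such as `SECRET_KEY: ${SECRET_KEY}`.

```bash
# Sketch: generate all three secrets into a .env file in the compose
# directory. Docker Compose reads .env automatically, so compose values
# can be written as e.g. SECRET_KEY: ${SECRET_KEY}.
env_file="${ENV_FILE:-.env}"
{
    printf 'SECRET_KEY=%s\n' "$(openssl rand -hex 32)"
    printf 'BASEROW_JWT_SIGNING_KEY=%s\n' "$(openssl rand -hex 32)"
    printf 'DATABASE_PASSWORD=%s\n' "$(openssl rand -hex 24)"
} > "$env_file"
chmod 600 "$env_file"  # keep secrets readable only by the owner
```

This keeps credentials out of the compose file itself, which is handy if you version-control your stack definitions.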
Initial Setup
- Open http://your-server-ip (all-in-one) or configure your reverse proxy (multi-service)
- Click Sign up to create your first account; this account becomes the instance admin
- Create your first Database, then add a Table
- Add fields by clicking the + button in the column header
- Available field types: Text, Long text, Number, Boolean, Date, File, Single select, Multiple select, Link to table, Formula, Lookup, Count, Rollup, and more
Configuration
Key Environment Variables
| Variable | Default | Description |
|---|---|---|
| BASEROW_PUBLIC_URL | (required) | Full URL including protocol |
| SECRET_KEY | (auto-generated) | Django secret key for sessions/tokens |
| BASEROW_JWT_SIGNING_KEY | (auto-generated) | JWT token signing key |
| BASEROW_MAX_IMPORT_FILE_SIZE_MB | 512 | Max CSV/JSON import size |
| BASEROW_MAX_SNAPSHOTS_PER_GROUP | -1 | Snapshot limit per workspace (-1 = unlimited) |
| BASEROW_FILE_UPLOAD_SIZE_LIMIT_MB | 0 | Per-file upload limit (0 = use Django default) |
| BASEROW_AMOUNT_OF_WORKERS | 1 | Gunicorn workers for backend API |
| DISABLE_ANONYMOUS_PUBLIC_VIEW_WS_CONNECTIONS | false | Disable WebSockets for anonymous viewers |
| BASEROW_TRIGGER_SYNC_TEMPLATES_AFTER_MIGRATION | true | Sync template library after upgrades |
Disabling Signups
To prevent public registration after creating your admin account:
```yaml
environment:
  BASEROW_PUBLIC_URL: "https://baserow.yourdomain.com"
  # Add this to disable new signups
  BASEROW_DISABLE_SIGNUPS: "true"
```
Existing accounts remain active. New users must be invited by an admin.
Reverse Proxy
All-in-One with Built-in Caddy
The all-in-one image includes Caddy for automatic HTTPS. Set BASEROW_CADDY_ADDRESSES and BASEROW_PUBLIC_URL:
```yaml
environment:
  BASEROW_PUBLIC_URL: "https://baserow.yourdomain.com"
  BASEROW_CADDY_ADDRESSES: ":443"
```
Caddy handles Let’s Encrypt certificate provisioning automatically.
Multi-Service with External Reverse Proxy
Point your reverse proxy to the frontend (port 3000) for web traffic and the backend (port 8000) for API calls. For Nginx Proxy Manager, create two proxy hosts or use path-based routing.
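As a concrete illustration, a minimal Nginx server block might look like the sketch below. It assumes the host ports from the compose file above and routes API and WebSocket paths to the backend; the `/(api|ws)/` path split is an assumption based on common Baserow reverse-proxy setups, so verify it against the official reverse-proxy docs for your version, and add your own TLS certificate directives.

```nginx
server {
    listen 443 ssl;
    server_name baserow.yourdomain.com;

    # API and WebSocket requests go to the backend (port 8000)
    location ~ ^/(api|ws)/ {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }

    # Everything else goes to the web frontend (port 3000)
    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
    }
}
```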
Backup
What to Back Up
| Data | Location | Priority |
|---|---|---|
| PostgreSQL database | baserow_pgdata volume | Critical |
| Uploaded files/media | baserow_media volume | Important |
| Redis data | baserow_redis volume | Low (task queue state, auto-recovers) |
Database Backup
```bash
# Dump the database
docker exec baserow-db pg_dump -U baserow baserow > baserow_backup.sql

# Restore from dump
docker exec -i baserow-db psql -U baserow baserow < baserow_backup.sql
```
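To run this on a schedule, a small helper script like the sketch below (names and paths are examples, not from the Baserow docs) produces timestamped, compressed dumps and keeps only the most recent seven.

```bash
# Sketch of a scheduled backup helper: timestamped, gzipped dumps
# with simple retention. Run both functions from cron or a systemd timer.
backup_dir="${BASEROW_BACKUP_DIR:-/var/backups/baserow}"

dump_baserow() {
    stamp=$(date +%Y%m%d-%H%M%S)
    docker exec baserow-db pg_dump -U baserow baserow \
        | gzip > "$backup_dir/baserow-$stamp.sql.gz"
}

prune_old_backups() {
    # Keep the 7 newest dumps, delete the rest.
    ls -1t "$backup_dir"/baserow-*.sql.gz 2>/dev/null | tail -n +8 | xargs -r rm -f
}
```

Restoring from a gzipped dump is then `gunzip -c backup.sql.gz | docker exec -i baserow-db psql -U baserow baserow`.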
All-in-One Backup
For the all-in-one image, all data lives in /baserow/data:

```bash
docker compose stop
# Note: Compose prefixes volume names with the project name, so the
# actual directory may be e.g. <project>_baserow_data; check `docker volume ls`.
tar czf baserow-backup.tar.gz -C /var/lib/docker/volumes/baserow_data/_data .
docker compose start
```
Troubleshooting
"CSRF Verification Failed" After Domain Change
Symptom: Login fails with CSRF error after changing BASEROW_PUBLIC_URL.
Fix: Clear your browser cookies for the Baserow domain, or update BASEROW_PUBLIC_URL to exactly match the URL in your browser’s address bar (including protocol and port).
File Uploads Failing
Symptom: Uploading files to file fields returns an error.
Fix: Check that the baserow_media volume is writable and has sufficient space. For the multi-service setup, ensure all services sharing the media volume have the same volume mount.
Slow Performance with Large Tables
Symptom: Tables with 10,000+ rows become slow to load and filter.
Fix: Increase BASEROW_AMOUNT_OF_WORKERS (default 1) to match your CPU cores. Add more RAM — PostgreSQL uses memory aggressively for query caching. Consider the multi-service setup if using the all-in-one image.
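For example, on a 4-core host the worker count could be raised like this (a config sketch for the all-in-one or backend service's environment block):

```yaml
environment:
  BASEROW_AMOUNT_OF_WORKERS: "4"  # roughly one per CPU core
```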
Celery Tasks Not Processing
Symptom: Exports never complete. Webhooks don’t fire. Trash isn’t cleaned up.
Fix: Verify the celery-worker and celery-beat containers are running:
```bash
docker compose ps
docker compose logs baserow-celery-worker
```
Ensure Redis is accessible and the REDIS_HOST environment variable is correct.
Resource Requirements
- RAM: 2 GB recommended for the all-in-one image (1 GB absolute minimum). Multi-service needs 3 GB+.
- CPU: Low-Medium. Single core handles moderate usage.
- Disk: 5 GB for application data. File storage depends on uploads.
Verdict
Baserow is the best Airtable replacement for teams that value a polished UI and real-time collaboration. Its all-in-one Docker image makes initial setup trivial — one container, one command, working spreadsheet-database. The multi-service deployment scales well for production use.
Choose Baserow over NocoDB if you want the closest UX to Airtable and plan to build your database from scratch. Choose NocoDB if you need to connect to existing databases or want the ability to overlay a spreadsheet UI on PostgreSQL/MySQL tables you already have.