Frigate vs Motion: Self-Hosted Video Surveillance

Unlike Traditional Motion Detection, Frigate Knows What Moved

Motion is a proven Linux tool for detecting pixel changes in camera feeds — it triggers when something moves. Frigate uses AI object detection to identify what moved — a person, car, dog, or package. That distinction changes everything about how useful your camera system actually is. One floods you with alerts every time a tree sways; the other only notifies you when a person walks up to your door.

Updated February 2026: Verified with latest Docker images and configurations.

Feature Comparison

| Feature | Frigate | Motion |
|---|---|---|
| Detection method | AI object detection (YOLO/SSD) | Pixel-change motion detection |
| Hardware acceleration | Google Coral TPU, Intel OpenVINO, NVIDIA | CPU only |
| Web UI | Modern React-based with live view | Basic web control panel |
| Recording | Continuous + event-based | Event-triggered only |
| License | MIT | GPL-2.0 |
| Language | Python + Go | C |
| Home Assistant | Native MQTT integration | Manual (webhook/script) |
| RTSP support | Yes (primary input) | Yes + V4L2, MJPEG |
| Mobile app | Via Home Assistant companion | No |
| Object classification | Person, car, dog, cat, package, etc. | None (motion only) |
| Face recognition | No (use Viseron or Double Take) | No |
| Zones | AI-aware detection zones | Motion mask regions |
| Latest version | v0.14.1 (2025) | v4.7.1 (2025) |
| Docker image | ghcr.io/blakeblackshear/frigate | motionproject/motion (stale) |

Quick Verdict

Frigate wins for anyone building a security camera system. Its AI detection eliminates the false-positive nightmare that makes raw motion detection unusable in real-world environments (wind, shadows, animals, headlights). Motion still has a place for simple, single-camera monitoring where you genuinely want to know about any movement — think a locked room or a wildlife camera. But for home security, Frigate is the modern answer.

Installation Complexity

Frigate requires a YAML config file defining cameras, detectors, and recording options. The Docker setup needs device access for hardware acceleration (Coral TPU, GPU) and storage for recordings:

services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:0.14.1
    restart: unless-stopped
    privileged: true
    shm_size: 256mb
    ports:
      - "5000:5000"   # Web UI
      - "8554:8554"   # RTSP restream
      - "8555:8555"   # WebRTC
    volumes:
      - ./config:/config
      - ./storage:/media/frigate
      - /etc/localtime:/etc/localtime:ro
    devices:
      - /dev/bus/usb:/dev/bus/usb  # Coral USB TPU
    environment:
      FRIGATE_RTSP_PASSWORD: changeme

You also need a config/config.yml defining your cameras:

cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@camera-ip:554/stream
          roles: ["detect", "record"]
    detect:
      width: 1280
      height: 720
detectors:
  coral:
    type: edgetpu
    device: usb
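
The recording options mentioned above go in the same config.yml. A minimal sketch (the retention values here are illustrative, not recommendations):

```yaml
# Optional: recording settings in the same config/config.yml
record:
  enabled: true
  retain:
    days: 7        # keep continuous recordings for a week
    mode: motion   # only keep segments containing motion
  events:
    retain:
      default: 14  # keep event clips longer than raw footage
```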

Motion uses a flat config file (motion.conf) and needs minimal Docker setup:

services:
  motion:
    # Motion project does not publish versioned Docker tags — :latest is the only option
    image: motionproject/motion:latest
    restart: unless-stopped
    ports:
      - "8080:8080"   # Web control
      - "8081:8081"   # Stream
    volumes:
      - ./config:/etc/motion
      - ./recordings:/var/lib/motion
      - /etc/localtime:/etc/localtime:ro

Motion is simpler to get running. But the Motion Docker image hasn’t been updated since 2020 — you may need a community image or install from packages. Frigate’s Docker image is actively maintained with monthly releases.
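
A minimal motion.conf for the mounted ./config directory might look like the following sketch (the RTSP URL and thresholds are placeholders you would tune for your camera):

```conf
# /etc/motion/motion.conf: minimal single-camera example
netcam_url rtsp://user:pass@camera-ip:554/stream
threshold 1500            # changed pixels required to trigger an event
noise_level 32            # ignore low-level sensor noise
minimum_motion_frames 2   # motion must persist across consecutive frames
event_gap 60              # seconds of stillness before the event closes
picture_output on
movie_output on
target_dir /var/lib/motion
webcontrol_port 8080
stream_port 8081
```

The threshold/noise_level pair is where most of the tuning effort goes, as the next section explains.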

Full setup guide: Self-Host Frigate

Performance and Resource Usage

| Metric | Frigate | Motion |
|---|---|---|
| Idle RAM (1 camera) | ~300–500 MB | ~50–100 MB |
| Under load (4 cameras) | ~1–2 GB | ~200–400 MB |
| CPU (with Coral TPU) | Low (detection offloaded) | N/A |
| CPU (without TPU) | Very high (software AI) | Low–Medium |
| GPU support | Intel QSV, NVIDIA CUDA, VAAPI | None |
| Disk (recordings/day) | 5–20 GB per camera (H.265) | 1–5 GB per camera (event only) |
| Startup time | ~10–20 seconds | ~2 seconds |

Motion is dramatically lighter. It’s a C program that does simple pixel math — it runs on a Raspberry Pi Zero without breaking a sweat. Frigate’s AI models need real compute: without a Coral TPU ($30–60), a single camera can peg a CPU at 100%. With a Coral, detection drops to near-zero CPU usage but you need the hardware.

Detection Quality

This is where Frigate justifies its complexity.

Motion detection (Motion) triggers on pixel changes exceeding a configurable threshold. Problems:

  • Tree branches swaying → alert
  • Shadows from clouds → alert
  • Car headlights sweeping across a wall → alert
  • Spider web in front of lens → alert
  • Actual burglar → alert (same as the spider)

You end up either drowning in false positives or tuning sensitivity so low that real events get missed.

Object detection (Frigate) classifies what’s in the frame. It can tell a person from a car from a dog from a shadow. You configure rules like “only alert on people in the driveway zone between 10 PM and 6 AM.” False positive rates drop from dozens per hour to near zero.
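
A sketch of the zone half of such a rule in Frigate's config (the zone name and coordinates are hypothetical; the time-of-day condition is typically applied in the automation layer, such as Home Assistant, rather than in Frigate itself):

```yaml
cameras:
  front_door:
    zones:
      driveway:
        # polygon corners as x,y pixel pairs (example values for a 1280x720 stream)
        coordinates: 0,720,0,400,1280,400,1280,720
        objects:
          - person   # only a person inside this zone counts
```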

| Detection Scenario | Motion Result | Frigate Result |
|---|---|---|
| Person at door | Alert (motion detected) | Alert: “person” detected |
| Tree swaying | Alert (motion detected) | Ignored (no object) |
| Car headlights | Alert (motion detected) | Ignored (no object) |
| Cat crossing yard | Alert (motion detected) | Optional: “cat” detected |
| Package delivered | Alert (motion detected) | Alert: “package” detected |

Home Assistant Integration

Frigate integrates natively with Home Assistant via MQTT. You get:

  • Camera entities with live view and recordings
  • Binary sensors per object type (person detected, car detected)
  • Automation triggers (“when a person is detected in the backyard, turn on lights”)
  • Event notifications with snapshots pushed to your phone
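
As a sketch, an automation keyed to one of Frigate's per-object occupancy sensors might look like this. The entity and light IDs are assumptions (the integration names sensors along the lines of binary_sensor.&lt;camera&gt;_&lt;object&gt;_occupancy, so the exact names depend on your camera names):

```yaml
automation:
  - alias: "Backyard lights on person detection"
    trigger:
      - platform: state
        entity_id: binary_sensor.backyard_person_occupancy  # assumed camera name "backyard"
        to: "on"
    condition:
      - condition: time   # only at night
        after: "22:00:00"
        before: "06:00:00"
    action:
      - service: light.turn_on
        target:
          entity_id: light.backyard
```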

Motion requires manual integration — typically via webhooks, shell scripts, or MQTT publishers that you wire up yourself. It works, but it’s not a native experience.

Community and Development

| Metric | Frigate | Motion |
|---|---|---|
| GitHub stars | ~19K | ~4K |
| First release | 2020 | 2000 |
| Latest release | v0.14.1 (2025) | v4.7.1 (2025) |
| Release cadence | Monthly | Quarterly |
| Docker image | Actively maintained | Stale (2020 on Docker Hub) |
| Documentation | Comprehensive (docs.frigate.video) | Basic (motion-project.github.io) |
| Community | Very active (Discord, GitHub) | Moderate |

Both projects are actively maintained at the source level. The critical difference is that Frigate’s Docker image tracks releases closely, while Motion’s official Docker image is years behind. You’d need a community Docker image or build your own for current Motion versions.

Use Cases

Choose Frigate If…

  • You’re building a home security system with multiple cameras
  • False-positive alerts from motion detection are unacceptable
  • You use Home Assistant and want native camera integration
  • You have (or will buy) a Google Coral TPU for efficient AI detection
  • You need both continuous recording and event-based clips
  • Object classification (person vs car vs animal) matters

Choose Motion If…

  • You have a single camera monitoring a controlled space (locked room, closet, server rack)
  • You want the lightest possible resource usage (Raspberry Pi Zero)
  • Any movement — not just specific objects — is what you want to detect
  • You need V4L2 webcam support (USB cameras) rather than IP cameras
  • You’re running on extremely limited hardware where AI detection isn’t feasible
  • You want a wildlife camera that captures any animal movement

Final Verdict

Frigate wins on detection quality because AI object detection solves the fundamental problem with motion detection: false positives. For home security cameras, this alone makes Frigate worth the extra setup complexity and hardware investment. A Coral TPU costs $30–60 and eliminates the CPU overhead concern.

Motion still makes sense for niche use cases where simplicity and minimal resources matter more than smart detection — single-camera setups, wildlife monitoring, or environments where any movement is genuinely relevant. But those use cases are narrow.

For a middle ground between Frigate’s AI power and Motion’s simplicity, look at Viseron — it offers object detection and face recognition with a more standalone approach than Frigate’s Home Assistant focus.

FAQ

Can I run Frigate without a Coral TPU?

Yes, but CPU-based detection is very resource-intensive. A single 720p camera can use 50–100% of a modern CPU core without hardware acceleration. The Coral USB Accelerator (~$30) processes detections in milliseconds with near-zero CPU impact. It’s the single best hardware investment for a Frigate setup.

Does Motion support AI object detection?

No. Motion strictly uses pixel-change detection. If you want AI features with a Motion-like simplicity, consider Viseron or running Motion as a frontend with a separate AI pipeline.

Can I use both together?

Some users run Motion as a lightweight motion trigger and Frigate as the AI classifier. Motion detects that something changed; Frigate identifies what it was. This reduces Frigate’s processing load but adds complexity. For most home setups, Frigate alone handles both detection and classification efficiently.
