Host Your Own Photo Gallery

I got tired of relying on Google Photos and Amazon Photos after they started either nuking free storage or encrypting metadata I couldn't access. So I built my own photo gallery at home using Immich—and honestly, it's been one of the most satisfying self-hosting projects I've tackled. In this guide, I'll walk you through setting up a full-featured, mobile-friendly photo gallery that you control completely.

Why Self-Host Photos?

The cloud services are convenient, sure. But you're trading privacy for ease. Every photo upload gets scanned, indexed, and theoretically available to third parties through legal requests. When I moved my 50,000-photo library to self-hosting, I gained full control: no scanning, no surprise policy changes, and no storage fees creeping up year after year.

You'll need a NAS or spare PC with 2+ CPU cores, 4GB RAM minimum, and enough storage for your library (my 50GB photo collection sits on a 2TB drive). For remote access, I recommend running this behind a reverse proxy like Caddy or Nginx Proxy Manager.

Choosing Your Platform: Immich vs Photoprism

The two solid options in 2026 are Immich and Photoprism. I use Immich because it's actively developed, has excellent mobile apps (iOS and Android), and the server-side search is lightning-fast even with 100K+ photos. Photoprism is more mature if you prioritize stability, but Immich's momentum is undeniable.

For this guide, I'm going with Immich. The setup is nearly identical if you prefer Photoprism—just swap the image names and environment variables.

Hardware & Prerequisites

I run mine on a refurbished Lenovo M90 Tiny, which comfortably clears the 2-core/4GB baseline above.

It handles 50K photos smoothly. If you're at 200K+ photos, add more RAM or consider a dedicated NAS running Synology DSM or TrueNAS.

Watch out: consumer SSDs wear faster under the constant writes of thumbnail generation and library indexing. I use a NAS-rated 7200 RPM drive and accept slightly slower search indexing. If budget allows, add a small NVMe cache or use a hybrid approach (OS on SSD, photos on HDD).

Install Docker & Docker Compose

First, make sure Docker is running:

curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker $USER
# Log out and back in (or run `newgrp docker`) so the group change takes effect
sudo curl -L "https://github.com/docker/compose/releases/download/v2.24.0/docker-compose-linux-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
docker --version && docker-compose --version

Deploy Immich with Docker Compose

I keep my Immich stack in /opt/immich. Create the directory structure and compose file:

sudo mkdir -p /opt/immich/{photos,db,cache,logs}
sudo chown -R $USER:$USER /opt/immich
cd /opt/immich

Now create docker-compose.yml:

version: '3.8'
services:
  immich-server:
    image: ghcr.io/immich-app/immich-server:v1.95.0
    container_name: immich-server
    ports:
      - "3001:3001"
    environment:
      DB_HOSTNAME: immich-db
      DB_USERNAME: immich
      DB_PASSWORD: immichsecurepass123
      DB_NAME: immich
      REDIS_HOSTNAME: immich-redis
      UPLOAD_LOCATION: /usr/src/app/upload
      NODE_ENV: production
      LOG_LEVEL: log
    volumes:
      - /opt/immich/photos:/usr/src/app/upload
      - /opt/immich/logs:/var/log
    depends_on:
      - immich-db
      - immich-redis
    restart: unless-stopped

  immich-microservices:
    image: ghcr.io/immich-app/immich-server:v1.95.0
    container_name: immich-microservices
    command: start.sh microservices
    environment:
      DB_HOSTNAME: immich-db
      DB_USERNAME: immich
      DB_PASSWORD: immichsecurepass123
      DB_NAME: immich
      REDIS_HOSTNAME: immich-redis
      UPLOAD_LOCATION: /usr/src/app/upload
      NODE_ENV: production
    volumes:
      - /opt/immich/photos:/usr/src/app/upload
      - /opt/immich/cache:/cache
    depends_on:
      - immich-db
      - immich-redis
    restart: unless-stopped

  immich-db:
    image: tensorchord/pgvecto-rs:pg14-v0.2.0  # Immich needs the pgvecto.rs extension; a plain postgres image will fail at startup
    container_name: immich-db
    environment:
      POSTGRES_USER: immich
      POSTGRES_PASSWORD: immichsecurepass123
      POSTGRES_DB: immich
    volumes:
      - /opt/immich/db:/var/lib/postgresql/data
    restart: unless-stopped

  immich-redis:
    image: redis:7-alpine
    container_name: immich-redis
    restart: unless-stopped

  immich-ml:
    image: ghcr.io/immich-app/immich-machine-learning:v1.95.0
    container_name: immich-ml
    volumes:
      - /opt/immich/cache:/cache
    restart: unless-stopped
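
If the machine is shared with other services, you can cap the ML container's memory so background jobs can't starve the host. A hedged alternative for that service block (the 2g ceiling is an arbitrary assumption; tune it to your hardware):

```
  immich-ml:
    image: ghcr.io/immich-app/immich-machine-learning:v1.95.0
    container_name: immich-ml
    mem_limit: 2g          # hard cap from the compose spec; ML jobs restart instead of starving the host
    volumes:
      - /opt/immich/cache:/cache
    restart: unless-stopped
```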

Start the stack:

docker-compose up -d
docker-compose logs -f immich-server

Wait 30–60 seconds for migrations to complete. You'll see "Immich is running" in the logs. Check it's working:

curl http://localhost:3001/api/server/version

You should get a JSON response with the server version.

Tip: The database password is hardcoded here for simplicity. In production, use Docker secrets or a `.env` file with proper file permissions.
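
A minimal sketch of that `.env` approach, run from the stack directory (`/opt/immich`); the variable name `DB_PASSWORD` is simply what the compose file will interpolate:

```shell
# Create an env file next to docker-compose.yml and lock down its permissions.
cat > .env <<'EOF'
DB_PASSWORD=pick-a-long-random-string
EOF
chmod 600 .env   # readable and writable only by the owner
ls -l .env
```

Then change each hardcoded password in docker-compose.yml (both `DB_PASSWORD:` lines and `POSTGRES_PASSWORD:`) to `${DB_PASSWORD}`; docker-compose interpolates values from the project's `.env` file automatically.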

Reverse Proxy with Caddy

To access Immich remotely (and securely), set up Caddy on the same machine. Install it:

sudo apt install -y debian-keyring debian-archive-keyring apt-transport-https curl
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/gpg.key' | sudo gpg --dearmor -o /usr/share/keyrings/caddy-stable-archive-keyring.gpg
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/debian.deb.txt' | sudo tee /etc/apt/sources.list.d/caddy-stable.list
sudo apt update
sudo apt install -y caddy

Create or edit /etc/caddy/Caddyfile:

photos.example.com {
  encode gzip
  reverse_proxy localhost:3001
  log {
    output file /var/log/caddy/immich.log
    format json
  }
}

Replace photos.example.com with your actual domain. Reload Caddy:

sudo systemctl reload caddy
sudo systemctl status caddy

Caddy automatically issues an HTTPS certificate from Let's Encrypt. Visit https://photos.example.com and you should see the Immich login screen.
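
If you want a little extra hardening, Caddy's `header` directive can add security headers inside the same site block. This is an optional fragment, not something Immich requires:

```
photos.example.com {
  header {
    Strict-Transport-Security "max-age=31536000"
    X-Content-Type-Options "nosniff"
  }
  encode gzip
  reverse_proxy localhost:3001
}
```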

Initial Setup & First Upload

On first visit, Immich prompts you to create an admin account. Use a strong password—this is your gateway to all your photos.

I prefer uploading photos via the web UI initially so I can organize them into albums. But the real power comes from the mobile app:

  1. Install Immich from the App Store or Play Store
  2. Enter your server URL (https://photos.example.com) and credentials
  3. Enable "Background Backup" to auto-upload new photos

Within days, my phone was syncing automatically. No Google Photos, no Amazon, just me controlling my own archive.

Backup Your Immich Database

The photo files are safe, but the PostgreSQL database stores metadata, tags, and albums. Back it up weekly:

#!/bin/bash
set -euo pipefail
BACKUP_DIR="/opt/immich/backups"
mkdir -p "$BACKUP_DIR"
DATE=$(date +%Y%m%d_%H%M%S)
docker exec immich-db pg_dump -U immich immich > "$BACKUP_DIR/immich_${DATE}.sql"
echo "Backup saved to $BACKUP_DIR/immich_${DATE}.sql"
# Keep only the last 30 days of dumps
find "$BACKUP_DIR" -name "immich_*.sql" -mtime +30 -delete

Save this as /opt/immich/backup.sh, make it executable, and add to crontab:

chmod +x /opt/immich/backup.sh
(crontab -l 2>/dev/null; echo "0 2 * * 0 /opt/immich/backup.sh") | crontab -

This runs every Sunday at 2 AM.
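
Before trusting the retention rule with real data, you can dry-run the `find` expression against fake dumps in a scratch directory (a sandbox sketch; the filenames are made up):

```shell
# Simulate the 30-day retention rule without touching real backups.
SANDBOX=$(mktemp -d)
touch -d "45 days ago" "$SANDBOX/immich_20240101_020000.sql"  # old: should be deleted
touch "$SANDBOX/immich_20240301_020000.sql"                   # fresh: should survive

# Dry run: -print lists exactly what -delete would remove.
find "$SANDBOX" -name "immich_*.sql" -mtime +30 -print

# The real thing.
find "$SANDBOX" -name "immich_*.sql" -mtime +30 -delete
ls "$SANDBOX"
```

To restore, stop the app containers and pipe a dump back in with `docker exec -i immich-db psql -U immich -d immich < your_backup.sql`.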

Performance Tuning

If you have 100K+ photos, the ML jobs (face recognition, smart search) will run continuously after import. I throttled them by cutting the worker counts in the microservices section:

immich-microservices:
  ...
  environment:
    ...
    MACHINE_LEARNING_WORKERS: 1
    MICROSERVICES_WORKERS_PROCESS: 1
  ...

This uses less CPU but takes longer to process. With 50K photos and 1 worker, indexing took about a week running 6 hours nightly. Acceptable trade-off for a home setup.
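
One way to enforce that nightly window is cron: pause the ML container during the day. A hypothetical schedule (assumes the `immich-ml` container name from the compose file above):

```
# Crontab entries: let ML jobs run from 1 AM to 7 AM, pause the container otherwise.
0 1 * * * docker unpause immich-ml
0 7 * * * docker pause immich-ml
```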

Storage Scaling

Your photo library will grow, so plan capacity from day one; I sized my setup for 5TB up front.

I keep the photo volume on a filesystem with native snapshots (ZFS, in my case), giving me point-in-time recovery if a photo gets accidentally deleted.

Next Steps

You now have a working, private photo gallery. Next, I'd recommend:

  1. Enable TOTP 2FA in Immich settings (admin → users)
  2. Set up off-site backup of both photos and database to a cheap VPS or S3-compatible storage
  3. Configure WireGuard or Tailscale if you travel frequently—no need to expose Caddy to the internet

If you're running this on a shared VPS (not recommended for photos, but possible), consider RackNerd's KVM VPS plans—they're affordable and allow Docker without throttling.

Your photos are now yours. No ads, no scanning, no surprise policy changes. Just you and your memories, on hardware you control.
