Self-Host Nextcloud for Files
I got tired of paying $2–12 per month for cloud storage subscriptions, so I built my own Nextcloud instance last year. Now I store 400GB of documents, photos, and backups without touching anyone else's infrastructure. If you want to own your files instead of renting them, this is the move.
In this guide, I'll walk you through deploying Nextcloud on a Linux server using Docker and Compose, securing it with Caddy as a reverse proxy, and configuring basic user management. You can run this on a RackNerd VPS, a Raspberry Pi 5, or even an old laptop in your closet.
Why Self-Host Nextcloud?
The big wins for me were control and cost. I don't worry about corporate policy changes, data privacy regulations, or surprise price hikes, and I have direct access to my files through Nextcloud's WebDAV API. I can integrate Nextcloud with other apps in my homelab: Jellyfin pulls video metadata, my photo library syncs to multiple devices, and desktop clients keep everything in sync the way Dropbox would.
The downside is that you own the uptime and backups. If your server goes down, your files aren't accessible. If you lose the disk, they're gone. I mitigate this with weekly offsite backups to a separate VPS, but that's beyond the scope of today.
Prerequisites and Hardware
You'll need a Linux server (Ubuntu 22.04 LTS is my baseline), Docker with the Compose plugin installed, a domain name, and five minutes to set DNS records. I prefer running Nextcloud on a RackNerd KVM VPS because plans start under $15/year and give enough performance for a small instance, but bare metal or a home server works fine too.
I use Caddy as the reverse proxy because its automatic HTTPS and simple config file beat Nginx for small deployments. If you prefer Nginx Proxy Manager, the Nextcloud setup stays identical—only the proxy changes.
Disk size depends on your files. I allocated 500GB for my instance and I'm at 70% capacity. Start with what you think you'll need, then add more later.
Docker Compose Setup
I keep everything in a single compose file: the Nextcloud app, a MariaDB database, and a Redis cache. Redis handles both caching and transactional file locking, which keeps sync responsive when you have large numbers of files.
mkdir -p ~/nextcloud && cd ~/nextcloud
nano docker-compose.yml
Paste this into the file:
version: '3.8'
services:
  db:
    image: mariadb:latest
    container_name: nextcloud-db
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: your_strong_root_password
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: your_nextcloud_db_password
    volumes:
      - ./db:/var/lib/mysql
    networks:
      - nextcloud-net
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost"]
      interval: 10s
      timeout: 5s
      retries: 5

  redis:
    image: redis:latest
    container_name: nextcloud-redis
    restart: always
    command: redis-server --appendonly yes
    volumes:
      - ./redis:/data
    networks:
      - nextcloud-net

  nextcloud:
    image: nextcloud:latest
    container_name: nextcloud
    restart: always
    depends_on:
      db:
        condition: service_healthy
    environment:
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: your_nextcloud_db_password
      MYSQL_HOST: db
      REDIS_HOST: redis
      NEXTCLOUD_TRUSTED_DOMAINS: "files.yourdomain.com"
      OVERWRITEPROTOCOL: https
      OVERWRITEHOST: "files.yourdomain.com"
    volumes:
      - ./data:/var/www/html/data
      - ./config:/var/www/html/config
      - ./apps:/var/www/html/custom_apps
    networks:
      - nextcloud-net
    expose:
      - 80

networks:
  nextcloud-net:
    driver: bridge
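For the password placeholders, one way to generate strong values (openssl ships with Ubuntu; the variable names here are just for display):

```shell
# Generate two random secrets for the MariaDB root and nextcloud users
DB_ROOT_PASS=$(openssl rand -base64 32)
DB_PASS=$(openssl rand -base64 32)
printf 'MYSQL_ROOT_PASSWORD: %s\nMYSQL_PASSWORD: %s\n' "$DB_ROOT_PASS" "$DB_PASS"
```

Note that MYSQL_PASSWORD must match in both the db and nextcloud services.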
Replace the passwords and domain with your own. Create the directories first:
mkdir -p db data config apps redis
docker compose up -d
Wait for the containers to start. Check logs with:
docker compose logs -f nextcloud
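One habit worth considering before going further: the latest tags above can silently jump a major Nextcloud version on the next pull, and Nextcloud only supports upgrading one major version at a time. Pinning image tags (the version numbers below are illustrative, not a recommendation; check Docker Hub for current ones) keeps upgrades deliberate:

```yaml
# Example pinned tags -- substitute current versions from Docker Hub
image: nextcloud:29-apache
image: mariadb:11.4
image: redis:7
```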
Since the compose file contains plaintext passwords, restrict its permissions:
chmod 600 docker-compose.yml
Securing with Caddy
Nextcloud must run over HTTPS. Caddy automatically fetches and renews Let's Encrypt certificates, so there's no manual cert management.
Create a Caddyfile in the same directory:
nano Caddyfile
files.yourdomain.com {
    reverse_proxy nextcloud:80 {
        header_up X-Real-IP {remote_host}
    }
    # Nextcloud header requirements
    header {
        Strict-Transport-Security "max-age=31536000; includeSubDomains"
        X-Content-Type-Options nosniff
        X-Frame-Options "SAMEORIGIN"
        X-XSS-Protection "1; mode=block"
        Referrer-Policy "no-referrer"
    }
    # CalDAV/CardDAV discovery redirects Nextcloud expects
    redir /.well-known/carddav /remote.php/dav 301
    redir /.well-known/caldav /remote.php/dav 301
}
Two details matter here: the proxy target is nextcloud:80 (the compose service name) because Caddy will run inside the same Docker network, not on the host; and Caddy already sets X-Forwarded-For and X-Forwarded-Proto and passes the original Host header through, so those don't need manual header_up lines.
Replace files.yourdomain.com with your actual domain. Make sure your DNS A record points to your server's IP address before starting Caddy.
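A quick sanity check before Caddy tries to fetch a certificate: this small helper (my own sketch, not part of Caddy) resolves a hostname and compares it against the IP you expect. Swap in your real domain and server IP; the values in the example call are placeholders.

```shell
# check_dns NAME EXPECTED_IP -> prints OK or MISMATCH
check_dns() {
  if getent ahosts "$1" | awk '{print $1}' | grep -qx "$2"; then
    echo "OK: $1 resolves to $2"
  else
    echo "MISMATCH: $1 does not resolve to $2 yet (DNS may still be propagating)"
  fi
}
check_dns files.yourdomain.com 203.0.113.10   # placeholders
```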
Add a caddy service to the same compose file, under services:
  caddy:
    image: caddy:latest
    container_name: nextcloud-caddy
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - ./caddy_data:/data
      - ./caddy_config:/config
    networks:
      - nextcloud-net
Then restart:
docker compose up -d
Visit https://files.yourdomain.com in your browser. Caddy will provision the certificate automatically. You should see the Nextcloud setup wizard.
Initial Configuration and Users
The wizard asks for an admin username and password. Use something strong; this is the master account. The database connection details come from the environment variables in the compose file, so accept the defaults for everything else.
Once logged in, go to Settings → Administration → Users to create user accounts for family members or team members. New accounts have no storage quota by default; set one per user from the quota column on that page, or change the default quota in the page's settings panel.
I recommend enabling two-factor authentication for the admin account immediately. Install the Two-Factor TOTP Provider app (or a WebAuthn provider) from the app store, then set it up under Personal settings → Security.
Install the desktop sync client on your machines (the nextcloud-desktop package in the Ubuntu repos; macOS and Windows builds are on the Nextcloud site). This keeps your files in sync like Dropbox would, without the subscription.
Performance Tuning
Out of the box, Nextcloud is slow if you have thousands of files. I made three changes that doubled sync speed:
1. Verify caching is wired up. Because the compose file sets REDIS_HOST, the official image configures Redis for distributed caching and file locking during installation, but it's worth confirming. Open the config inside the container:
docker exec -it nextcloud bash
cd /var/www/html/config
nano config.php
The $CONFIG array should contain entries like these; add any that are missing (inside the array, before the closing parenthesis):
'memcache.local' => '\OC\Memcache\APCu',
'memcache.distributed' => '\OC\Memcache\Redis',
'memcache.locking' => '\OC\Memcache\Redis',
'redis' => [
  'host' => 'redis',
  'port' => 6379,
],
2. Increase upload limits. By default the image caps uploads at 512MB. These are PHP settings rather than Nextcloud ones, so they don't belong in config.php; the official image exposes them through an environment variable instead. Add it to the nextcloud service in the compose file and recreate the container:
PHP_UPLOAD_LIMIT: 10G
This raises PHP's upload_max_filesize and post_max_size. For very large uploads over slow links you may also need bigger max_input_time and max_execution_time values via a custom PHP ini file.
3. Run background jobs with cron. Under Settings → Administration → Basic settings, switch Background jobs from AJAX to Cron. This keeps preview generation and other maintenance from blocking web requests.
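Switching the setting to Cron also needs something that actually runs the job. With the official image, the usual pattern (described in the nextcloud Docker README) is a second container from the same image that shares the volumes and runs the bundled /cron.sh entrypoint. A sketch to add under services in the compose file:

```yaml
  cron:
    image: nextcloud:latest
    container_name: nextcloud-cron
    restart: always
    entrypoint: /cron.sh
    volumes:
      - ./data:/var/www/html/data
      - ./config:/var/www/html/config
      - ./apps:/var/www/html/custom_apps
    networks:
      - nextcloud-net
    depends_on:
      db:
        condition: service_healthy
```

Keep the image tag identical to the main nextcloud service so both containers run the same code version.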
Accessing Files Outside Your Network
Once Caddy is running with a valid certificate, you can access your files from anywhere. The web UI works on phones and computers. The desktop and mobile clients (available in app stores) provide true sync.
If you want extra security, I recommend locking down access by IP or using fail2ban to rate-limit brute-force login attempts. See our previous guide on hardening with fail2ban for details.
For sensitive data, consider running Nextcloud on a private network behind Tailscale or WireGuard instead of exposing it directly to the internet. That requires additional setup but gives you more control over who can reach it.
Backups and Maintenance
Create a backup script that runs weekly. Store backups offsite (another VPS, AWS S3, or a friend's server):
#!/bin/bash
set -euo pipefail
cd ~/nextcloud
STAMP=$(date +%Y%m%d)
# Maintenance mode keeps the database dump and the data tarball consistent
docker compose exec -T -u www-data nextcloud php occ maintenance:mode --on
docker compose exec -T db mysqldump -u nextcloud -pyour_nextcloud_db_password nextcloud > "nextcloud-$STAMP.sql"
tar -czf "nextcloud-data-$STAMP.tar.gz" data/
docker compose exec -T -u www-data nextcloud php occ maintenance:mode --off
# Upload to remote storage
scp nextcloud-*.* user@backup-server:/backups/
Schedule it in cron: 0 3 * * 0 /home/user/nextcloud-backup.sh (runs every Sunday at 3 AM).
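Those archives accumulate. A small retention helper (my own sketch; the directory and 30-day window are arbitrary choices) can run from the same cron job:

```shell
#!/bin/bash
# prune_backups DIR DAYS: delete nextcloud-* archives older than DAYS days
prune_backups() {
  find "$1" -maxdepth 1 -name 'nextcloud-*' -mtime +"$2" -print -delete
}
mkdir -p "$HOME/nextcloud-backups"   # example directory
prune_backups "$HOME/nextcloud-backups" 30
```

Remember to apply the same retention on the backup server, or it will fill up eventually.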
Keep Nextcloud updated by pulling the latest image regularly:
cd ~/nextcloud
docker compose pull nextcloud
docker compose up -d
Next Steps
With Nextcloud running, you've got the foundation for a private cloud. From here, you can integrate it with other tools—connect it to Jellyfin for photo management, add Collabora Online for real-time document editing, or link it to a home automation hub for file triggers.
If you're running this on a budget, a RackNerd VPS gives you enough horsepower for a few users without breaking the bank. Start small, monitor resource usage, and scale the disk as your data grows.
The real win is owning your data. No more wondering where your files live or what a company's privacy policy really means. It's yours, on disks you control, and if you want encryption at rest, Nextcloud's server-side encryption app can add it.