Install Personal Cloud Storage with Nextcloud
I got tired of paying Google and OneDrive for cloud storage, so I built my own. After six months running Nextcloud on a modest VPS, I'm storing 2TB of files—photos, documents, spreadsheets—all encrypted, all mine. In this guide, I'll walk you through the same setup I use: Docker-based Nextcloud, running on Linux, with proper SSL and a reverse proxy.
Why Self-Host Cloud Storage?
The case is straightforward: you own your data, you control the encryption, and you stop paying recurring SaaS fees. I prefer Nextcloud because it's lightweight compared to Synology or TrueNAS, it runs on commodity hardware (I use a €10/month VPS from RackNerd), and the community is active.
The trade-off? You're responsible for backups, SSL certificates, and keeping the application patched. I've found that manageable with a simple Docker Compose file and 20 minutes of maintenance per month.
Hardware and Hosting Options
Nextcloud will run on almost anything, but here's what I've learned works best:
- Minimum: 2GB RAM, 2 vCPU, 50GB SSD (fine for a single user or small family)
- Recommended: 4GB RAM, 4 vCPU, 200GB+ SSD (handles 5–10 users comfortably)
- At home: Raspberry Pi 4 with external SSD (slow but functional; use for backup only)
For my setup, I chose a budget VPS. If you're looking for reliable, affordable hosting, RackNerd's KVM VPS plans start at $10–15/month with DDoS protection and a 99.5% uptime SLA, and I've found their support responsive.
Prerequisites
Before you begin, you'll need:
- A Linux server (Ubuntu 22.04 LTS or later recommended)
- A domain name pointing to your server (e.g., cloud.example.com)
- SSH access to the server
- Docker and Docker Compose installed
- Basic comfort with the command line
If Docker isn't installed, run:
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker $USER
newgrp docker
Setting Up Nextcloud with Docker Compose
I use Docker Compose because it's reproducible and easy to update. Here's my exact configuration, tested across three different VPS providers.
Create a directory for the project:
mkdir -p ~/nextcloud && cd ~/nextcloud
Now, create the docker-compose.yml file:
cat > docker-compose.yml << 'EOF'
version: '3.8'

services:
  db:
    image: mariadb:11
    container_name: nextcloud-db
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: changeme_root_password
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud_user
      MYSQL_PASSWORD: changeme_db_password
    volumes:
      - db_data:/var/lib/mysql
    networks:
      - nextcloud_network
    command: --transaction-isolation=READ-COMMITTED --binlog-format=ROW

  redis:
    image: redis:7-alpine
    container_name: nextcloud-redis
    restart: always
    networks:
      - nextcloud_network

  nextcloud:
    image: nextcloud:29-apache
    container_name: nextcloud-app
    restart: always
    depends_on:
      - db
      - redis
    environment:
      MYSQL_HOST: db
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud_user
      MYSQL_PASSWORD: changeme_db_password
      REDIS_HOST: redis
      NEXTCLOUD_ADMIN_USER: admin
      NEXTCLOUD_ADMIN_PASSWORD: changeme_admin_password
      NEXTCLOUD_TRUSTED_DOMAINS: cloud.example.com
      OVERWRITEPROTOCOL: https
      OVERWRITEHOST: cloud.example.com
    volumes:
      - nextcloud_data:/var/www/html
      - ./config:/var/www/html/config
    networks:
      - nextcloud_network
    expose:
      - 80

  caddy:
    image: caddy:2-alpine
    container_name: nextcloud-caddy
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
      - caddy_data:/data
      - caddy_config:/config
    networks:
      - nextcloud_network

volumes:
  db_data:
  nextcloud_data:
  caddy_data:
  caddy_config:

networks:
  nextcloud_network:
    driver: bridge
EOF
Next, create the Caddyfile for automatic HTTPS:
cat > Caddyfile << 'EOF'
cloud.example.com {
    reverse_proxy nextcloud:80 {
        header_up X-Forwarded-Proto https
        header_up X-Forwarded-For {remote_host}
        header_up X-Real-IP {remote_host}
    }

    # Increase upload size for large files
    request_body /upload* {
        max_size 50GB
    }

    encode gzip
}
EOF
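Before bringing anything up, generate real passwords for the changeme_ placeholders. A minimal sketch pulling from /dev/urandom (any password manager's generator works just as well; the variable names are only for illustration):

```shell
# Generate two random 32-character passwords from /dev/urandom (sketch).
DB_PASS=$(LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 32)
ADMIN_PASS=$(LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 32)
echo "DB password:    $DB_PASS"
echo "Admin password: $ADMIN_PASS"
```

Paste the database password into both the db and nextcloud service blocks; they must match.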
Replace cloud.example.com with your actual domain in both files, and change all passwords. Don't use the defaults in production—I'm being deliberately obvious about placeholders here.

Start the containers:
docker compose up -d
Check the logs to confirm everything started cleanly:
docker compose logs -f nextcloud
You should see initialization messages. The first boot takes 2–3 minutes. Once complete, visit https://cloud.example.com in your browser and log in with the admin credentials from the compose file—the environment variables already installed Nextcloud, so there's no setup wizard to click through.
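If you script your deployments, you can poll Nextcloud's status.php endpoint instead of watching logs. A small sketch (assumes curl is available; swap in your real domain):

```shell
# Poll a URL until it responds or we give up (sketch).
wait_for() {
  url="$1"; tries="$2"
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -fsS -o /dev/null "$url" 2>/dev/null; then
      echo "up"
      return 0
    fi
    i=$((i + 1))
    sleep 5
  done
  echo "timed out"
  return 1
}

# Usage (with your real domain):
# wait_for "https://cloud.example.com/status.php" 36   # wait up to ~3 minutes
```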
Post-Installation Configuration
After login, I recommend a few tweaks:
1. Enable Caching
Nextcloud is much faster with Redis enabled. Log in as admin, go to Settings > Administration > System, and verify Redis is connected (you'll see a green indicator).
2. Set Up Scheduled Background Tasks
By default, Nextcloud triggers background jobs via AJAX, which only fires while someone has the web UI open. For reliable background tasks, use cron. First, run the job once manually to confirm it works:
docker exec -u www-data nextcloud-app php /var/www/html/cron.php
Then add this to your host's crontab:
*/5 * * * * docker exec -u www-data nextcloud-app php /var/www/html/cron.php > /dev/null 2>&1
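If you provision the server with scripts, the crontab entry can be installed idempotently. A hedged sketch (it dedupes on the "cron.php" string; review your existing crontab before running anything like this):

```shell
# Install the Nextcloud cron entry without duplicating it (sketch).
CRON_LINE='*/5 * * * * docker exec -u www-data nextcloud-app php /var/www/html/cron.php > /dev/null 2>&1'

if command -v crontab >/dev/null 2>&1; then
  # Keep existing entries, drop any old cron.php line, append the current one
  ( crontab -l 2>/dev/null | grep -vF "cron.php"; echo "$CRON_LINE" ) | crontab - \
    || echo "could not install crontab entry"
else
  echo "crontab not found; add the line to your scheduler by hand"
fi
```

Finally, select "Cron" under Settings > Administration > Basic settings so Nextcloud stops relying on AJAX.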
3. Configure External Storage (Optional)
If you have an S3 bucket or NFS share, you can mount it inside Nextcloud. Go to Settings > Administration > External storages. I use this for long-term backups—Nextcloud acts as the interface, but data lives on cheap object storage.
Backup Strategy
Backups are critical. I back up three things daily:
- Database: dump MariaDB to a gzipped SQL file
- Application data: rsync the nextcloud_data volume to secondary storage
- Configuration: keep docker-compose.yml and the Caddyfile in version control
A simple backup script:
#!/bin/bash
set -euo pipefail
BACKUP_DIR="/mnt/backup/nextcloud"
mkdir -p "$BACKUP_DIR"
# Back up the database
docker exec nextcloud-db mariadb-dump -u nextcloud_user -pchangeme_db_password nextcloud | gzip > "$BACKUP_DIR/db_$(date +%Y%m%d_%H%M%S).sql.gz"
# Back up Nextcloud data
rsync -av --delete /var/lib/docker/volumes/nextcloud_nextcloud_data/_data/ "$BACKUP_DIR/data/"
# Clean up old database dumps (keep 30 days)
find "$BACKUP_DIR" -name "db_*.sql.gz" -mtime +30 -delete
Schedule this to run daily via cron.
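A backup you've never restored is a guess. Here's a hedged sketch of the matching database restore (same container names and credentials as the compose file above); run it against a scratch instance before you trust it:

```shell
# Write a database restore script and sanity-check its syntax (sketch).
cat > restore.sh << 'EOF'
#!/bin/bash
set -euo pipefail
BACKUP_FILE="$1"   # e.g. a db_*.sql.gz file from the backup directory

# Stop the app so nothing writes during the restore (run from ~/nextcloud)
docker compose stop nextcloud

# Stream the dump back into MariaDB
gunzip -c "$BACKUP_FILE" | docker exec -i nextcloud-db mariadb -u nextcloud_user -pchangeme_db_password nextcloud

docker compose start nextcloud
EOF
chmod +x restore.sh

# Confirm the script at least parses before relying on it
bash -n restore.sh && RESTORE_SYNTAX=ok
```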
Connecting Clients
Once everything is live, connect from:
- Desktop: Download the Nextcloud Desktop Client for Windows, macOS, or Linux. Configure it to sync folders bidirectionally.
- Mobile: Use the Nextcloud app (iOS/Android) to browse files and upload photos.
- WebDAV: Mount Nextcloud as a network drive on Windows or macOS using WebDAV.
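The WebDAV endpoint follows a fixed pattern: /remote.php/dav/files/<username>/ on your domain. A quick sketch of building it (the curl upload line is commented out; "admin" is just the example user from the compose file):

```shell
# Build a Nextcloud WebDAV URL for a given user (sketch).
NC_HOST="cloud.example.com"
NC_USER="admin"
WEBDAV_URL="https://${NC_HOST}/remote.php/dav/files/${NC_USER}/"
echo "$WEBDAV_URL"

# Example upload with an app password (not run here):
# curl -u "$NC_USER:app-password" -T report.pdf "$WEBDAV_URL"
```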
Performance Tuning
After running Nextcloud for a month, you'll notice performance patterns. Here's what I've tuned:
- PHP memory limit: Increase to 512M for large file uploads
- Upload size limit: The max_size directive in the Caddyfile lets large uploads through the proxy without errors
- Database indexing: Run occ db:add-missing-indices monthly
Run occ commands through the container. For example, the index check above:
docker exec -u www-data nextcloud-app php occ db:add-missing-indices
For the PHP memory limit, set PHP_MEMORY_LIMIT: 512M in the nextcloud service's environment block and restart; the official image reads this variable at startup.
Security Hardening
Before exposing Nextcloud to the internet:
- Enable Two-Factor Authentication: Settings > Personal > Security
- Set up app passwords: For mobile/desktop clients, use app-specific passwords instead of your main password
- Configure firewall rules: Allow only your IP if this is private; otherwise, expose nothing but ports 80 and 443
- Enable HSTS: Add a Strict-Transport-Security header in the Caddyfile site block so browsers enforce HTTPS (Caddy handles TLS automatically but doesn't send HSTS by default)
I also monitor failed login attempts with fail2ban. Add a filter for Nextcloud auth failures and ban IPs after 5 failed attempts.
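For reference, here's the shape of the filter and jail I use, adapted from the fail2ban example in the Nextcloud admin manual. The logpath assumes the Docker volume layout used in the backup script; double-check it on your host:

```ini
# /etc/fail2ban/filter.d/nextcloud.conf
[Definition]
_groupsre = (?:(?:,?\s*"\w+":(?:"[^"]+"|\w+))*)
failregex = ^\{%(_groupsre)s,?\s*"remoteAddr":"<HOST>"%(_groupsre)s,?\s*"message":"Login failed:
            ^\{%(_groupsre)s,?\s*"remoteAddr":"<HOST>"%(_groupsre)s,?\s*"message":"Trusted domain error.
datepattern = ,?\s*"time"\s*:\s*"%%Y-%%m-%%d[T ]%%H:%%M:%%S(%%z)?"

# /etc/fail2ban/jail.d/nextcloud.local
[nextcloud]
enabled  = true
port     = 80,443
protocol = tcp
filter   = nextcloud
maxretry = 5
bantime  = 14400
findtime = 14400
logpath  = /var/lib/docker/volumes/nextcloud_nextcloud_data/_data/data/nextcloud.log
```

Reload fail2ban after adding these and watch fail2ban-client status nextcloud for the first few days.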
Monitoring and Maintenance
Set a monthly reminder to:
- Check for Nextcloud updates: bump the image tag in docker-compose.yml, then run docker compose pull && docker compose up -d (Nextcloud upgrades one major version at a time, so don't skip releases)
- Review database size: docker exec nextcloud-db du -sh /var/lib/mysql
- Test a restore from backup (at least quarterly)
- Check disk space on the host: df -h
I've found that Nextcloud is stable; most maintenance is just keeping the OS and Docker images current.
Conclusion
You now have a private, self-hosted cloud storage system that rivals Google Drive or Dropbox, except you control every aspect. The Docker Compose approach makes it portable—if you ever want to migrate to another VPS, you can redeploy in minutes.
Next steps: Set up automated backups (don't skip this), add a user for family members, and sync your important folders. After a week of real use, you'll find pain points specific to your workflow—come back and optimize those.
If you're running this on a cheap VPS, RackNerd's KVM plans have been reliable for me. Their support has answered questions within hours, and I've never had unplanned downtime.