Automated system to back up Azure Storage containers to Google Drive and restore them when needed.
- Azure Storage Account:
- Account Name
- Account Key
- Container Name(s)
- Google Drive:
- Shared Drive set up
- OAuth 2.0 Client ID credentials
To create credentials.json:
- Go to Google Cloud Console
- Create a new project or select an existing one
- Enable the Google Drive API:
  - Go to "APIs & Services" > "Library"
  - Search for "Google Drive API"
  - Click "Enable"
- Configure the OAuth consent screen:
  - Go to "APIs & Services" > "OAuth consent screen"
  - Select "External" user type
  - Fill in the application name and other required fields
  - Add the scopes https://www.googleapis.com/auth/drive.file and https://www.googleapis.com/auth/drive
  - Add your email as a test user
- Create an OAuth client ID:
  - Go to "APIs & Services" > "Credentials"
  - Click "Create Credentials" > "OAuth client ID"
  - Select "Desktop application" as the application type
  - Give it a name
  - Click "Create"
- Download the credentials:
  - After creation, click the download button (JSON format)
  - Rename the downloaded file to credentials.json
  - Place it in the project root directory
- Create an Azure Storage account or use an existing one
- Get the following information:
  - Storage Account Name
  - Storage Account Key
  - Container Name
- For the restore service (target storage):
  - Create another storage account if needed
  - Get the same information as above
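The account name and key collected above are what Azure SDKs combine into a standard storage connection string. As an illustration (a minimal sketch, not part of this project's code), the well-known connection string format looks like this:

```python
def azure_connection_string(account_name: str, account_key: str) -> str:
    """Build a standard Azure Storage connection string from the
    account name and account key collected in the steps above."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )

# Example with the placeholder values used later in .env
print(azure_connection_string("source_account", "source_key"))
```

Any tooling that accepts a connection string (rather than separate name/key fields) can be fed this value directly.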
- Environment setup:
cp .env.example .env
- Configure .env:
# Source Azure (for backup)
AZURE_ACCOUNT_NAME=source_account
AZURE_ACCOUNT_KEY=source_key
AZURE_CONTAINER_NAME=ALL # "ALL" or specific container
# Target Azure (for restore)
TARGET_AZURE_ACCOUNT_NAME=target_account
TARGET_AZURE_ACCOUNT_KEY=target_key
TARGET_AZURE_CONTAINER_NAME=ALL
# Google Drive
GOOGLE_SHARED_DRIVE_ID=your_drive_id
GOOGLE_FOLDER_ID=optional_folder_id
# Backup Schedule (cron format)
BACKUP_SCHEDULE="0 1 * * *" # 1 AM daily
BACKUP_RETENTION_DAYS=7
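BACKUP_SCHEDULE uses standard 5-field cron syntax (minute, hour, day-of-month, month, day-of-week). A rough sanity check of that format, as a sketch only (it deliberately ignores ranges, lists, and step values, which real cron accepts):

```python
def is_valid_cron(expr: str) -> bool:
    """Very rough sanity check for a 5-field cron expression
    (minute hour day-of-month month day-of-week).
    Ranges, lists, and steps are not handled in this sketch."""
    fields = expr.split()
    if len(fields) != 5:
        return False
    ranges = [(0, 59), (0, 23), (1, 31), (1, 12), (0, 7)]
    for field, (lo, hi) in zip(fields, ranges):
        if field == "*":
            continue
        if not (field.isdigit() and lo <= int(field) <= hi):
            return False
    return True

print(is_valid_cron("0 1 * * *"))  # the 1 AM daily schedule from .env
```

A check like this can catch a malformed schedule before the backup service silently never fires.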
# Run token generator
docker-compose run --rm token-generator
# Follow the instructions:
# 1. Open provided URL in browser
# 2. Login and authorize the application
# 3. Copy authorization code
# 4. Paste code back in terminal
# Run backup service (follows schedule)
docker-compose up -d backup-service
# Check logs
docker-compose logs -f backup-service
# Restore the latest backup
docker-compose run --rm restore-service
# Restore a specific date
docker-compose run --rm restore-service -date="2023-11-14"
# Check logs
docker-compose logs restore-service
- Incremental backup (only changed files)
- Multiple containers support
- Compression before upload
- Retention policy
- Progress tracking
- Detailed logging
- Automatic cleanup
- Full or specific container restore
- Date-based restore
- Automatic container creation
- Concurrent file processing
- Progress monitoring
- Atomic operations
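The incremental backup feature above (only changed files) can be sketched with a manifest from the previous run: compare what the source container reports now against what was recorded last time, and upload only the difference. ETag-based change detection is an assumption of this sketch, not a statement about the project's actual mechanism:

```python
def changed_blobs(remote: dict[str, str], manifest: dict[str, str]) -> list[str]:
    """Return blob names whose ETag differs from the previous backup's
    manifest, or that are new -- i.e. the incremental upload set.
    `remote` and `manifest` both map blob name -> ETag."""
    return [name for name, etag in remote.items()
            if manifest.get(name) != etag]

manifest = {"a.txt": "etag1", "b.txt": "etag2"}
remote = {"a.txt": "etag1", "b.txt": "etag3", "c.txt": "etag4"}
print(changed_blobs(remote, manifest))  # b.txt changed, c.txt is new
```

Unchanged blobs (like a.txt here) are skipped entirely, which is what keeps incremental runs cheap.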
# Set log level in .env:
LOG_LEVEL=debug # debug, info, warn, error
# View logs:
docker-compose logs -f backup-service
docker-compose logs -f restore-service
- Security:
- Use separate storage accounts for backup/restore
- Rotate access keys regularly
- Secure credentials.json and token.json
- Monitoring:
- Check logs regularly
- Monitor disk space
- Verify backup success
- Testing:
- Test restore process periodically
- Verify file integrity
- Check backup retention
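Checking backup retention boils down to comparing each set's date against BACKUP_RETENTION_DAYS. A sketch of that calculation, again assuming ISO-date-named backup sets (an assumption of this sketch):

```python
from datetime import date, timedelta

def expired_backups(names: list[str], today: date,
                    retention_days: int) -> list[str]:
    """Backup sets older than the retention window, due for cleanup.
    `names` holds ISO-date set names (naming assumed)."""
    cutoff = today - timedelta(days=retention_days)
    return [n for n in names if date.fromisoformat(n) < cutoff]

names = ["2023-11-01", "2023-11-10", "2023-11-14"]
# With BACKUP_RETENTION_DAYS=7 as in .env:
print(expired_backups(names, date(2023, 11, 14), 7))
```

Running a check like this by hand is one way to verify the automatic cleanup is actually honoring the configured retention.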
- Token Issues:
# Regenerate token
rm token.json
docker-compose run --rm token-generator
- Azure Issues:
- Verify account credentials
- Check container permissions
- Ensure sufficient quota
- Backup Failures:
- Check source storage access
- Verify sufficient disk space
- Review error logs
MIT