Goal
AWS Glacier offers a low cost solution for archiving files. Here is how I implemented it to back up my server's files.
Approach with Docker, first try
I try to have a server at home where every service is dockerized. The only tools that I still use on the host are:
- midnight commander: not that I really need it, but it reminds me of my dad
- ssh: to have remote access to the host
- crontab: in order to schedule docker run commands
So it's important for me to have a dockerized solution. Fortunately, the AWS client is integrated into the elasticms Docker image.
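The docker run command below mounts /home/theus/docker/aws as the container's .aws directory. It simply holds the two standard AWS CLI files; a minimal sketch, with placeholder keys and an assumed region, looks like this:

# /home/theus/docker/aws/credentials
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# /home/theus/docker/aws/config
[default]
region = eu-west-1
output = json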
Here is my first try:
docker run --rm -it -v /home/theus/docker/aws:/home/default/.aws:ro -v /mnt/backups/theus:/data elasticms/admin aws s3 sync --storage-class GLACIER /data s3://septune
The initial upload of about 240GB cost me 51$, and the monthly cost for that volume of data is about 0.50$/month, which is nice.
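If you want to double-check that the objects really landed in the GLACIER storage class, you can ask S3 for one of them; a quick sketch using the same image (the object key here is just an example):

docker run --rm -v /home/theus/docker/aws:/home/default/.aws:ro elasticms/admin aws s3api head-object --bucket septune --key dumps/save.bin

The response should contain "StorageClass": "GLACIER".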
Optimization
Obviously, in my case, it's useless
- to keep deleted files
- to activate the interactive mode
docker run --rm -v /home/theus/docker/aws:/home/default/.aws:ro -v /mnt/backups/theus:/data elasticms/admin aws s3 sync --delete --storage-class GLACIER /data s3://septune
Schedule it
Here is my crontab command (crontab -e):
0 6 * * * docker run --rm -v /home/theus/docker/aws:/home/default/.aws:ro -v /mnt/backups/theus:/data elasticms/admin aws s3 sync --delete --storage-class GLACIER /data s3://septune 2>>~/cron_error.log
Tips
While it's easy to save my Docker context, I also have to back up the rest of my configuration.
Backup my router config
In your router's admin interface it's possible to back up the config. You should find a download config link somewhere.
But, and I'm probably not the only one, I usually forget to back up my config when I update it. So when I need it, the last version I have is obsolete.
If you open your browser's debug toolbar (at least in Chrome) just before clicking on the backup config link, you'll be able to save the request in a curl format: Right click > Copy > Copy as cURL.
You should have something like this in your clipboard:
curl 'http://192.168.0.1/nvrambak.bin' \
-H 'Connection: keep-alive' \
-H 'Pragma: no-cache' \
-H 'Cache-Control: no-cache' \
-H 'Authorization: Basic dZZZZZZZZZZ3bDEzNzc=' \
--compressed \
--insecure \
--output /mnt/backups/theus/dumps/save.bin
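For the record, that Basic Authorization header is nothing more than user:password encoded in base64, so you can regenerate it yourself if your router credentials change (the values here are placeholders):

echo -n 'admin:my-router-password' | base64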
Based on that, you can write this little bash script:
#!/bin/bash
curl 'http://192.168.0.1/nvrambak.bin' \
-H 'Connection: keep-alive' \
-H 'Pragma: no-cache' \
-H 'Cache-Control: no-cache' \
-H 'Authorization: Basic dZZZZZZZZZZ3bDEzNzc=' \
--compressed \
--insecure \
--output /mnt/backups/theus/dumps/router_ddwrt_$(date +%u).bin
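The crontab entry below assumes the script is saved in ~/.local/bin and is executable, so something like this first (the location is just my personal convention):

mkdir -p ~/.local/bin
cp router_backup.sh ~/.local/bin/router_backup.sh
chmod +x ~/.local/bin/router_backup.sh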
It's up to you to schedule it periodically. An example with crontab:
18 * * * * ~/.local/bin/router_backup.sh 2>>~/router_backup_error.log
Backup my crontab config
In case of a disaster I won't have to reinvent my crontab config:
crontab -l > /mnt/backups/theus/crontab.bak
The same command, in my crontab config:
0 5 * * * crontab -l > /mnt/backups/theus/crontab.bak 2>>~/cron_error.log
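And in case of that disaster, restoring it is a one-liner (assuming the backup file is still reachable):

crontab /mnt/backups/theus/crontab.bak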