One of the best and cheapest options I have found for backing up data is AWS S3. It is cheap, reasonably fast, and easy to manage from the command line with shell scripts or PowerShell.
Below are the steps to move daily generated logs to AWS S3. We will use s3cmd, a Linux-based S3 client, to copy the data to S3.
To install s3cmd:
- As the superuser (root), go to /etc/yum.repos.d
- Download the s3tools.repo file for your distribution (repo files for the various distributions are hosted at s3tools.org). For instance, if you are on CentOS 6.x: wget http://s3tools.org/repo/RHEL_6/s3tools.repo
- Run yum install s3cmd
Then run s3cmd --configure to add your access key and secret key for the S3 bucket.
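s3cmd --configure prompts for the keys interactively and writes them to ~/.s3cfg. The relevant part of that file looks roughly like this (placeholder values, not real credentials):

```ini
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
```

Keep this file readable only by your user, since it holds the credentials in plain text.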
Next, copy all the log files using the following syntax:
s3cmd put log* s3://prod-logs/
Here 'log' is the prefix of the log file names and prod-logs is my bucket name.
Next, if you want to remove the logs that were moved to the S3 bucket, run:
rm -f log*
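Deleting the files without confirming the upload succeeded risks losing logs, so a safer pattern chains the two commands with &&. Here is a minimal, runnable sketch of that flow; it uses local directories (/tmp/logs-demo and /tmp/fake-bucket, both assumptions for demonstration) in place of the real bucket so it runs without AWS credentials:

```shell
#!/bin/sh
# Sketch: remove logs only after a successful upload. A local
# directory stands in for the S3 bucket so this runs without AWS
# credentials; in production, replace the cp line with:
#   s3cmd put log* s3://prod-logs/
mkdir -p /tmp/fake-bucket /tmp/logs-demo
cd /tmp/logs-demo
touch log1 log2
if cp log* /tmp/fake-bucket/; then
    rm -f log*          # only reached when the copy succeeded
    echo "uploaded and removed"
else
    echo "upload failed; local logs kept" >&2
fi
```

With s3cmd the same idea collapses to a one-liner: `s3cmd put log* s3://prod-logs/ && rm -f log*`.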
If you want to turn this into a batch script that moves the files to the S3 bucket daily, follow these steps:
Write a shell script that copies the logs to S3 and removes the local copies:
#!/bin/sh
# Upload the day's logs, then delete the local copies only if the
# upload succeeded.
s3cmd put log* s3://prod-logs/ && rm -f log*
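One thing to watch: if the log file names repeat from day to day, each run will overwrite yesterday's objects in the bucket. A dated key prefix (an assumption about your desired layout, not something from the steps above) keeps each day's upload separate:

```shell
#!/bin/sh
# Sketch: build a per-day S3 prefix so each run lands under its own
# "folder" in the bucket. prod-logs is the bucket name used above.
PREFIX="s3://prod-logs/$(date +%Y-%m-%d)/"
echo "$PREFIX"
# In the script above you would then use:
#   s3cmd put log* "$PREFIX" && rm -f log*
```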
Make the script executable:
$ chmod +x ScriptName.sh
Run it at midnight every day by adding a cron job:
$ crontab -e
59 23 * * * /ScriptName.sh
Save the cron job.
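Two things worth checking in the crontab entry: cron runs with a minimal environment, so give the script's full path, and redirecting its output to a log file makes failures visible. A hedged example (the script and log paths here are assumptions; use your own):

```
59 23 * * * /usr/local/bin/ScriptName.sh >> /var/log/s3-backup.log 2>&1
```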
The above will move all your daily logs to S3 every night.