Replicate an AWS S3 bucket to another S3 bucket

To replicate one S3 bucket to another, one of the easiest ways is to use the s3cmd command.

To install this command, use:
apt-get install s3cmd

After installation, run the following command to set up s3cmd. It will also ask for your access and secret keys.
s3cmd --configure

Then run the following commands to create a new backup bucket and replicate from the existing one.
s3cmd mb s3://mybucket_backup
s3cmd cp --recursive s3://mybucket s3://mybucket_backup
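The two commands above can be wrapped in a small script. Below is a sketch, assuming s3cmd is already configured and using the bucket names from the example; the replicate function name is my own:

```shell
#!/bin/sh
# Sketch: replicate one bucket into a backup bucket with s3cmd.
# Assumes s3cmd is already configured ("s3cmd --configure").
replicate() {
    s3cmd mb "$2"                       # errors harmlessly if the bucket already exists
    s3cmd cp --recursive "$1" "$2"      # copy every object from source to destination
}

# Only attempt the copy when s3cmd is actually installed.
if command -v s3cmd >/dev/null 2>&1; then
    replicate s3://mybucket s3://mybucket_backup
fi
```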

Move Linux logs to AWS S3

One of the best and cheapest places I have found for backing up data is AWS S3. It's cheap, reasonably fast, and easy to manage using the command line, shell scripts, and PowerShell.

Below are the steps to move daily generated logs to AWS S3. We will use s3cmd, a Linux-based S3 client, to copy the data.

To install s3cmd, follow these steps:

  1. As a superuser (root), go to /etc/yum.repos.d
  2. Download the s3tools.repo file for your distribution from s3tools.org. For instance, run  wget http://s3tools.org/repo/RHEL_6/s3tools.repo  if you're on CentOS 6.x
  3. Run yum install s3cmd

Then run s3cmd --configure to add your access key and secret key for the S3 bucket.

Next, copy all the log files using the following syntax:

s3cmd put log* s3://prod-logs/

Here 'log' is the prefix for the log files and prod-logs is my bucket name.
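If the logs are generated daily, it can help to give each day's upload its own folder in the bucket. This is a sketch of that idea; the date-stamped prefix is my own addition, while prod-logs is the bucket name from the example above:

```shell
#!/bin/sh
# Sketch: build a date-stamped destination so each day's logs land in
# their own folder. "prod-logs" is the bucket from the article.
DEST="s3://prod-logs/$(date +%Y-%m-%d)/"
echo "$DEST"
# then upload with: s3cmd put log* "$DEST"
```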

Next, if you want to remove the logs that were copied to the S3 bucket (s3cmd put copies rather than moves, so the local files remain), delete them with:

rm -f log*
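Deleting immediately after the upload is risky if the upload fails partway. A safer pattern is to check s3cmd's exit status and only delete on success; a sketch, assuming s3cmd is configured (the upload_and_clean function name is my own, prod-logs is the article's bucket):

```shell
#!/bin/sh
# Sketch: delete local logs only after s3cmd reports a successful upload.
# Assumes s3cmd is configured; "prod-logs" is the article's bucket name.
upload_and_clean() {
    if s3cmd put "$@" s3://prod-logs/; then
        rm -f "$@"    # remove only the files we just uploaded
    else
        echo "upload failed; keeping local files" >&2
    fi
}

# usage (when s3cmd is installed): upload_and_clean log*
```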


If you want to turn this into a script that moves the files to the S3 bucket daily, follow the steps below.

Write a shell script that copies the logs to S3 and removes them locally:

#!/bin/sh
# Upload the logs, then delete them locally only if the upload succeeded.
if s3cmd put log* s3://prod-logs/; then
    rm -f log*
fi

Make the script executable:

$ chmod +x ScriptName.sh

Run it just before midnight every day by adding a cron job:

$ crontab -e

59 23 * * * /ScriptName.sh


Save the cron job.
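For troubleshooting, it can also help to capture the script's output. As a sketch (the log file path here is only an example), the same cron entry can redirect stdout and stderr to a file:

```shell
# Same schedule as above, but appending all output to a log file
# (/var/log/s3-log-upload.log is an example path).
59 23 * * * /ScriptName.sh >> /var/log/s3-log-upload.log 2>&1
```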


The above will move all your daily logs to S3.