Move Linux logs to AWS S3

One of the best and cheapest places I have found for backing up data is AWS S3. It's cheap, reasonably fast, and easy to manage from the command line, shell scripts, and PowerShell.

Below are the steps to move daily generated logs to AWS S3. We will use s3cmd, a Linux-based S3 client, to copy the data to S3.

To install s3cmd, follow these steps:

  1. As a superuser (root), go to /etc/yum.repos.d.
  2. Download the s3tools.repo file for your distribution from the s3tools project site — for instance with wget, if you're on CentOS 6.x.
  3. Run yum install s3cmd.
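Sketched as commands, the steps above look like this (CentOS 6.x assumed; the .repo URL, which the text leaves out, is shown as a placeholder):

```
# Run as root
cd /etc/yum.repos.d
# Fetch the s3tools.repo file for your distribution
# (substitute the real URL from the s3tools project site)
wget <s3tools-repo-url>
yum install s3cmd
```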

Then run s3cmd --configure and enter the access key and secret key for your S3 account.
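For reference, s3cmd --configure writes those keys to ~/.s3cfg. A minimal sketch of that file (the values here are placeholders, not real keys) looks like:

```
# ~/.s3cfg -- written by s3cmd --configure (placeholder values)
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
```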

Next, copy all the log files using the following syntax:

s3cmd put log* s3://prod-logs/

Here, ‘log’ is the prefix of the log file names and prod-logs is my bucket name.
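To be clear about what log* means: the shell, not s3cmd, expands it to every file in the current directory whose name begins with log. A quick sketch in a scratch directory (the file names here are made up for illustration):

```shell
# Create a scratch directory with a few sample files
tmpdir=$(mktemp -d)
cd "$tmpdir"
touch log-2024-01-01.txt log-2024-01-02.txt access.log

# The shell expands log* before the command ever runs; note that
# access.log does NOT match -- 'log' must be a prefix, not a suffix.
matches=$(echo log*)
echo "$matches"
# prints: log-2024-01-01.txt log-2024-01-02.txt
```

The same expansion feeds both s3cmd put and rm below, which is why the two commands operate on exactly the same set of files.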

Next, if you want to remove the logs that were moved to the S3 bucket, use the following:

rm -rf log*


If you want to turn this into a batch script that moves the files to the S3 bucket daily, follow the steps below.

Write a shell script that copies the logs to S3 and removes the local copies:


#!/bin/bash

s3cmd put log* s3://prod-logs/

rm -rf log*
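One caveat with the script above: if the upload fails, rm still deletes the only copy of the logs. A sketch of a slightly safer version, with the delete guarded by the upload's exit status (the bucket name and log* pattern are from the steps above; the function name and directory argument are my own):

```shell
#!/bin/bash
# Sketch: copy logs to S3, delete the local copies only on success.

move_logs() (
    # Run in a subshell so the cd does not leak to the caller
    cd "$1" || exit 1
    if s3cmd put log* s3://prod-logs/; then
        rm -f log*    # delete only after a successful upload
    else
        echo "upload failed; keeping logs in $1" >&2
        exit 1
    fi
)
```

The script body would end with a plain call such as move_logs /var/log/myapp (path hypothetical — use wherever your logs live).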

Make the script executable:

$ chmod +x

Run it at midnight every day by adding a cron job:

$ crontab -e

59 23 * * * /


Save the crontab entry.
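Putting it together, the finished crontab line might look like this; the script path is a hypothetical example (the text above leaves it unspecified), so substitute wherever you saved your script:

```
# Runs every day at 23:59 (fields: minute hour day-of-month month day-of-week)
59 23 * * * /opt/scripts/move-logs-to-s3.sh
```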


The steps above will move all your daily logs to S3.