Replicate AWS S3 to another S3 bucket

To replicate one S3 bucket to another, one of the easiest and best ways is to use the s3cmd command.

To install this command on a Debian-based system, use:
apt-get install s3cmd

After installation, run the following command to set up s3cmd; it will also ask for your access and secret keys.
s3cmd --configure

Then run the following commands to create a new backup bucket and replicate from the existing one.
s3cmd mb s3://mybucket_backup
s3cmd --recursive cp s3://mybucket s3://mybucket_backup
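
If you refresh the backup regularly, a plain cp re-copies everything each time. Recent versions of s3cmd also support syncing bucket to bucket, which skips objects already present in the destination; a minimal sketch, using the same example bucket names as above:

s3cmd sync s3://mybucket s3://mybucket_backup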


Azure File Service on Linux

You can now use the Azure File service to create large shares and mount them on a Linux server. When I was testing, it was available for many Windows versions but only a few versions of Linux.

Linux VMs deployed on Azure can make use of this service via the Linux kernel CIFS client. The kernel client must be configured to support and use the SMB 2.1 protocol dialect:

  1. CONFIG_CIFS_SMB2 must be enabled in the kernel configuration at build time. To check this on a running system, use
# zcat /proc/config.gz | grep CONFIG_CIFS_SMB2
     (on kernels without /proc/config.gz, grep /boot/config-$(uname -r) instead).
  2. The vers=2.1 mount.cifs parameter must be provided at mount time.

Furthermore, the Azure storage account name and access key must be provided as the username and password.

# mount.cifs -o vers=2.1,user=smb //smb.file.core.windows.net/share /share/
Password for smb@//smb.file.core.windows.net/share: ******…
# df -h /share/
Filesystem                         Size  Used Avail Use% Mounted on
//smb.file.core.windows.net/share  5.0T     0  5.0T   0% /share
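
To make the mount survive reboots, the share can be added to /etc/fstab with the account credentials kept in a root-only file. A minimal sketch, using the same hypothetical storage account (smb) and share as above. In /etc/smb-credentials (protect it with chmod 600):

username=smb
password=<storage account access key>

And the matching /etc/fstab entry:

//smb.file.core.windows.net/share /share cifs vers=2.1,credentials=/etc/smb-credentials,_netdev 0 0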

Move Linux logs to AWS S3

One of the best and cheapest places I have found for backing up data is AWS S3. It's cheap, reasonably fast, and easy to manage using the command line, shell scripts, and PowerShell.

Below are the steps to move daily generated logs to AWS S3. We will be using s3cmd, a Linux-based S3 client, to copy the data to S3.

To install s3cmd on an RPM-based system, follow these steps; the full command sequence is shown after the list.

  1. As a superuser (root), go to /etc/yum.repos.d.
  2. Download the s3tools.repo file for your distribution; the .repo files for each distribution are listed on s3tools.org. For instance, wget http://s3tools.org/repo/RHEL_6/s3tools.repo if you're on CentOS 6.x.
  3. Run yum install s3cmd.
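
Put together, the install sequence looks like this (assuming CentOS 6.x, as in the example above):

# cd /etc/yum.repos.d
# wget http://s3tools.org/repo/RHEL_6/s3tools.repo
# yum install s3cmd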

Then run s3cmd --configure to add your access key and secret key for the S3 bucket.

Next, copy all the log files to the bucket using the following syntax:

s3cmd put log* s3://prod-logs/

Here 'log' is the prefix for the log files and prod-logs is my bucket name.
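
If you would rather group each day's upload under its own prefix, s3cmd will put into a deeper path just as happily. A hypothetical variant of the same command, using the current date as a folder-style prefix:

s3cmd put log* s3://prod-logs/$(date +%Y-%m-%d)/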

Next, if you want to remove the local copies of the logs that were moved to the S3 bucket, use the below:

rm -rf log*


If you want to turn this into a script that moves the files to the S3 bucket daily, follow the steps below.

Write a shell script to do the copy/move to S3

#!/bin/sh

# Upload the logs, then delete the local copies only if the upload succeeded
s3cmd put log* s3://prod-logs/ && rm -f log*

Make the script executable

$ chmod +x ScriptName.sh

Run it every night just before midnight (23:59) by adding a cron job

$ crontab -e

59 23 * * * /ScriptName.sh
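
To keep a record of each run, the same cron line can redirect the script's output to a log file (the log path here is just an example):

59 23 * * * /ScriptName.sh >> /var/log/s3-log-backup.log 2>&1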

Save the cron job and exit the editor.

The above will move all your daily logs to S3.