Deploying a Web Deploy Package to AWS Elastic Beanstalk

AWS provides a Visual Studio extension that makes interacting with your AWS services easy, including deploying to an Elastic Beanstalk environment, which is the recommended way of deploying to Beanstalk.

This works great, and if you are able to, you should obviously use the recommended approach, but there may be times you don’t have the extension available, or you already have a build system set up to use Web Deploy packages. As far as I can tell, Beanstalk just uses MSDeploy packages under the hood, making it easy to deploy these without the extension!

1. Create a Package

If you don’t already have a Web Deploy package, create one. This is simple in Visual Studio: open your solution, right-click the Web Application project and select Publish.

This will open the Publish Web dialog:


Select Custom, and give your profile a name (for example the Beanstalk environment name). On the Connection screen, change the Publish method to Web Deploy Package.

Enter Default Web Site for the Site name, and choose a location on your machine to create the package.

Check the Settings are correct on the next screen, confirm your publish location on the Preview screen, and then click Publish.

Navigate to the folder where the package was created; you should see five files, but the only one of interest for this case is the ZIP file.


2. Deploy the Package

Browse to your environment in the AWS Management Console.

Select Upload and Deploy.


Choose the ZIP file created earlier, and give this version a label (labels must be unique within the application).

Clicking Deploy will start deploying the code to this environment; you can monitor the status of your deployment in the logs in the Console.

If everything goes to plan, you should see a message in the logs saying "New application version was deployed to running EC2 instances."

Next Steps

Just as deployments made with the AWS Visual Studio Toolkit can be automated using the Deployment Tool command line program, the steps above can be automated too.

The package can be created using MSBuild and the Package target.
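
For example, a package can be built on a build server; assuming a project file called MyWebApp.csproj (the project name, configuration and output path here are placeholders), something like the following should produce the same ZIP:

msbuild MyWebApp.csproj /t:Package /p:Configuration=Release /p:PackageLocation=C:\packages\MyWebApp.zip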

The deployment to AWS can be automated using either the AWS CLI or the AWS PowerShell tools.
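
As a rough sketch with the AWS CLI (the bucket, application, environment and version names are placeholders for your own), the package can be uploaded to S3, registered as a new application version, and then deployed to the environment:

aws s3 cp C:\packages\MyWebApp.zip s3://my-deploy-bucket/MyWebApp-v1.zip
aws elasticbeanstalk create-application-version --application-name MyWebApp --version-label v1 --source-bundle S3Bucket=my-deploy-bucket,S3Key=MyWebApp-v1.zip
aws elasticbeanstalk update-environment --environment-name MyWebApp-prod --version-label v1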

Minimum configuration to use CodeDeploy with .NET MVC in AWS

Trying CodeDeploy for the first time in AWS and looking for the minimum setup to get going? Using CodeDeploy in AWS needs three steps at a minimum:

  1. Install the CodeDeploy agent on the target instances.
  2. Set up an appspec.yml file to tell CodeDeploy how to deploy, where the agent should put the files, and so on.
  3. Set up CodeDeploy in the console to trigger the push to the target.

To set up the CodeDeploy agent on the target instance where you wish to deploy code, use the following commands (for Windows Server, as we are targeting MVC on the IIS web server):

Set-ExecutionPolicy RemoteSigned
Import-Module AWSPowerShell

Set-AWSCredentials -AccessKey 'xxxxx' -SecretKey 'xxxx'
Initialize-AWSDefaults

(there are better ways to handle credentials, but just going with this for the demo)

New-Item -Path "c:\temp" -ItemType "directory" -Force
Read-S3Object -BucketName bucket-name -Key latest/codedeploy-agent.msi -File c:\temp\codedeploy-agent.msi -AccessKey 'xxxxx' -SecretKey 'xxxxxx'
c:\temp\codedeploy-agent.msi /quiet /l c:\temp\host-agent-install-log.txt
Get-Service -Name codedeployagent

bucket-name represents one of the following:

  • aws-codedeploy-us-east-1 for instances in the US East (N. Virginia) region
  • aws-codedeploy-us-east-2 for instances in the US East (Ohio) region
  • aws-codedeploy-us-west-1 for instances in the US West (N. California) region
  • aws-codedeploy-us-west-2 for instances in the US West (Oregon) region
  • aws-codedeploy-ca-central-1 for instances in the Canada (Central) region
  • aws-codedeploy-eu-west-1 for instances in the EU (Ireland) region
  • aws-codedeploy-eu-west-2 for instances in the EU (London) region
  • aws-codedeploy-eu-central-1 for instances in the EU (Frankfurt) region
  • aws-codedeploy-ap-northeast-1 for instances in the Asia Pacific (Tokyo) region
  • aws-codedeploy-ap-northeast-2 for instances in the Asia Pacific (Seoul) region
  • aws-codedeploy-ap-southeast-1 for instances in the Asia Pacific (Singapore) region
  • aws-codedeploy-ap-southeast-2 for instances in the Asia Pacific (Sydney) region
  • aws-codedeploy-ap-south-1 for instances in the Asia Pacific (Mumbai) region
  • aws-codedeploy-sa-east-1 for instances in the South America (São Paulo) region

If the AWS CodeDeploy agent is installed and running, after the Get-Service command call, you should see output similar to the following:


Status   Name                DisplayName
------   ----                -----------
Running  codedeployagent    CodeDeploy Host Agent Service

Use the code below to make a simple appspec.yml file and place it in the root of the ZIP file along with the entire published code. Make sure you published your code using the File System publish method, not the Web Deploy package used for the Beanstalk deployments above.

Just upload this ZIP to S3 and use CodeDeploy to push the files to the target (a command line alternative is sketched further below).

version: 0.0
os: windows
files:
  - source: \
    destination: c:\inetpub\wwwroot\

My final ZIP file had appspec.yml at the root, alongside the published site files, and it deployed successfully.
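
If you would rather trigger the push from a script instead of the console, the deployment can also be created from the command line. A minimal sketch with the AWS CLI, assuming the CodeDeploy application, deployment group and bucket names below are placeholders for ones you have already set up:

aws deploy create-deployment --application-name MyMvcApp --deployment-group-name Production --s3-location bucket=my-deploy-bucket,key=MyMvcApp.zip,bundleType=zip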


Replicate an AWS S3 bucket to another S3 bucket

To replicate one S3 bucket to another, one of the easiest and best ways is to use the s3cmd command.

To install this command use:
apt-get install s3cmd

After installation, run the following command to set up s3cmd. This will also ask for your access and secret keys:
s3cmd --configure

Then run the following commands to create a new backup bucket and replicate from the existing one:
s3cmd mb s3://mybucket_backup
s3cmd --recursive cp s3://mybucket s3://mybucket_backup

Azure SQL backup to Azure Storage

Use the following script to back up Azure SQL to storage.
Replace the database details and storage details in the script below.
You will also need to import your Azure publish settings file to run this script;
refer to this article on how to download and import the publish settings file:

https://devopsandcloud.wordpress.com/2017/01/21/download-and-import-publish-settings-and-subscription-information/


# Check if Windows Azure PowerShell is available
try{
    Import-Module Azure -ErrorAction Stop
}catch{
    throw "Windows Azure PowerShell not found! Please make sure to install it from http://www.windowsazure.com/en-us/downloads/#cmd-line-tools"
}

Import-AzurePublishSettingsFile "C:\jenkinsjobs\Pay-As-You-Go.publishsettings" # replace with your publish settings file path

$DatabaseServerName = "azureservername.database.windows.net"
$DatabaseName = "DBName"
$DatabasePassword = "azure sql password"
$DatabaseUsername = "azure sql user"
$StorageName = "storage name"
$StorageKey = "storage key"
$StorageContainerName = "containername"
$dateTime = Get-Date -Format u
$blobName = "$DatabaseName.$dateTime.bacpac"
Write-Host "Using blobName: $blobName"

# Create Database Connection
$securedPassword = ConvertTo-SecureString -String $DatabasePassword -asPlainText -Force
$serverCredential = new-object System.Management.Automation.PSCredential($DatabaseUsername, $securedPassword)
$databaseContext = New-AzureSqlDatabaseServerContext -FullyQualifiedServerName $DatabaseServerName -Credential $serverCredential

# Create Storage Connection
$storageContext = New-AzureStorageContext -StorageAccountName $StorageName -StorageAccountKey $StorageKey

# Initiate the Export
$operationStatus = Start-AzureSqlDatabaseExport -StorageContext $storageContext -SqlConnectionContext $databaseContext -BlobName $blobName -DatabaseName $DatabaseName -StorageContainerName $StorageContainerName

# Wait for the operation to finish
do{
    if ($operationStatus)
    {
        $status = Get-AzureSqlDatabaseImportExportStatus -Request $operationStatus
        if ($status){
            Start-Sleep -s 3
            $progress = $status.Status.ToString()
            Write-Host "Waiting for database export completion. Operation status: $progress"
        }
        else
        {
            Write-Host "Null status. Awaiting updates."
        }
    }
}until ($status.Status -eq "Completed")
Write-Host "Database export is complete"

Download and import publish settings and subscription information

Run Windows PowerShell as an administrator:

Choose Start and, in the Search box, type Windows PowerShell.

Right-click the Windows PowerShell link, and then choose Run as administrator.

At the Windows PowerShell command prompt, type the following command, and then press Enter.

Get-AzurePublishSettingsFile

A web browser opens at https://windows.azure.com/download/publishprofile.aspx for signing in to Windows Azure.
Sign in to the Windows Azure Management Portal, and then follow the instructions to download your Windows Azure publish settings. Save the file as a .publishsettings file on your computer.

Make a note of the file name and location.

In the Windows Azure PowerShell window, at the command prompt, type the following command, and then press Enter.

Import-AzurePublishSettingsFile <mysettings>.publishsettings

Replace <mysettings> with the file name of the publishsettings file that you downloaded in the previous step.

Back up Azure Storage

I always keep a backup of my Azure storage so that if code deletes something by mistake, I have a backup ready to restore the file from. Azure does replicate storage, but that is not fail-proof against accidental deletion: the replica will remove the blob too.

I use AzCopy to copy my data from storage to storage, or from storage to Azure Files. This runs as a 6-hourly job to sync data with storage, giving me enough time to recover a copy from this manual replica if I delete something.

To download AzCopy, go to this link: http://aka.ms/downloadazcopy

Then use this PowerShell script to run AzCopy (replace with your own source and destination storage details).

$theSource = @{path=''; accessKey=''; recursion=''; pattern=''}
$theDestination = @{path=''; accessKey=''}

$theSource.path = '/Source:https://STORAGENAME.blob.core.windows.net/CONTAINERNAME'
$theSource.AccessKey = '/SourceKey:KEY'
$theDestination.path = '/Dest:https://STORAGENAME.file.core.windows.net/FILESTORAGENAME'
$theDestination.AccessKey = '/DestKey:KEY'

$theSource.recursion = '/S /V /XO'
$suppressConfirmationPrompt = '/Y'
$listingOnlyOption = '' # or '/L' - use this option if you just want to list.

$arguments = $theSource.path + " " + $theDestination.path + " " + $theSource.AccessKey + " " + $theDestination.AccessKey + " " + $theSource.recursion + " " + $suppressConfirmationPrompt + " " + $listingOnlyOption

$pinfo = New-Object System.Diagnostics.ProcessStartInfo
$pinfo.FileName = "C:\AzCopy\AzCopy.exe"
$pinfo.Arguments = $arguments

$pinfo.RedirectStandardError = $true
$pinfo.RedirectStandardOutput = $true
$pinfo.UseShellExecute = $false

$p = New-Object System.Diagnostics.Process
$p.StartInfo = $pinfo
$p.Start() | Out-Null
$p.WaitForExit()
$stdout = $p.StandardOutput.ReadToEnd()
$stderr = $p.StandardError.ReadToEnd()
Write-Host "stdout: $stdout"
Write-Host "stderr: $stderr"
Write-Host "exit code: $($p.ExitCode)"

If you don't want to use storage keys but SAS tokens instead, replace /SourceKey and /DestKey with /SourceSAS and /DestSAS.
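
For example, the two key lines above would become something like this (the token values are placeholders for SAS tokens you generate yourself):

$theSource.AccessKey = '/SourceSAS:SOURCE_SAS_TOKEN'
$theDestination.AccessKey = '/DestSAS:DEST_SAS_TOKEN'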

Then, to schedule this job in Jenkins, run it as a batch command:

Powershell.exe -NonInteractive -ExecutionPolicy Bypass -File C:\JenkinsJobs\LLProdDatabaseBackups.ps1
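
To match the 6-hourly schedule mentioned earlier, the Jenkins job's Build periodically trigger can use a cron expression along these lines (a sketch; adjust to suit):

H */6 * * *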

Move Linux logs to AWS S3

One of the best and cheapest places I have found for backing up data is AWS S3. It's cheap, reasonably fast, and easy to manage using the command line, shell scripts and PowerShell.

Below are the steps to move daily generated logs to AWS S3. We will be using s3cmd, a Linux-based S3 client, to copy the data to S3.

To install s3cmd, use these steps:

  1. As a superuser (root), go to /etc/yum.repos.d
  2. Download the s3tools.repo file for your distribution from s3tools.org. For instance, wget http://s3tools.org/repo/RHEL_6/s3tools.repo if you're on CentOS 6.x
  3. Run yum install s3cmd

Then run s3cmd --configure to add your access key and secret key for the S3 bucket.

Next, copy all the log files using the following syntax:

s3cmd put log* s3://prod-logs/

here ‘log’ is the prefix for the log files and prod-logs is my bucket name.

Next, if you want to remove the logs that were moved to the S3 bucket, use the below:

rm -rf log*


If you want to turn this into a batch script that moves files to the S3 bucket daily, follow the steps below.

Write a shell script to do the copy/move to S3:

#!/bin/sh

s3cmd put log* s3://prod-logs/

rm -rf log*

Make the script executable:

$ chmod +x ScriptName.sh

Run it at midnight every day by adding it to a cron job:

$ crontab -e

59 23 * * * /ScriptName.sh

Save the cron job.

The above will move all your daily logs to S3.