Azure SQL backup to Azure Storage

Use the following script to back up an Azure SQL database to Azure Storage. Replace the database and storage details in the script below. You will also need to import your Azure publish settings file to run this script; refer to this article on how to download and import it:

https://devopsandcloud.wordpress.com/2017/01/21/download-and-import-publish-settings-and-subscription-information/

 

# Check if Windows Azure PowerShell is available
try{
    Import-Module Azure -ErrorAction Stop
}catch{
    throw "Windows Azure PowerShell not found! Please make sure to install it from http://www.windowsazure.com/en-us/downloads/#cmd-line-tools"
}

Import-AzurePublishSettingsFile "C:\jenkinsjobs\Pay-As-You-Go.publishsettings" # replace with your publish settings file path

$DatabaseServerName = "azureservername.database.windows.net"
$DatabaseName = "DBName"
$DatabasePassword = "azure sql password"
$DatabaseUsername = "azure sql user"
$StorageName = "storage name"
$StorageKey = "storage key"
$StorageContainerName = "containername"
$dateTime = Get-Date -Format u
$blobName = "$DatabaseName.$dateTime.bacpac"
Write-Host "Using blobName: $blobName"

# Create Database Connection
$securedPassword = ConvertTo-SecureString -String $DatabasePassword -AsPlainText -Force
$serverCredential = New-Object System.Management.Automation.PSCredential($DatabaseUsername, $securedPassword)
$databaseContext = New-AzureSqlDatabaseServerContext -FullyQualifiedServerName $DatabaseServerName -Credential $serverCredential

# Create Storage Connection
$storageContext = New-AzureStorageContext -StorageAccountName $StorageName -StorageAccountKey $StorageKey

# Initiate the Export
$operationStatus = Start-AzureSqlDatabaseExport -StorageContext $storageContext -SqlConnectionContext $databaseContext -BlobName $blobName -DatabaseName $DatabaseName -StorageContainerName $StorageContainerName

# Wait for the operation to finish
do{
    if ($operationStatus)
    {
        $status = Get-AzureSqlDatabaseImportExportStatus -Request $operationStatus
        if ($status){
            Start-Sleep -s 3
            $progress = $status.Status.ToString()
            Write-Host "Waiting for database export completion. Operation status: $progress"
        }
        else
        {
            Write-Host "Null status. Awaiting updates."
        }
    }
}until ($status.Status -eq "Completed")
Write-Host "Database export is complete"
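
For reference, the same Azure module can also restore an exported .bacpac with Start-AzureSqlDatabaseImport. Here is a minimal sketch reusing the storage and database contexts from the script above; NewDBName is a hypothetical target database name:

# Import the .bacpac blob into a new database (sketch - NewDBName is a placeholder)
$importRequest = Start-AzureSqlDatabaseImport -SqlConnectionContext $databaseContext `
    -StorageContext $storageContext -StorageContainerName $StorageContainerName `
    -DatabaseName "NewDBName" -BlobName $blobName

# Poll the status the same way as for the export
Get-AzureSqlDatabaseImportExportStatus -Request $importRequest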


Backup Azure Storage

I always keep a backup of my Azure Storage so that if code deletes something by mistake, I have a backup ready to recover the file from. Azure does replicate storage, but replication is not fail-proof against manual deletion: the replicas will remove the blob too.

I use AzCopy to move data from storage to storage, or from storage to Azure Files. The script runs as a six-hourly job to sync the data, which gives me enough time to recover a file from the backup copy if I delete something by mistake.

To download AzCopy, go to http://aka.ms/downloadazcopy

Then use this PowerShell script to run AzCopy (replace the source and destination storage details with your own):

$theSource = @{path=''; accessKey=''; recursion=''; pattern=''}
$theDestination = @{path=''; accessKey=''}

$theSource.path = '/Source:https://STORAGENAME.blob.core.windows.net/CONTAINERNAME'
$theSource.AccessKey = '/SourceKey:KEY'
$theDestination.path = '/Dest:https://STORAGENAME.file.core.windows.net/FILESTORAGENAME'
$theDestination.AccessKey = '/DestKey:KEY'

$theSource.recursion = '/S /V /XO'
$suppressConfirmationPrompt = '/Y'
$listingOnlyOption = '' # or '/L' - use this option if you just want to list files without copying.

$arguments = $theSource.path + ' ' + $theDestination.path + ' ' + $theSource.AccessKey + ' ' + $theDestination.AccessKey + ' ' + $theSource.recursion + ' ' + $suppressConfirmationPrompt + ' ' + $listingOnlyOption

$pinfo = New-Object System.Diagnostics.ProcessStartInfo
$pinfo.FileName = "C:\AzCopy\AzCopy.exe"
$pinfo.Arguments = $arguments

$pinfo.RedirectStandardError = $true
$pinfo.RedirectStandardOutput = $true
$pinfo.UseShellExecute = $false

$p = New-Object System.Diagnostics.Process
$p.StartInfo = $pinfo
$p.Start() | Out-Null
# Read output before waiting, to avoid a deadlock if the output buffer fills up
$stdout = $p.StandardOutput.ReadToEnd()
$stderr = $p.StandardError.ReadToEnd()
$p.WaitForExit()
Write-Host "stdout: $stdout"
Write-Host "stderr: $stderr"
Write-Host "exit code: $($p.ExitCode)"
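
AzCopy reports failures through its exit code, so when this script runs from a scheduler such as Jenkins, it can be worth propagating that code so a failed copy fails the build. A minimal addition at the end of the script:

# Fail the calling job (e.g. Jenkins) if AzCopy reported an error
if ($p.ExitCode -ne 0) { exit $p.ExitCode }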

If you don't want to use storage keys but SAS tokens instead, replace /SourceKey and /DestKey with /SourceSAS and /DestSAS.
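
For example, reusing the variables from the script above (SASTOKEN is a placeholder for your actual token):

$theSource.AccessKey = '/SourceSAS:SASTOKEN'     # placeholder for the source SAS token
$theDestination.AccessKey = '/DestSAS:SASTOKEN'  # placeholder for the destination SAS token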

Then, to schedule this job in Jenkins, run it as a batch command:

Powershell.exe -NonInteractive -ExecutionPolicy Bypass -File C:\JenkinsJobs\LLProdDatabaseBackups.ps1

Azure File Service on Linux

You can now use Azure File Service to create large drives and mount them on a Linux server. When I was testing, it was available on many Windows versions but only a few versions of Linux.

Linux VMs deployed on Azure can use this service via the Linux kernel CIFS client. The kernel client must be configured to support and use the SMB 2.1 protocol dialect:

CONFIG_CIFS_SMB2 must be enabled in the kernel configuration at build time. Use

# zcat /proc/config.gz | grep CONFIG_CIFS_SMB2

to check this on a running system.

The vers=2.1 mount.cifs parameter must be provided at mount time.

Furthermore, the Azure storage account name and access key must be provided as the username and password.

# mount.cifs -o vers=2.1,user=smb //smb.file.core.windows.net/share /share/
Password for smb@//smb.file.core.windows.net/share: ******…
# df -h /share/
Filesystem Size Used Avail Use% Mounted on
//smb.file.core.windows.net/share 5.0T 0 5.0T 0% /share