Deploy PHP Zend framework on Azure using VSTS

A PHP website built on the Zend Framework can be deployed to Azure in many ways. For the frontend we can use Virtual Machines, Virtual Machine Scale Sets, Web Apps for Linux, Azure Container Instances (ACI) or Azure Kubernetes Service (AKS). The first two are Infrastructure as a Service (IaaS) offerings, while the others are Platform as a Service (PaaS). In this article, we will focus on deploying a website built with the open source PHP framework Zend (version 2) to various Azure PaaS offerings.

The end result would be like below –

zendOnAzure

We will also focus on immutable deployment here: we take the ARM template file that is part of the commit, create the environment from scratch, and then restore the database and deploy the frontend. Everything is done by VSTS, and we will walk through in detail how it is done.

Before we look into VSTS, let's take a quick look at our repository.

zendOnAzure-CodeReview

Dockerfile –

The Dockerfile below pulls an existing LAMP image, copies the code into the image, downloads the config file containing the MySQL credentials (created as part of the deployment) and then starts the LAMP services.

FROM fauria/lamp
COPY . /var/www/html
CMD curl http://containersforphpprereq.blob.core.windows.net/settings/config.ini -o /var/www/html/public/config.ini && /usr/sbin/run-lamp.sh

The Dockerfile of the base image that I have used can be found here –

https://hub.docker.com/r/fauria/lamp/~/dockerfile/

kubeUpdate.yaml –

This file pulls the Docker image and updates the AKS cluster.

zendOnAzure-k8Yaml

Before we look at the steps in detail, please note that three Azure components are created separately and are not part of this deployment: the VNET, the storage account and the ACR. In my opinion, these resources should be planned and set up beforehand to meet application/enterprise requirements; everything else can be taken care of later. If you prefer, you can ignore this and add extra steps to the ARM template to create these resources as well.

Now let us look at our build pipeline, which carries out the following steps:

  • create the Docker image that will be mounted on the web app, ACI and AKS
  • push this image to ACR
  • create the config file that holds the MySQL credentials and store it in a storage account
  • run the ARM template to deploy the Azure resources
  • deploy the code to ACI, AKS and the web app
  • restore the database to MySQL
  • delete the config file from Azure storage

Below are the variables set up for my build pipeline. I have defined all the credentials and keys here so that I don't have to hard-code them in my ARM template, where I keep them blank. We will see later how the ARM template parameters are overridden.

0

1. In steps 1 and 2, we create the Docker image and push it to ACR. We could also use any other container registry, such as Docker Hub. A rough CLI equivalent of these two steps is sketched below the screenshot.

1
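
For reference, roughly equivalent CLI commands for these two steps (the registry name and image tag are placeholders, not the ones used in the pipeline):

# Build the image from the Dockerfile in the repository root
docker build -t myregistry.azurecr.io/zendonazure:latest .
# Log in to the Azure Container Registry and push the image
az acr login --name myregistry
docker push myregistry.azurecr.io/zendonazure:latest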

2. Next we create the config.ini file. This file holds the MySQL credentials and is pulled by the containers when they start; the website reads config.ini to connect to the database. A rough sketch of this step is shown below the screenshot.

3
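
A minimal sketch of this step, assuming the pipeline variables hold the MySQL details. The section and key names inside config.ini are assumptions, the $(...) tokens are VSTS pipeline variables replaced before the script runs, and the storage account and container are the ones the Dockerfile downloads from:

# Write config.ini from pipeline variables
cat > config.ini <<EOF
[database]
host = $(mysqlHost)
username = $(mysqlUser)
password = $(mysqlPassword)
dbname = $(mysqlDatabase)
EOF
# Upload it to the blob container the containers pull it from at startup
az storage blob upload --account-name containersforphpprereq \
  --container-name settings --name config.ini --file config.ini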

3. Then we use our ARM template to roll out the Azure environment. This also attaches the Docker image built in step 1 to ACI. Note how, under Override Template Parameters, I have overridden a few sensitive parameter values such as the SQL credentials, ACR credentials and SPN. I keep them blank in the template because I don't want to commit them to git. A hedged CLI equivalent is sketched below the screenshot.

4
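
A hedged CLI equivalent of the Azure Resource Group Deployment task (the resource group, template file and parameter names are assumptions; the $(...) tokens are VSTS pipeline variables):

az group deployment create \
  --resource-group zendOnAzure-rg \
  --template-file azuredeploy.json \
  --parameters mysqlAdminPassword="$(mysqlPassword)" \
               acrPassword="$(acrPassword)" \
               spnPassword="$(spnPassword)"

Keeping these parameters blank in the committed template and overriding them at deployment time is what keeps the secrets out of git.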

4. Once the environment rolls out, I restore the database to the MySQL server. My MySQL backup file is committed along with the code in git; in many scenarios it is stored as a blob in a storage account instead. Just update the bash script in this step (sketched below the screenshot) if your database backup is stored in blob storage.

5
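
A minimal sketch of the restore, assuming Azure Database for MySQL (the server, user, database and backup path are placeholders; $(mysqlPassword) is a VSTS pipeline variable):

# Azure Database for MySQL expects the user in the form user@servername
mysql -h myserver.mysql.database.azure.com \
      -u mysqladmin@myserver -p"$(mysqlPassword)" \
      zenddb < ./database/backup.sql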

5. Next we deploy our code to the web app. Look at the 'App Service Type' option in this task: selecting 'Web App for Containers' allows us to use a Docker image and mount it on the web app. A hedged CLI equivalent is shown below the screenshot.

6
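
A hedged CLI equivalent of pointing the web app at the container image (the app name, resource group and registry details are assumptions):

az webapp config container set \
  --name zendonazure-webapp \
  --resource-group zendOnAzure-rg \
  --docker-custom-image-name myregistry.azurecr.io/zendonazure:latest \
  --docker-registry-server-url https://myregistry.azurecr.io \
  --docker-registry-server-user "$(acrUser)" \
  --docker-registry-server-password "$(acrPassword)"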

6. Now we deploy the code to AKS. We take the YAML file that is part of our code in git and apply it to AKS. The YAML file, which holds the deployment details, pulls the image from ACR and applies it to the cluster. There are two key steps here: in line 2 of the task we get the cluster credentials, and in line 3 we set the 'secretname' value to the ACR credentials. See above for how this secret is used to pull the image for deployment. A rough CLI equivalent is sketched below the screenshot.

7
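
A rough CLI equivalent of what this task does (the cluster name, resource group and secret name are assumptions):

# Get credentials for the AKS cluster
az aks get-credentials --resource-group zendOnAzure-rg --name zendonazure-aks
# Create the image pull secret that kubeUpdate.yaml references
kubectl create secret docker-registry acr-creds \
  --docker-server=myregistry.azurecr.io \
  --docker-username="$(acrUser)" \
  --docker-password="$(acrPassword)"
# Apply the deployment manifest committed with the code
kubectl apply -f kubeUpdate.yaml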

7. Lastly, we remove the config file from storage (a CLI sketch is below the screenshot). Note that we could skip the file creation step altogether by passing the SQL credentials to the Docker image as environment variables and reading those values in the PHP code.

8
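
The cleanup itself boils down to something like this (the storage account and container are the ones used by the Dockerfile above):

az storage blob delete --account-name containersforphpprereq \
  --container-name settings --name config.ini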


Deploy to Azure ILB ASE using Visual Studio Team Services (VSTS)

A few months back, Azure released a new offering called App Service Environment. More details on it can be found here: https://docs.microsoft.com/en-us/azure/app-service/environment/intro

The Azure App Service Environment is an Azure App Service feature that provides a fully isolated and dedicated environment for securely running App Service apps at high scale. This capability can host your web apps, mobile apps, API apps, and functions.

App Service Environment, or ASE, now has a feature that was missing earlier: the internal ASE. An ASE can be deployed either with an internet-accessible endpoint or with an IP address inside your Virtual Network; the internal variant means you can create web apps as before, but with no external endpoint.

However, since an internal ASE is completely isolated, automated deployments and CI/CD pipelines need a bit of work to set up: the publishing endpoint of an internal ASE isn't internet accessible. What we can do is place a VSTS build agent in our VNet, on a subnet that is internet accessible, and let this agent pull the code or build artefacts and deploy them to the ILB ASE. Since the ILB ASE and our custom agent are on the same VNet, the agent can reach the internal ILB ASE deployment endpoint.

The steps below show how to implement this.

  1. Below is a screenshot of my test account. The three main resources to point out are ilbasev2 (the ILB ASE environment), ilbasev2appname (the web app where I'll host my demo site) and jumpbox (a VM hosted on the DMZ subnet that acts as the build agent and pushes to the internal deployment endpoint).

1

  2. Next we log in to the jump box and create the VSTS build agent. I want to connect this agent to my VSTS account so that all builds and deployments happen via this agent. Please refer to the URL below on how to set up the agent; a sketch of the configuration command follows the MSBuild note. Since I will demo using a .NET MVC application, I am going to create a Windows agent.

https://docs.microsoft.com/en-us/vsts/build-release/actions/agents/v2-windows?view=vsts

Also make sure MSBuild is installed on this VM, so that we can build our MVC solution. https://msdn.microsoft.com/en-us/library/dd393574.aspx
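
For reference, connecting the agent to the VSTS account boils down to something like this, run from the extracted agent folder on the jump box (the account URL, PAT token, pool and agent name are placeholders):

.\config.cmd --url https://youraccount.visualstudio.com --auth pat --token <yourPAT> --pool Default --agent ilbase-jumpbox --runAsService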

  3. Once the above is done, navigate to the Agent queues screen under Settings in your VSTS account, as shown in the screenshot below, and confirm that the agent is set up correctly and connected to VSTS.

2

  4. Next we create our build definition. It is a simple definition that pulls our demo MVC application code, restores packages and publishes the artefact. The only thing to note here is the MSBuild argument: if you are using a self-signed certificate for the ILB ASE, you may want to add an extra MSBuild argument to avoid certificate-related issues during the build. The argument to use is 'AllowUntrustedCertificate' with the value true (see the sketch after the screenshot).

The other way to work around a self-signed certificate is to manually install it into the trusted root certificate store on the build agent server.

More details on that here : https://blogs.technet.microsoft.com/sbs/2008/05/08/installing-a-self-signed-certificate-as-a-trusted-root-ca-in-windows-vista/

3
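
For reference, a hedged sketch of an MSBuild invocation carrying that argument (the solution name is a placeholder; in VSTS the /p switch goes into the MSBuild Arguments field of the build task):

msbuild MyMvcDemo.sln /p:AllowUntrustedCertificate=True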

  5. Once the build definition is set up, we can queue it for execution. Before we do that, one important thing is to set the Agent queue: make sure it is the build agent that we set up earlier, so that we know our agent works as expected for the later steps. (For the build itself, the hosted agent would also work.)

4

  6. Below is what you should see if everything is working OK.

5

  7. Next we focus on the release and deployment to the ILB ASE. Below is a simple release definition, with more details in the next point. My environment here is the Azure ILB ASE that I am targeting the deployment to.

6

  8. All this release does is deploy our artefact to the Azure web app created in the ILB ASE. One key thing to point out is the Agent queue parameter: make sure it points to the build agent we created earlier.

8

  9. Create a new release using the above release definition and you should be able to move your code to the ILB ASE successfully.

9

  10. If you use the VSTS hosted agent instead of the build agent we created earlier, you will see the error below. This is because the hosted agent cannot connect to the internal deployment endpoint of the ILB ASE.

10

Azure SQL Data Warehouse

data : https://rbdevdwhstorage.blob.core.windows.net/rbdevdwhtestdata/FlightData.csv

In my third example, I explored SQL Data Warehouse on Azure.

The idea was to take a CSV with half a million rows, process it with SQL Data Warehouse and see how it performs.

I provisioned one Azure storage account where I saved my test data, and one Azure SQL Data Warehouse at 100 DWUs (the smallest size for data warehouses).

Snapshot of data stored to Azure Storage –

w1

The Azure SQL Data Warehouse blade. Copy the server name from here and open SQL Server Management Studio.

w2.jpg

Using SQL Server Management Studio, connect to this data warehouse like any other SQL Server database.

w3.jpg

 

Once you are connected, open a new query window.

The first step is to securely connect to the source data, here a CSV with half a million rows.

CREATE MASTER KEY;

CREATE DATABASE SCOPED CREDENTIAL AzureStorageCreds
WITH IDENTITY = '[identityName]'
, SECRET = '[azureStorageAccountKey]'
;

CREATE EXTERNAL DATA SOURCE azure_storage
WITH
(
TYPE = HADOOP
, LOCATION = 'wasbs://[containername]@[accountname].blob.core.windows.net/[path]'
, CREDENTIAL = AzureStorageCreds
);

Next, define the format of the file we will be reading:

CREATE EXTERNAL FILE FORMAT text_file_format
WITH
(
FORMAT_TYPE = DELIMITEDTEXT
, FORMAT_OPTIONS (
FIELD_TERMINATOR = ',',
STRING_DELIMITER = '"',
USE_TYPE_DEFAULT = TRUE
)
);

The next step is to create an external table, which acts as a kind of view over our source data:

CREATE EXTERNAL TABLE FlightData
(
[Year] varchar(255),
[Quarter] varchar(255),
[Month] varchar(255),
[DayofMonth] varchar(255),
[DayOfWeek] varchar(255),
FlightDate varchar(255),
UniqueCarrier varchar(255),
AirlineID varchar(255),
Carrier varchar(255),
TailNum varchar(255),
FlightNum varchar(255),
OriginAirportID varchar(255),
OriginAirportSeqID varchar(255),
OriginCityMarketID varchar(255),
Origin varchar(255),
OriginCityName varchar(255),
OriginState varchar(255),
OriginStateFips varchar(255),
OriginStateName varchar(255),
OriginWac varchar(255),
DestAirportID varchar(255),
DestAirportSeqID varchar(255),
DestCityMarketID varchar(255),
Dest varchar(255),
DestCityName varchar(255),
DestState varchar(255),
DestStateFips varchar(255),
DestStateName varchar(255),
DestWac varchar(255),
CRSDepTime varchar(128),
DepTime varchar(128),
DepDelay varchar(128),
DepDelayMinutes varchar(128),
DepDel15 varchar(255),
DepartureDelayGroups varchar(255),
DepTimeBlk varchar(255),
TaxiOut varchar(255),
WheelsOff varchar(255),
WheelsOn varchar(255),
TaxiIn varchar(255),
CRSArrTime varchar(128),
ArrTime varchar(128),
ArrDelay varchar(255),
ArrDelayMinutes varchar(255),
ArrDel15 varchar(255),
ArrivalDelayGroups varchar(255),
ArrTimeBlk varchar(255),
Cancelled varchar(255),
CancellationCode varchar(255),
Diverted varchar(255),
CRSElapsedTime varchar(255),
ActualElapsedTime varchar(255),
AirTime varchar(255),
Flights varchar(255),
Distance varchar(255),
DistanceGroup varchar(255),
CarrierDelay varchar(255),
WeatherDelay varchar(255),
NASDelay varchar(255),
SecurityDelay varchar(255),
LateAircraftDelay varchar(255),
FirstDepTime varchar(255),
TotalAddGTime varchar(255),
LongestAddGTime varchar(255),
DivAirportLandings varchar(255),
DivReachedDest varchar(255),
DivActualElapsedTime varchar(255),
DivArrDelay varchar(255),
DivDistance varchar(255),
Div1Airport varchar(255),
Div1AirportID varchar(255),
Div1AirportSeqID varchar(255),
Div1WheelsOn varchar(255),
Div1TotalGTime varchar(255),
Div1LongestGTime varchar(255),
Div1WheelsOff varchar(255),
Div1TailNum varchar(255),
Div2Airport varchar(255),
Div2AirportID varchar(255),
Div2AirportSeqID varchar(255),
Div2WheelsOn varchar(255),
Div2TotalGTime varchar(255),
Div2LongestGTime varchar(255),
Div2WheelsOff varchar(255),
Div2TailNum varchar(255),
Div3Airport varchar(255),
Div3AirportID varchar(255),
Div3AirportSeqID varchar(255),
Div3WheelsOn varchar(255),
Div3TotalGTime varchar(255),
Div3LongestGTime varchar(255),
Div3WheelsOff varchar(255),
Div3TailNum varchar(255),
Div4Airport varchar(255),
Div4AirportID varchar(255),
Div4AirportSeqID varchar(255),
Div4WheelsOn varchar(255),
Div4TotalGTime varchar(255),
Div4LongestGTime varchar(255),
Div4WheelsOff varchar(255),
Div4TailNum varchar(255),
Div5Airport varchar(255),
Div5AirportID varchar(255),
Div5AirportSeqID varchar(255),
Div5WheelsOn varchar(255),
Div5TotalGTime varchar(255),
Div5LongestGTime varchar(255),
Div5WheelsOff varchar(255),
Div5TailNum varchar(255)
)
WITH
(
LOCATION = '/',
DATA_SOURCE = azure_storage,
FILE_FORMAT = text_file_format,
REJECT_TYPE = VALUE,
REJECT_VALUE = 100000
);

Once this is done, we run one final command to pull the data into the data warehouse:

CREATE TABLE FlightDataStaging
WITH (DISTRIBUTION = ROUND_ROBIN)
AS
SELECT * FROM FlightData

Once this is done, let's run a GROUP BY over all the data and see how long it takes.
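
A hedged sqlcmd sketch of such a query (the server, database, credentials and the exact aggregation are assumptions; the screenshot below shows the query running in Management Studio):

sqlcmd -S yourdwserver.database.windows.net -d yourdwdb -U sqladmin -P 'yourPassword' -I -Q "
SELECT TOP 10 Origin, COUNT(*) AS Flights
FROM FlightDataStaging
GROUP BY Origin
ORDER BY Flights DESC;"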

w4.jpg

You will notice it barely took 2 seconds.

If we want, we can also link our data warehouse to Power BI and display the results.

I ran a report similar to the one above, and it came up in no time.

w5.jpg

Azure Data Lake

data : https://rbdevdwhstorage.blob.core.windows.net/rbdevdwhtestdata/FlightData.csv

 

In my second example, I wanted to use the Azure Data Lake PaaS offering and see how it can help me batch-process data quickly.

To check that, I downloaded a test CSV file with information about flight departures and delays from major US airports. The file has close to half a million rows, and I wanted to see how quickly Data Lake could parse it and then run simple and complex queries over it.

The first step is to provision Azure Data Lake Storage to hold the CSV and the results, and Azure Data Lake Analytics to run the batch jobs and generate those results.

Once the Data Lake Storage account has been provisioned, click the Data Explorer button and upload the test CSV data (a CLI sketch of the upload is shown after the screenshot).

b1
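
A hedged CLI equivalent of the upload (the Data Lake Store account name is a placeholder; the destination path matches the one used by the jobs below):

az dls fs upload --account mydatalakestore \
  --source-path ./FlightData.csv \
  --destination-path /rbdevdls/FlightData.csv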

Once the upload is done, go to Azure Data Lake Analytics and click New Job (a CLI alternative for submitting jobs is sketched after the screenshot).

b2.jpg
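
Jobs can also be submitted from the CLI instead of the portal; a minimal sketch, where the analytics account name and script file are placeholders:

az dla job submit --account mydatalakeanalytics \
  --job-name "CreateFlightDataView" \
  --script "$(cat create_view.usql)"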

We will focus on 3 jobs here.

  • The first job creates a view that reads from the CSV file we have uploaded.
  • The second job reads all the data through that view and saves it to a new table in Azure Data Lake storage that we can query.
  • The third job is a test query that goes through the half a million rows and produces results in a few seconds.

All we need to do is copy and paste the jobs below one by one and submit them. If the commands are OK, you should see screens similar to the ones below.

Step 1 –

Below is the command to create the view.

DROP VIEW IF EXISTS FlightDataView_2014_1;

CREATE VIEW FlightDataView_2014_1
AS
EXTRACT Year int,
Quarter int,
Month int,
DayofMonth int,
DayOfWeek int,
FlightDate string,
UniqueCarrier string,
AirlineID int,
Carrier string,
TailNum string,
FlightNum string,
OriginAirportID int,
OriginAirportSeqID int,
OriginCityMarketID int,
Origin string,
OriginCityName string,
OriginState string,
OriginStateFips string,
OriginStateName string,
OriginWac int,
DestAirportID int,
DestAirportSeqID int,
DestCityMarketID int,
Dest string,
DestCityName string,
DestState string,
DestStateFips string,
DestStateName string,
DestWac int,
CRSDepTime string,
DepTime string,
DepDelay double?,
DepDelayMinutes double?,
DepDel15 double?,
DepartureDelayGroups string,
DepTimeBlk string,
TaxiOut double?,
WheelsOff string,
WheelsOn string,
TaxiIn double?,
CRSArrTime string,
ArrTime string,
ArrDelay double?,
ArrDelayMinutes double?,
ArrDel15 double?,
ArrivalDelayGroups string,
ArrTimeBlk string,
Cancelled double?,
CancellationCode string,
Diverted double?,
CRSElapsedTime double?,
ActualElapsedTime double?,
AirTime double?,
Flights double?,
Distance double?,
DistanceGroup double?,
CarrierDelay double?,
WeatherDelay double?,
NASDelay double?,
SecurityDelay double?,
LateAircraftDelay double?,
FirstDepTime string,
TotalAddGTime string,
LongestAddGTime string,
DivAirportLandings string,
DivReachedDest string,
DivActualElapsedTime string,
DivArrDelay string,
DivDistance string,
Div1Airport string,
Div1AirportID string,
Div1AirportSeqID string,
Div1WheelsOn string,
Div1TotalGTime string,
Div1LongestGTime string,
Div1WheelsOff string,
Div1TailNum string,
Div2Airport string,
Div2AirportID string,
Div2AirportSeqID string,
Div2WheelsOn string,
Div2TotalGTime string,
Div2LongestGTime string,
Div2WheelsOff string,
Div2TailNum string,
Div3Airport string,
Div3AirportID string,
Div3AirportSeqID string,
Div3WheelsOn string,
Div3TotalGTime string,
Div3LongestGTime string,
Div3WheelsOff string,
Div3TailNum string,
Div4Airport string,
Div4AirportID string,
Div4AirportSeqID string,
Div4WheelsOn string,
Div4TotalGTime string,
Div4LongestGTime string,
Div4WheelsOff string,
Div4TailNum string,
Div5Airport string,
Div5AirportID string,
Div5AirportSeqID string,
Div5WheelsOn string,
Div5TotalGTime string,
Div5LongestGTime string,
Div5WheelsOff string,
Div5TailNum string
FROM "/rbdevdls/FlightData.csv"
USING Extractors.Text(',', null, null, null,
System.Text.Encoding.UTF8, true, false, 1);

Upon execution, you should see a similar screen:

b3

Once the view over the CSV file is created, we dump this data into a Data Lake catalog table using the command below:

DROP TABLE IF EXISTS FlightData_2014_1;

CREATE TABLE FlightData_2014_1(
INDEX idx_year CLUSTERED (Year)
DISTRIBUTED BY RANGE (Year)
) AS
SELECT *
FROM FlightDataView_2014_1;

You will note that Azure Data Lake quickly dumps the half a million rows from the CSV into the table.

b4.jpg

If you want to see where this data lives, click on Data Explorer in Data Lake, then click on Catalog. Under Catalog, expand Master -> Tables and you should see the new table there.

Now for the last part: the performance of Azure Data Lake. The query below reads all half a million rows and produces a list of airports ordered by the number of flights. We save the result of this job to a CSV file, busiestairports.csv.

@results =
SELECT Origin, COUNT(*) AS Counted
FROM FlightData_2014_1
GROUP BY Origin
ORDER BY Counted DESC
FETCH 30 ROWS;

OUTPUT @results
TO "/rbdevdls/busiestairports.csv"
USING Outputters.Csv();

You will note that Azure Data Lake wraps that up in 46 seconds and creates the output file for you too.

b5.jpg

Now we can go back to Azure Data Lake Storage and view the results.

b6

Azure real-time analytics

I have been thinking of exploring the Azure PaaS offerings for data warehousing and analytics for a while, and finally decided to go ahead with a few examples that cover most of them.

In the first example, I use Event Hubs and try some real-time analytics on the incoming data using Azure Stream Analytics jobs. The scenario: I am sitting in a big mall where thermometers frequently send temperature readings to Azure Event Hubs. I then use this data to check the temperature every 2 minutes and watch for any drop or rise that indicates an issue with the cooling system or heaters.

I started off by provisioning an event hub with 4 partitions in Azure.

s1
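
A hedged CLI sketch of this step (the namespace, hub and resource group names are assumptions):

# Create the Event Hubs namespace and an event hub with 4 partitions
az eventhubs namespace create --name malltempns --resource-group realtime-rg \
  --location westeurope --sku Standard
az eventhubs eventhub create --name temperature --namespace-name malltempns \
  --resource-group realtime-rg --partition-count 4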

Then I wrote some C# code that sends seed data to this Azure Event Hub in the same pattern a real thermometer/thermostat would.

s2.jpg

The data is sent in JSON format. Here is an example of the data sent:

s3

Once I am sure data is being received by Event Hubs, I want to do some real-time stream analytics and make sure I get updates every few minutes for each device.

For stream analytics in this example, I went ahead with the Azure PaaS offering: Stream Analytics jobs.

s4

Every Stream Analytics job has three important parts: an input, which is where the real-time data is picked up (here an Event Hub); an output, where the results of the analysis are saved (here Power BI); and a query/function that manipulates the input data.

See below for how these three are set up in the Stream Analytics job.

On the Stream Analytics job blade in the portal, click Inputs and add these details to attach the event hub as the input.

s5.jpg

Click Test to make sure the input endpoint is valid. Then go back to the Stream Analytics job, click Outputs, and add Power BI as the output.

s6

In this case we set up a query that computes the minimum, maximum and average temperature every 2 minutes and publishes the results to Power BI, so that the end user can check whether the temperature stats are OK.

See below for query –

s7

Save the query, then go back to the Stream Analytics job and start it. Check the monitoring graph to make sure input is being processed and output is being generated.

s8.jpg

Once output is generated, go to Power BI and check that a dataset has been created for this exercise and that it is getting refreshed.

You should see a new Dataset created.

s9.jpg

We can now use this dataset to create any report. I have made a simple table to show the data produced every 2 minutes.

s11.jpg

List all Azure resources in a CSV / Excel

Run the following PowerShell script to list all the Azure resources across your subscriptions.

# settings
$defaultPath = "c:\Temp\azureresources.csv"
$csvDelimiter = ';'
# set azure account
[void](Login-AzureRmAccount)
# receive all subscriptions
$subscriptions = Get-AzureRmSubscription
$subscriptions | ft SubscriptionId, SubscriptionName
# select azure subscriptions that you want to export
"Please enter subscription ids (comma separated, leave empty to use all subscriptions)"
$subscriptionIds = read-host
if ([String]::IsNullOrWhiteSpace($subscriptionIds)) {
    $subscriptionIds = @($subscriptions | select -ExpandProperty SubscriptionId)
}
elseif ($subscriptionIds.Contains(',')) {
    $subscriptionIds = $subscriptionIds.Split(',')
}
else {
    $subscriptionIds = @($subscriptionIds)
}
# configure csv output
"Enter destination path - leave it empty to use $defaultPath"
$path = read-host
if ([String]::IsNullOrWhiteSpace($path)) {
    $path = $defaultPath
}
if (Test-Path $path) {
    "File $path already exists. Delete? y/n [Default: y]"
    $remove = read-host
    if ([String]::IsNullOrWhiteSpace($remove) -or $remove.ToLower().Equals('y')) {
        Remove-Item $path
    }
}
"Start exporting data..."
foreach ($subscriptionId in $subscriptionIds) {
    # change azure subscription
    [void](Set-AzureRmContext -SubscriptionId $subscriptionId)
    # read subscription name as we want to see it in the exported csv
    $subscriptionName = ($subscriptions | Where { $_.SubscriptionId -eq $subscriptionId }).SubscriptionName

    $subscriptionSelector = @{ Label = "SubscriptionName"; Expression = { $subscriptionName } }
    $tagSelector = @{ Name = "Tags"; Expression = { if ($_.Tags -ne $null) { $x = $_.Tags | % { "{ `"" + $_.Name + "`" : `"" + $_.Value + "`" }, " }; ("{ " + ([string]$x).TrimEnd(", ") + " }") } } }
    # get resources from azure subscription
    $export = Get-AzureRmResource | select *, $subscriptionSelector, $tagSelector -ExcludeProperty "Tags"
    $export | Export-CSV $path -Delimiter $csvDelimiter -Append -Force -NoTypeInformation
    "Exported " + $subscriptionId + " - " + $subscriptionName
}
"Export done!"
If you want to run this script in a scheduler, you need to save your Azure profile so that the script can pick it up. Run the following commands:

Add-AzureRmAccount
Save-AzureRmProfile -Path "c:\temp\azureprofile.json"

After checking that the file exists, the following (in place of the Login-AzureRmAccount call near the top of the previous script) loads the Azure profile:

Select-AzureRmProfile -Path $azureProfilePath

Azure SQL backup to Azure Storage

Use the following script to back up an Azure SQL database to Azure Storage.
Replace the database details and storage details in the script below.
You will also need to import your Azure publish settings file to run this script.
Refer to this article on how to download and import the publish settings file:

https://devopsandcloud.wordpress.com/2017/01/21/download-and-import-publish-settings-and-subscription-information/

 

# Check if Windows Azure PowerShell is available
try {
    Import-Module Azure -ErrorAction Stop
} catch {
    throw "Windows Azure PowerShell not found! Please make sure to install it from http://www.windowsazure.com/en-us/downloads/#cmd-line-tools"
}

Import-AzurePublishSettingsFile "C:\jenkinsjobs\Pay-As-You-Go.publishsettings" # replace with your publish settings file path

$DatabaseServerName = "azureservername.database.windows.net"
$DatabaseName = "DBName"
$DatabasePassword = "azure sql password"
$DatabaseUsername = "azure sql user"
$StorageName = "storage name"
$StorageKey = "storage key"
$StorageContainerName = "containername"
$dateTime = Get-Date -Format u
$blobName = "$DatabaseName.$dateTime.bacpac"
Write-Host "Using blobName: $blobName"

# Create Database Connection
$securedPassword = ConvertTo-SecureString -String $DatabasePassword -AsPlainText -Force
$serverCredential = New-Object System.Management.Automation.PSCredential($DatabaseUsername, $securedPassword)
$databaseContext = New-AzureSqlDatabaseServerContext -FullyQualifiedServerName $DatabaseServerName -Credential $serverCredential

# Create Storage Connection
$storageContext = New-AzureStorageContext -StorageAccountName $StorageName -StorageAccountKey $StorageKey

# Initiate the Export
$operationStatus = Start-AzureSqlDatabaseExport -StorageContext $storageContext -SqlConnectionContext $databaseContext -BlobName $blobName -DatabaseName $DatabaseName -StorageContainerName $StorageContainerName

# Wait for the operation to finish
do {
    if ($operationStatus)
    {
        $status = Get-AzureSqlDatabaseImportExportStatus -Request $operationStatus
        if ($status) {
            Start-Sleep -s 3
            $progress = $status.Status.ToString()
            Write-Host "Waiting for database export completion. Operation status: $progress"
        }
        else
        {
            Write-Host "Null Status. Awaiting updates."
        }
    }
} until ($status.Status -eq "Completed")
Write-Host "Database export is complete"