PowerShell works for Amazon AWS S3 too!

More and more we have to work with data in many different locations. This week I got to work with S3 files that were moving to Azure blob storage. I was surprised to find that Amazon has published AWS cmdlets for PowerShell. It took me a little while to figure out the format and terminology, so I'll try to explain that and compare and contrast how we interact with storage in AWS and Azure. Today we will cover viewing the files.

Configure PowerShell

Well first, let's get things set up. Install the Azure and AWS cmdlets for PowerShell. These examples pass keys explicitly for everything, so there's no need to configure PowerShell with certificates to access the clouds.
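
If you have PowerShell 5 or later with PowerShellGet, one way to get both modules is from the PowerShell Gallery. This is a sketch, assuming the gallery module names AWSPowerShell and Azure (the service management module these examples use):

# Install both modules from the PowerShell Gallery (requires PowerShellGet)
Install-Module -Name AWSPowerShell -Scope CurrentUser
Install-Module -Name Azure -Scope CurrentUser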

Depending on your PowerShell version, the first time you use PowerShell after installing the AWS cmdlets you may need to run these commands:

Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\Net45\AWSSDK.dll"
Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
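
To confirm the module actually loaded, you can list it and browse the S3 cmdlets it exposes:

# Verify the AWS module is loaded and list the S3 cmdlets it provides
Get-Module AWSPowerShell
Get-Command -Module AWSPowerShell -Name *S3*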

Connecting to Storage

S3

We'll start with AWS S3. Each connection to S3 storage requires an AWS region (unless you use the default, "US Standard"), an access id (a unique identifier), a secret key, and a bucket. You are storing data within a specific region on an access point in a managed grouping called a bucket. The access id in S3 is roughly equivalent to a storage account name in Azure, and a bucket in S3 is roughly equivalent to a container in Azure.

$S3Bucket = "mybucket"
$S3Key = "SecretKeyValue"
$S3AccessID = "AccessKey"
$AWSregion = "us-west-2"

Next let's use those values to make a new client connection to S3. You define a configuration object that points to the full URL for the region, then pass that configuration object, the access id, and the secret key to a factory method that creates a client connection to S3. This sets the context for the entire session, so the context does not have to be passed to the individual commands. Note that the URL changes depending on the region; for example, https://s3-us-west-2.amazonaws.com.

Set-DefaultAWSRegion $AWSregion # auto-stored to $StoredAWSRegion
$AWSserviceURL = "https://s3-$AWSregion.amazonaws.com"
$config = New-Object Amazon.S3.AmazonS3Config
$config.ServiceURL = $AWSserviceURL
$S3Client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($S3AccessID, $S3Key, $config)
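
If the SDK factory class isn't available in your version of the AWS tools (see the comment at the end of this post), the cmdlets can hold the credentials for the session themselves. A sketch, assuming the Set-AWSCredentials cmdlet from the AWSPowerShell module:

# Alternative: store the keys in the session; later cmdlets such as
# Get-S3Object will pick up these credentials and $StoredAWSRegion
Set-AWSCredentials -AccessKey $S3AccessID -SecretKey $S3Key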

Azure

Let's compare that to how we list files in Azure blob storage. First you specify the location and credentials. The region is implied because the storage account name is unique across all regions. The container and secret key value are similar in meaning.

$storageAccountName = "mystorageaccountname"
$storageAccountKey = "SecretKeyValue"
$containerName = "mybucket"

Then you define the storage context, which encapsulates the location and credentials. Alternatively you could set the default storage context for the session or for a particular profile's connection to a given subscription (see the sketch below).

$AzureContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
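
To make a storage account the default for a subscription instead, here is a minimal sketch, assuming the service management cmdlets and a subscription named "MySubscription" (a hypothetical name):

# Optional: set a default storage account for a subscription so -Context can be omitted
Set-AzureSubscription -SubscriptionName "MySubscription" -CurrentStorageAccountName $storageAccountName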

View the Files

S3

Now you can get basic metadata about the S3 bucket:

Get-S3Bucket $S3Bucket
Get-S3BucketLocation $S3Bucket

Next let's list the files in that bucket.

Get-S3Object -BucketName $S3Bucket

You can populate an array with the list; in this example I pulled out just the name (key) of each file:

$S3FileList = (Get-S3Object -BucketName $S3Bucket).Key

And you can filter the result set:
$S3FileList = (Get-S3Object -BucketName $S3Bucket | Where-Object {$_.lastmodified -lt "2/17/2015"}).Key
$S3FileList = (Get-S3Object -BucketName $S3Bucket | Where-Object {$_.key -like "*42*"}).Key
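
One caveat on the date filter: a string date is parsed using the current culture, so an explicit cast to an ISO-style date makes the comparison unambiguous:

# Casting to [datetime] avoids culture-dependent date parsing
$cutoff = [datetime]"2015-02-17"
$S3FileList = (Get-S3Object -BucketName $S3Bucket | Where-Object {$_.LastModified -lt $cutoff}).Key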

Azure

For Azure we can do similar operations to view the files. This example lists all files in the container:

Get-AzureStorageBlob -Context $AzureContext -Container $containerName

You can also populate an array with the list:

$AzureList = Get-AzureStorageBlob -Context $AzureContext -Container $containerName

Or pull out just a single property:

(Get-AzureStorageBlob -Context $AzureContext -Container $containerName).Name

Or list just blobs that match a wildcard value:

Get-AzureStorageBlob -Context $AzureContext -Container $containerName -Blob *42*
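
Since my scenario had files moving from S3 to Azure, putting the two lists side by side makes a quick migration check. A minimal sketch using the variables defined above:

# Compare S3 keys to Azure blob names to spot files that haven't landed yet
$AzureFileList = (Get-AzureStorageBlob -Context $AzureContext -Container $containerName).Name
Compare-Object -ReferenceObject $S3FileList -DifferenceObject $AzureFileList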

My Work Here is Done

This intro to PowerShell for S3 opens the door to many possibilities - data migrations, multi-cloud hybrid solutions, and whatever your imagination can conjure up! Today we reviewed how to view files; I'll cover more in future posts. Happy PowerShelling!

Tip

When you open "Microsoft Azure PowerShell", type ISE at the prompt to launch the PowerShell Integrated Scripting Environment. It has IntelliSense, multiple script windows, and a cmdlet viewer.

Comments

  • Anonymous
    October 20, 2015
    Not sure if it's an AWS PowerShell version issue, but I am getting the error "Unable to find type [Amazon.AWSClientFactory]: make sure that the assembly containing this type is loaded." Any help please.