Category Archives: powershell

Create an Azure Storage Blob Container with PowerShell

My experience with Azure PowerShell so far has been somewhat mixed, and the example in this post will give you a flavour of that. I wanted to create a new Storage Blob Container via PowerShell, rather than through the below process in the web portal:

I looked for cmdlets which could potentially be used:
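The kind of search I ran was along these lines (a sketch; the exact wildcard is my choice):

# Look for any container-related cmdlets across the installed Azure modules
Get-Command -Name *StorageContainer* | Select-Object Name, ModuleName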

However, it returned nothing from the AzureRM module, only the Azure module. (There are currently two modules you need to use when working with Azure; some more info here and here.) To say this can get confusing when you are new to the topic is an understatement; hopefully this situation is going to improve significantly ASAP.

So it looks like I need to use New-AzureStorageContainer from the original Azure module, however there do not appear to be any examples which show you how to add it into the desired place, i.e. Resource Group and Storage Account:

So far I have found two different ways to get this done:

1) Set the current Storage Account

I found a StackOverflow post with an example. First of all you need to call a cmdlet from the AzureRM module to set the current Storage Account (note that the second line below is the odd response you get from running the first command, i.e. just a string with the name of the current Storage Account, not an object representing it):
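A minimal sketch of that, with placeholder Resource Group and Storage Account names (check Get-Help Set-AzureRmCurrentStorageAccount for the exact parameter set in your module version):

# Line 1: set the current Storage Account for this session (AzureRM module)
Set-AzureRmCurrentStorageAccount -ResourceGroupName 'TestRG' -Name 'teststorage01'
# Line 2: the response - just the Storage Account name as a string
# teststorage01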

Now I can use New-AzureStorageContainer and it will get created in the correct place:
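For example (the container name is a placeholder):

New-AzureStorageContainer -Name 'testcontainer' -Permission Off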

2) Use Storage Account Keys

Within a Storage Account are two Access keys which can be used for automation:

We only need one of the keys, but the following will retrieve both and then we pick out the first key value:
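A sketch, again with placeholder names; note that depending on the AzureRM version the keys may come back as an array of key objects:

$storageKeys = Get-AzureRmStorageAccountKey -ResourceGroupName 'TestRG' -Name 'teststorage01'
# Pick out the value of the first key
$storageKey = $storageKeys[0].Value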

Now using one of the key values we can set the Storage Context:
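For example:

$storageContext = New-AzureStorageContext -StorageAccountName 'teststorage01' -StorageAccountKey $storageKey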

Note: the above doesn’t actually seem to perform any validation on whether a Storage Account with that name exists. I initially had a typo in the name, and the next command then generated the error: New-AzureStorageContainer : The remote name could not be resolved: ‘’

Now if we have used the correct name for an existing Storage Account we can create the Storage Container using the generated Storage Context:
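For example:

New-AzureStorageContainer -Name 'testcontainer' -Permission Off -Context $storageContext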

Please leave a comment if I have missed an easier way to do it, I’d love to know 🙂

New-AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name

New-AzureRmResourceGroupDeployment generates the following error:

New-AzureRmResourceGroupDeployment `
-Name $resourceDeploymentName `
-ResourceGroupName $resourceGroupName `
-TemplateFile $template `
@additionalParameters `
-Verbose -Force
New-AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name ‘xxxxxxxxxxx’.
At line:5 char:5
+ @additionalParameters `
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [New-AzureRmResourceGroupDeployment], ParameterBindingException
+ FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDeploymentCmdlet

This kind of error seems fairly in tune with the experience I have had so far with the AzureRM PowerShell module, i.e. the error has seemingly nothing to do with the actual problem. I spent a fair amount of time checking the parameter ‘xxxxxxxxxxx’ in the ARM JSON file and found nothing wrong; it turned out that a syntax error elsewhere in the file was causing the problem. An error message pointing to that kind of problem would have been a lot more helpful!
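As an aside (my suggestion rather than part of the original troubleshooting), validating the deployment first can surface template problems before you attempt the real thing:

# Validate the template and parameters without deploying anything
Test-AzureRmResourceGroupDeployment `
    -ResourceGroupName $resourceGroupName `
    -TemplateFile $template `
    @additionalParameters `
    -Verbose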

Solve the syntax issue and this error goes away.

PowervRA – Now available on OS X and Linux via PowerShell Core!

Back Story

For a while Craig and I have had a number of requests about offering OS X and Linux support for PowervRA, particularly since (in case you weren’t aware) PowerShell is now available on those operating systems and third-party modules such as PowerCLI are heading towards supporting them. We first looked at offering this support for PowervRA when the first alpha release of PowerShell Core shipped, but we were blocked by a couple of issues, particularly this one regarding certificate checking.

However, back in December I read about how the guys who maintain PowerNSX had been able to offer PowerShell Core support and they had also been blocked by that same issue which has now been resolved. So we updated to PowerShell Core Alpha 14 and started testing again – it seemed that another blocking issue around JSON responses had also been resolved, so things were looking good.

As it turned out, there weren’t actually that many changes which needed to be made on our side – mostly changes to the Connect-vRAServer and Invoke-vRARestMethod functions to behave differently depending on whether PowerShell Core is being used. The scale of community feedback on the alpha releases of PowerShell Core and the efforts of the PowerShell team at Microsoft appear to have had a great impact, covering off a lot of the issues we might otherwise have hit, and that feedback has been quickly taken into subsequent alpha releases.

We benefited from having previously invested quality time in producing integration tests for PowervRA; consequently we were able to run the same tests using a PowerShell Core client and only ended up with a couple of bugs that we are currently unable to resolve (here and here), but it looks like at least one of them is scheduled to be fixed for us in a PowerShell Core beta milestone.

So in release 2.0.0 of PowervRA we are very pleased to bring you support for PowerShell Core!


You will need:

PowerShell Core Alpha 14 or later. Instructions on getting it installed for different OS flavours can be found here.

PowervRA 2.0.0 or later. Get a copy of PowervRA onto the Linux or OS X machine you want to run it from. Use the following to download it from the PowerShell Gallery:
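For example, assuming the PowerShellGet cmdlets are available in your PowerShell Core install:

Install-Module -Name PowervRA -Scope CurrentUser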

or manually copy the module yourself to one of the locations listed in $env:PSModulePath, for example:
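On Linux or OS X the per-user module location is typically:

~/.local/share/powershell/Modules/PowervRA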

In Action


Here’s PowervRA on my Macbook:

Connect to vRA

Retrieve Blueprint Details

Update a Reservation Policy
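As a rough sketch of what the screenshots show (server, credentials and object names are placeholders, and the Set-vRAReservationPolicy parameters are assumptions – check Get-Help for the exact signatures):

# Connect to the vRA appliance, skipping certificate checks for a lab environment
Connect-vRAServer -Server vra.lab.local -Tenant vsphere.local -Credential (Get-Credential) -IgnoreCertRequirements

# Retrieve Blueprint details
Get-vRABlueprint -Name 'CentOS'

# Update a Reservation Policy
Get-vRAReservationPolicy -Name 'Policy01' | Set-vRAReservationPolicy -Description 'Updated from PowerShell Core'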

Ubuntu 16.04

Here’s PowervRA on Ubuntu 16.04:

Connect to vRA

Retrieve Business Group Details

Update a Network Profile



Craig has done some cool work to make PowervRA available via Docker. Check out his blog post for more details.

Also, many thanks to Alan Renouf for suggesting PowervRA now be made available hosted in the PowerCLI Core Docker Hub too.

Side Note

In PowervRA 2.0.0 we have also made some under-the-hood changes that are worth being aware of (check the changelog for more details):

  • Module Restructure Part 1: we changed the functions from being their own individual nested modules in *.psm1 files to simply being *.ps1 files that are made part of the module in a different way. Packaging each function as a nested module was the way I had historically put my modules together; I have now spent some time moving to a better approach.
  • Module Restructure Part 2: a number of functions had been marked as deprecated in recent releases; they have now been removed.
  • Module Restructure Part 3: we had previously started moving the functions into folders based on their API endpoint, this is now complete across all of the functions:


We believe we have covered off most issues with using PowervRA on PowerShell Core via our testing process, but if you do experience anything we have missed then please let us know here.

We are aware of one issue with running PowervRA on CentOS, which appears not to be specific to us and should get fixed upstream in .NET Core.


We’re really pleased to be able to bring this support to PowervRA, and much kudos to the PowerShell team and the wider community for making it both possible and relatively straightforward. We hope you find it useful, given we know a significant part of our potential user base are OS X users.

Also stay tuned because we are not stopping there. There is other planned new PowervRA functionality on the horizon ……

ConvertTo-Json – Working with the Depth Parameter

A couple of times I have got tripped up by the fact that the Depth parameter for ConvertTo-Json has a default value of 2. So for an object like the following, with multiple levels of sub-objects, you will have problems if you don’t specify a higher value for that parameter.
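For illustration, here's a hand-rolled object with three levels of nesting:

$object = [PSCustomObject]@{
    Name  = 'Level1'
    Child = [PSCustomObject]@{
        Name  = 'Level2'
        Child = [PSCustomObject]@{
            Name  = 'Level3'
            Value = 42
        }
    }
}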

If we send the original object through to ConvertTo-Json with the default value for Depth, then we’ll get the following and you’ll observe that only the first two levels have been dealt with properly:
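With the default Depth the third level gets stringified rather than expanded; a sketch of the sort of output you'll see:

$object | ConvertTo-Json
# {
#     "Name":  "Level1",
#     "Child": {
#                   "Name":  "Level2",
#                   "Child": "@{Name=Level3; Value=42}"
#               }
# }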

Using a Depth parameter set to level 10 we get a better result:
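For example:

$object | ConvertTo-Json -Depth 10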

So mostly it is just a case of remembering that the Depth parameter may be required. A default value of 2 seems a little low, but I guess there must be a reason for it. In practical terms I got a bit lazy with this, and rather than check what the exact value should be each time, I set a high value which I knew would never be reached, say 200. However, some changes seem to have been introduced in PowerShell since the WMF 5.1 preview / 6.0 alpha which result in the following error:

The maximum depth allowed for serialization is 100.

So it appears that there is now a maximum value of 100 for the Depth parameter. Re-working all mentions from 200 to 100 resolved the problem.


PowerShell Brickset Module – Part 3: Working with the Inventory

In part 1 of this series, we looked at how to get started with the Brickset module. In part 2 we examined how to easily download sets of instructions. Now in part 3 I’ll show you how to use the inventory features of Brickset.

When you are logged into the Brickset website you can use the inventory features to help keep track of your collection. For example if you look at a particular set, you can mark the number of copies of that set you own, or if you don’t own it, mark it down as a set that you want:

Also at the top of the page you’ll see a summary of the number of sets owned and wanted. Each is a clickable hyperlink which will provide you with a nice view of what you own or are looking to get hold of.

Included in the PowerShell Brickset module are a set of functions for working with this functionality. First of all Get-BricksetCollectionTotals will give an overview on existing totals:

Get-BricksetSetOwned will give details on owned sets:

Get-BricksetSetWanted will give details on sets wanted:
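For example, once connected (see part 1), as far as I recall each of these can be run with no further parameters:

# Overview of totals for the logged-in account
Get-BricksetCollectionTotals

# Details of owned and wanted sets
Get-BricksetSetOwned
Get-BricksetSetWanted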

Of course these are only useful if you have already populated your Brickset inventory via the website. While it’s a pretty simple experience on the website to do that for individual sets, there could be a lot of clicking to do if you have a large number of sets to upload. Step forward Set-BricksetSetOwned. In its simplest form, to add one set:
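Something like this (the SetNumber and QtyOwned parameter names are my assumptions; check Get-Help for the exact names):

# Find the set, then mark one copy of it as owned
Get-BricksetSet -SetNumber '10243-1' | Set-BricksetSetOwned -QtyOwned 1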



(Note: I don’t think the ‘xxx people own this set’ stat updates instantly whether you update your set ownership via the website or the API)

Let’s say we now want to do a bulk upload of everything we own and we have a data set of our Lego sets contained in Excel. We could take that data and fire it at Set-BricksetSetOwned to do the bulk update. Here’s an example CSV file with a small amount of data to illustrate the process:
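Something like this, with SetNumber and Quantity columns (the column names are simply what the sample code below expects):

SetNumber,Quantity
10243-1,1
75192-1,1
42055-1,2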

and the following code will mark each of those sets as owned, with the number owned as the quantity:
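A sketch, using the same assumed parameter names as above:

# Import the CSV and mark each set as owned, with the number owned as the quantity
Import-Csv -Path .\LegoSets.csv | ForEach-Object {
    Get-BricksetSet -SetNumber $_.SetNumber |
        Set-BricksetSetOwned -QtyOwned $_.Quantity
}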

and now our collection is starting to look a bit more respectable:


All that’s left for me to do is get a list of all of my owned sets into that CSV file, and uploading them to Brickset will take just a few seconds 🙂

PowerShell Brickset Module – Part 2: Downloading Instructions

In part 1 of this series, we looked at how to get started with the Brickset module. In part 2 we’ll take a look at how to easily download sets of instructions.

It may be the case that you have lost the instructions for a Lego set, or perhaps you have got hold of a set secondhand that didn’t have the instructions to accompany it. It’s possible to download instructions for an individual set via the Lego website, but with this Brickset module you can do it from the comfort of your own PowerShell session, which can be particularly handy if you need to get more than just one set of instructions.

In its simplest form you can do this for a single set:
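For example (the instructions function name is from memory, so confirm it with Get-Command -Module Brickset):

Get-BricksetSet -SetNumber '10243-1' | Get-BricksetSetInstructions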

which gives you back two URLs. The first is for part 1 of the instructions, the second for part 2 – sometimes the larger sets ship with multiple instruction books.

You could of course paste these URLs into your browser to download them, but an easier way would be this:
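A sketch, assuming the returned objects expose a URL property: pipe them to Start-Process and each one opens in the default browser:

Get-BricksetSet -SetNumber '10243-1' | Get-BricksetSetInstructions |
    ForEach-Object { Start-Process $_.URL }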

which will produce this in your default browser, a tab for each pdf:

This is quite handy, but you’ve still probably got to save them away manually into a folder somewhere for future reference.

Let’s take this a step further. By using the Theme parameter of Get-BricksetSet and the native PowerShell cmdlet Start-BitsTransfer, we can download the instructions for an entire Lego theme into a specified folder:
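A rough sketch of that, with the same assumptions about property names as above and a hypothetical 'Pirates' theme and target folder:

# Download the instruction PDFs for every set in a theme to a local folder
$folder = 'C:\LegoInstructions'
if (-not (Test-Path -Path $folder)) {
    New-Item -Path $folder -ItemType Directory | Out-Null
}

Get-BricksetSet -Theme 'Pirates' | ForEach-Object {

    $set = $_
    $count = 0

    # Some sets ship with more than one instruction book, so number each download
    $set | Get-BricksetSetInstructions | ForEach-Object {
        $count++
        Start-BitsTransfer -Source $_.URL -Destination (Join-Path $folder "$($set.Number)-$count.pdf")
    }
}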



PowerShell Brickset Module – Part 1: Getting Started

I recently gave my PowerShell Brickset Module a much needed overhaul, so thought it was worth putting a few posts out on how it works and what you can do with it. In part 1 we’ll look at getting started, including download and installation.

Brickset is an extremely useful site for keeping up-to-date with Lego based news and managing your own collection of brick based goodness. In addition to the browser based content, they also offer a SOAP based API.  My PowerShell Brickset Module takes advantage of this API and provides a number of functions for working with different parts of the site. The recent overhaul of the module includes functions for working with the inventory side of things, i.e. the part which requires a login – the previous version was based only around functions which needed an API key.

Brickset Requirements

To take advantage of all of the module functions you will require both a Brickset account and an API key. Fill out the form here for an API key and they will send you one. Currently they are free.


The easiest way to install the Brickset module is to use PowerShell v5 and get it from the PowerShell Gallery with a single command:
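For example (the Gallery name Brickset is the one I'd expect; Find-Module will confirm it):

Install-Module -Name Brickset -Scope CurrentUser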

otherwise you can get it from GitHub; instructions are in the previous post.

Once installed, import the module and check out the available commands:
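For example:

Import-Module -Name Brickset
Get-Command -Module Brickset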

Brickset Connection

One of the new functions in this release is Connect-Brickset. This replaces Set-BricksetAPIKey and handles the API key, the User login (if supplied) and the Webservice connection. Run Connect-Brickset and both your API key and user login will be available for all connections in the session, stored via the $BricksetConnection variable:
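A sketch of connecting with both an API key and a user login (the parameter names are assumptions; Get-Help Connect-Brickset will confirm them):

$credential = Get-Credential
Connect-Brickset -APIKey 'xxxxxxxxxx' -Credential $credential

# The connection details are then available for the rest of the session
$BricksetConnection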

Use Cases

You can still do all of the same things as in the previous release, such as getting the number of Lego sets per theme:
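For example (property names are from memory, so treat as illustrative):

Get-BricksetTheme | Sort-Object -Property SetCount -Descending | Select-Object -Property Theme, SetCount -First 10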

To drill down into detailed information for a range of sets, we could do something like this:
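A sketch, assuming a Year parameter alongside Theme and the usual set properties:

Get-BricksetSet -Theme 'Star Wars' -Year '2016' |
    Select-Object -Property Number, Name, Year, Pieces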

In part 2 we will look at getting hold of instruction PDFs, and in part 3 we’ll move on to some of the new functions that require a login for managing your inventory…

Join the PowerShell 10 Year Anniversary Online Event

It seems hard to believe, but this year sees PowerShell having been around for 10 years! From the early beginnings of increasing awareness and adoption, through becoming a fundamental part of Windows and now having been made Open Source, it’s been quite a journey.

To celebrate this, the PowerShell team has arranged a day of online streaming events on their Channel 9 platform: Monday November 14th from 8:00 am to 4:00 pm (PST).

There will be opportunities to hear the team members talk about how the product has evolved, and some of the MVPs talk about community involvement and the new Open Source engagement. More details can be found via the two links below and by checking out the #PowerShell10Year hashtag on Twitter:

Join the PowerShell 10th Anniversary Celebration!





Using Pester and PowervRO as a Unit Test Framework for vRO

vRealize Orchestrator doesn’t have a built-in unit test framework; however, I realised that it might be possible to use a combination of Pester and PowervRO to achieve similar results for now. Let’s take a look at an example using a very simple workflow, Workflow1. Workflow1 has two inputs, a and b, both numbers:


Workflow1 has a single scriptable task that takes the inputs a and b, multiplies them together and stores the result in c, which is output from the workflow.


We can write the following PowerShell-based test using the Pester framework and PowervRO to check that the result of running Workflow1 with inputs a and b is the value we supply for c. We make a connection to the vRO server in question, invoke the workflow and then check the result:
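A sketch of such a test. The PowervRO function names and output shapes here are assumptions based on the module's typical patterns (check Get-Help Invoke-vROWorkflow and friends for the exact signatures), and the values come from the JSON file described below:

# Workflow1.Tests.ps1 - a minimal sketch, not a definitive implementation
$Variables = Get-Content -Path .\Variables.json -Raw | ConvertFrom-Json

Describe -Name 'Workflow1 Tests' -Fixture {

    It -Name "Multiplies a: $($Variables.a) by b: $($Variables.b) to give c: $($Variables.c)" -Test {

        # Connect to the vRO server under test
        Connect-vROServer -Server $Variables.vROServer -Credential (Get-Credential) -IgnoreCertRequirements | Out-Null

        # Build the workflow inputs and invoke the workflow
        # (New-vROParameterDefinition and the property names used below are assumptions)
        $Parameters = @(
            New-vROParameterDefinition -Name 'a' -Value $Variables.a -Type number
            New-vROParameterDefinition -Name 'b' -Value $Variables.b -Type number
        )
        $Workflow = Get-vROWorkflow -Name 'Workflow1'
        $Execution = Invoke-vROWorkflow -Id $Workflow.ID -Parameters $Parameters

        # Retrieve the output parameter c and compare it with the expected value
        $Output = Get-vROWorkflowExecutionResult -Id $Workflow.ID -ExecutionId $Execution.Execution
        ($Output | Where-Object {$_.Name -eq 'c'}).Value | Should Be $Variables.c
    }
}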

We can supply variables to the test via a JSON file, so it’s simple to then take this test to other vRO servers, or change the values we are testing:
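The accompanying Variables.json might look like this (server name and values are placeholders):

{
    "vROServer": "vro01.lab.local",
    "a": 2,
    "b": 3,
    "c": 6
}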

Now we can invoke the Pester tests and check the results:
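For example:

Invoke-Pester -Script .\Workflow1.Tests.ps1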


Obviously this process could be scaled out to many more workflows with different inputs and outputs.


Create Blueprints in vRA7 with PowervRA

Update 29/09/2016:

The API documentation for importing a vRA Content Package contains a warning:

At this point, we don’t support any form of rollback strategies. A failed import may potentially leave the system in an inconsistent state. Hence, its highly recommend to run a precheck/dry-run before the import to validate the package. See HTTP POST /api/packages/validate for more details. This will help catch most of the errors upfront.


Consequently, in release 1.3.1 we have added a new function, Test-vRAContentPackage, and included it by default in Import-vRAContentPackage. This should mitigate any issues with importing a badly crafted Content Package, but you should of course test this before using it in Production……


A while back I wrote a post “Create Blueprints in vRA 7 via REST and via vRO” , with some details around automating vRA 7 Blueprint creation. Since that time Craig and I have published PowervRA, but in the initial releases I had some difficulty with providing the same functionality for PowerShell as I had in the previous post with REST and vRO.

However, thanks to some ninja skills from Craig in our other project PowervRO, I was able to take the work we did over there around importing vRO packages / workflows etc. via PowerShell and re-use it in PowervRA for a new function, Import-vRAContentPackage, which is available in the latest release, PowervRA 1.3.0. Previous releases contained New-, Get-, Remove- and Export-vRAContentPackage; we just did not have the last piece of the puzzle: Import.


So here’s an example of how it works. In the previous article I showed how vRA Blueprints are bundled up into Content Packages and then exported to a zip file containing YAML files which each describe the Blueprints. We need to do the same when automating this process with PowerShell and PowervRA, so first of all we need to know the IDs of any Blueprints to add to the Content Package:
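For example:

Get-vRABlueprint | Select-Object -Property Name, Id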

Now we create a Content Package containing the Id of the centos Blueprint that we wish to export:
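Something along these lines; the BlueprintId parameter name is an assumption, so check Get-Help New-vRAContentPackage:

$blueprint = Get-vRABlueprint -Name 'centos'
New-vRAContentPackage -Name 'CentosPackage' -BlueprintId $blueprint.Id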

and then export that Content Package to a zip file:
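For example (the Path parameter is an assumption):

Get-vRAContentPackage -Name 'CentosPackage' | Export-vRAContentPackage -Path C:\Packages\CentosPackage.zip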

Take a look at the contents of the zip file and you will see that it contains a metadata.yaml file and, in a composite-blueprint folder, a yaml file per Blueprint:



Take a look at the contents of the centos.yaml file to see how a Blueprint is described:

Now we can either modify that file and import back into the same system if we want to change the existing Blueprint or we can take it further if we want to create more Blueprints. Let’s say we want to add a second, similar Blueprint. All we need to do is copy the existing centos.yaml file, make changes in it (I’ve just given larger CPU and memory values), then update the metadata.yaml file to reference the extra file. So they would end up like this:



Now create a new zip file containing the updated metadata.yaml file and the two blueprint yaml files:
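On PowerShell v5 or later, Compress-Archive will do the job, as long as the new zip keeps the same layout as the exported one (metadata.yaml at the root, the Blueprint yaml files under composite-blueprint):

Compress-Archive -Path .\metadata.yaml, .\composite-blueprint -DestinationPath .\CentosPackage2.zip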


We can then import that zip file into a vRA Tenant. I’m going to use the Tenant that currently contains no Blueprints:


Import the content package:
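For example (the File parameter name is an assumption; per the update note above, 1.3.1 also validates the package for you, or you can call Test-vRAContentPackage yourself first):

Test-vRAContentPackage -File C:\Packages\CentosPackage2.zip
Import-vRAContentPackage -File C:\Packages\CentosPackage2.zip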

and we have two Blueprints 🙂


centosb has those higher resource settings of 2 CPUs and 2048 MB memory which we changed in the yaml file: