Category Archives: Azure

Deploying ARM Templates with Artifacts Located in a Private GitHub Repository

Written by Tao Yang

Background

I have spent the last few days authoring an Azure Resource Manager (ARM) template that is stored in a private GitHub repository. It contains several nested templates, one of which deploys an Azure Automation account with several runbooks. The location of nested templates and Azure Automation runbooks must be a URI, so these artifacts have to live somewhere that Azure Resource Manager can reach. There are many good examples in the Azure Quickstart Templates GitHub repository; for example, in the oms-all-deploy template, these artifacts are referenced as URIs pointing to the raw content in GitHub:

Nested Templates:

Azure Automation runbook:

As you can see from the examples above, the template references these artifacts as raw file content on GitHub (hosted at https://raw.githubusercontent.com). This method works well when your templates and artifacts are stored in a public GitHub repository, but it won't work when they are stored in a private repository. For example, if I click the “Raw” button on a file stored in a private repo, you will see that a token has been added to the raw file content URI.

This token is generated for your current session once you've logged in to GitHub. Although, as most of you already know, you can generate a Personal Access Token in GitHub, you cannot simply replace the session token in the URL with the Personal Access Token you have generated. The personal access token must be added to the Authorization header of the HTTP request, and you also need to add another header, “Accept”: “application/vnd.github.VERSION.raw”. Obviously we cannot add HTTP headers in ARM templates, therefore we cannot reference artifacts located in private GitHub repositories out of the box. Someone has already started a thread in the MSDN forum: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/54876f00-4bf4-4d84-81af-e7a1128ac6f7/linking-arm-templates-in-private-github-repo?forum=windowsazurewebsitespreview. The workaround proposed there was to build an HTTP proxy server that adds the required headers. To me this seems awfully complicated, and it also limits you to deploying the templates from networks that can reach such a proxy server.
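To illustrate the point, here is a minimal PowerShell sketch (the repository path and token are placeholders) of how a raw file in a private repository can be fetched when you are able to supply the headers yourself:

    # The personal access token goes into the Authorization header, together with the
    # raw-content Accept header - neither of which an ARM template can supply.
    $token = '<your GitHub personal access token>'
    $uri   = 'https://raw.githubusercontent.com/<user>/<private repo>/master/azuredeploy.json'
    $headers = @{
        'Authorization' = "token $token"
        'Accept'        = 'application/vnd.github.VERSION.raw'
    }
    Invoke-RestMethod -Uri $uri -Headers $headers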

Solution

I spent most of my day yesterday trying to figure out whether there is a way to deploy artifacts located in a private GitHub repository, and I couldn't find a way to use the Personal Access Token as a parameter in the HTTP GET request. Just as I had accepted that it couldn't be done and started removing all nested templates from my project, I had a light bulb moment while driving to the supermarket yesterday afternoon, and the idea I came up with is super simple: since I cannot pass the personal access token in the URL, I can write a very simple “proxy” Azure Function app that accepts the GitHub personal access token as a URL parameter, constructs the HTTP headers, makes the request to GitHub, and returns the HTTP response it receives from GitHub. Once this function is written, we can use the URI to the function as the artifact location instead of the GitHub URI.

GitHubPrivateRepoFileFetcher Azure Function

I've written something similar before, so this HTTP-triggered C# Azure Function literally took me 10 minutes to write. I named this function GitHubPrivateRepoFileFetcher.

This function requires 2 parameters to be passed in from the URL:

  • githuburi – the original URI to the raw GitHub file content
  • githubaccesstoken – the GitHub personal access token you have generated

Since the function does not contain any sensitive information, and I’d like to ensure the function URI is not too long, I have configured the function Authorization level to Anonymous so I don’t have to use an authorization code to invoke the Azure Function.
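The C# source itself is not reproduced in this post, but the logic is tiny. A hypothetical PowerShell equivalent of the proxy logic (parameter names match the two URL parameters listed above) would look something like this:

    # Hypothetical sketch of the proxy logic (the author's actual function is C#).
    # The HTTP trigger receives githuburi and githubaccesstoken from the query string,
    # adds the headers GitHub requires, and returns the raw file content to the caller.
    Function Get-GitHubPrivateRepoFile
    {
        Param (
            [string]$GitHubUri,        # the original raw.githubusercontent.com URI
            [string]$GitHubAccessToken # the GitHub personal access token
        )
        $headers = @{
            'Authorization' = "token $GitHubAccessToken"
            'Accept'        = 'application/vnd.github.VERSION.raw'
        }
        # return the file content exactly as GitHub serves it
        Invoke-RestMethod -Uri $GitHubUri -Headers $headers
    }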

To call this function, you need to construct the URL as follows:

https://<Function App Name>.azurewebsites.net/api/GitHubPrivateRepoFileFetcher?githuburi=https://raw.githubusercontent.com/<GitHub User Name>/<Repository>/<branch>/<path to the file>&githubaccesstoken=<GitHub Personal Access Token>

For example:

https://myfunctionapp.azurewebsites.net/api/GitHubPrivateRepoFileFetcher?githuburi=https://raw.githubusercontent.com/tyconsulting/TestPrivateRepo/master/DemoNestedTemplates/azuredeploy.json&githubaccesstoken=e82dc3df60b92147c81a9924042da8d7f0bc78c8

GitHub Personal Access Token

Before you start using the GitHubPrivateRepoFileFetcher function, you will first need to generate a GitHub personal access token if you don't already have one. The token must have access to the repo scope as shown below:

Constructing ARM Templates

In the ARM template, define the following parameters (and set the default values to match your environment):

Define the linked template URL in the variables section:

For the Azure Automation runbook that I am planning to deploy, define the URI to the script in the variables section similar to the previous example:

For the Azure Automation runbook resource, use the variable defined above in the script uri property:
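The original template snippets are not reproduced here; a hypothetical fragment (parameter, variable and URL values are illustrative only) might look like this:

    {
      "parameters": {
        "functionBaseUri": {
          "type": "string",
          "defaultValue": "https://myfunctionapp.azurewebsites.net/api/GitHubPrivateRepoFileFetcher?githuburi="
        },
        "githubAccessToken": {
          "type": "securestring"
        }
      },
      "variables": {
        "nestedTemplateUri": "[concat(parameters('functionBaseUri'), 'https://raw.githubusercontent.com/<user>/<repo>/master/nestedtemplates/azuredeploy.json&githubaccesstoken=', parameters('githubAccessToken'))]",
        "runbookScriptUri": "[concat(parameters('functionBaseUri'), 'https://raw.githubusercontent.com/<user>/<repo>/master/scripts/MyRunbook.ps1&githubaccesstoken=', parameters('githubAccessToken'))]"
      }
    }

The nested deployment resource would then reference "[variables('nestedTemplateUri')]" in its templateLink uri, and the runbook resource would reference "[variables('runbookScriptUri')]" in its publishContentLink uri.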

GitHub README.md

Most ARM template repositories contain the “Deploy to Azure” and “Visualize” buttons as shown below:

To add the “Deploy to Azure” button to your README.md markdown file, you will need to add the following code to the markdown (modify the URL to suit your environment):

For the “Visualize” button, add the following code in the markdown:

Basically, add the Azure Function URL (with its required parameters) as the parameter to the Azure custom deployment and armviz.io URIs. You will need to encode the function URI; you can use an online encoder utility such as http://www.url-encode-decode.com/ to encode the original Azure Function URI.
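The markdown snippets themselves are not shown above; the form commonly used in the Azure Quickstart templates looks like this (replace the placeholder with your own URL-encoded Azure Function URI):

    [![Deploy to Azure](http://azuredeploy.net/deploybutton.png)](https://portal.azure.com/#create/Microsoft.Template/uri/<URL-encoded Azure Function URI>)

    [![Visualize](http://armviz.io/visualizebutton.png)](http://armviz.io/#/?load=<URL-encoded Azure Function URI>)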

When you click the “Deploy to Azure” button, you will be redirected to the Azure portal:

When the deployment is completed, the resources defined in the ARM templates are created:

Conclusion

By using the “proxy” Azure Function GitHubPrivateRepoFileFetcher, you can easily retrieve the content of a file in a private GitHub repo without having to add custom headers to the HTTP request. This “proxy” Azure Function is generic, so you can use it with any Azure subscription and any private GitHub repository. If you regularly use private GitHub repositories for ARM templates, I strongly recommend creating such an Azure Function to assist with your deployments.

If you have any questions or recommendations, please feel free to contact me.

Using Postman to Invoke Azure Resource Management APIs

Written by Tao Yang

When working with REST APIs, Postman (https://getpostman.com) is a popular tool that needs no further introduction. This week, I've been pretty busy working on the upcoming Inside OMS V2 book, and I'm currently focusing on the various OMS REST APIs for the Custom Solutions chapter. I want to use Postman to test and demonstrate how to use the OMS REST APIs. Since most of the ARM-based APIs require an OAuth token in the authorization header, I needed to configure Postman to contact the Azure AD OAuth 2.0 endpoints in order to generate the token for the API calls.

Initially, I thought this would be very straightforward and would have been done by other people in the past. I did find several posts via my favourite search engine, which were good enough to get me started, but I was not able to find one that explains how to configure Postman to request the token natively without using external processes to generate the token. Therefore, I'm going to document what I have done here so I can reference this post in the Inside OMS book.

Note: I'm using the Windows desktop version of Postman; the UI may be slightly different from the Chrome extension version.

Step 1: Create an Azure AD application and service principal for Postman.

I have automated the creation process using a PowerShell script shown below:
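The original script is not reproduced here; a hypothetical sketch using the AzureRM cmdlets of that era (display name, URIs and the secret are placeholders) might look like this:

    # Sketch only: create an Azure AD application and service principal for Postman.
    Login-AzureRmAccount
    $appName       = 'Postman'
    $homePage      = 'https://www.getpostman.com'
    $identifierUri = 'https://<your tenant>.onmicrosoft.com/postman'
    $replyUrl      = 'https://www.getpostman.com/oauth2/callback'   # Postman's OAuth callback
    $clientSecret  = '<a long random client secret>'

    $app = New-AzureRmADApplication -DisplayName $appName -HomePage $homePage `
               -IdentifierUris $identifierUri -ReplyUrls $replyUrl -Password $clientSecret
    $sp  = New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

    # Postman will need these two values later as the Client ID and Client Secret
    Write-Output "Client ID:     $($app.ApplicationId)"
    Write-Output "Client Secret: $clientSecret"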

Step 2: Grant ‘Postman’ application permission to the Windows Azure Service Management API.

Note: the steps demonstrated below MUST be completed in the Azure classic portal. Based on my experience, I was not able to give the Azure AD application permission to the “Windows Azure Service Management API” from the new ARM portal.

Once the ‘Postman’ Azure AD application is created, log on to the Azure classic portal (https://manage.windowsazure.com), browse to Azure AD, select the directory where the application was created, then go to Applications, show “Applications my company owns”, and locate the “Postman” application.

Click the “Postman” application, go to “Configure” tab, and click “Add Application”.

Add “Windows Azure Service Management API”

Tick “Access Azure Service Management as Organization users” under the “Delegated Permissions” drop down list and then click on “Save”.

Step 3: Configure Authorization in Postman.

In Postman, enter a URI for an ARM REST API call. In this example, I'll use the OMS REST API to retrieve a list of workspaces. Here's the URI I'm using: https://management.azure.com/subscriptions/{subscription id}/providers/microsoft.operationalinsights/workspaces?api-version=2015-03-20

Make sure the HTTP method is set to “GET”, then click on Authorization. From the “Type” drop-down list, select OAuth 2.0.

Click on “Get New Access Token”.

Enter the following information in the “Get New Access Token” popup window:

Token Name: AAD Token

Auth URL: https://login.microsoftonline.com/common/oauth2/authorize?resource=https%3A%2F%2Fmanagement.azure.com%2F

Access Token URL: https://login.microsoftonline.com/common/oauth2/token

Client ID: <output from the script in Step 1>

Client Secret: <output from the script in Step 1>

Grant Type: Authorization Code

Make sure “Request access token locally” checkbox is unchecked.

Click on “Request Token” and you will get the Azure AD sign-in page. Enter the credentials of an organizational account – based on my experience, Microsoft accounts (e.g. @outlook.com) do not always work.

If everything goes as planned, you will see a new token has been generated. You need to select the token and add it to the request header:

Now, go to the “Headers” tab; you should see the Authorization header:

You may need to add additional headers to the request depending on the requirements of the API, but in this case I don't need any; I can just call the API by clicking the Send button.

Now I can see all my OMS workspaces in this particular subscription listed in the response body.

This concludes the walkthrough. Please do let me know if you run into issues.

Lastly, this post was a great help to me (although it's for Dynamics CRM, not Azure, it pointed me in the right direction): https://blogs.msdn.microsoft.com/devkeydet/2016/03/22/using-postman-with-azure-ad/

Be Cautious When Designing Your Automation Solution that Involves Azure Automation Azure Runbook Workers

Written by Tao Yang

Over the last few weeks, it has happened twice that I had to change the original design of the automation solutions I was working on because of the limitations of Azure Automation Azure runbook workers. Last month, my fellow CDM MVP Michael Rueefli published an article explaining why deploying Hybrid Runbook Workers on Azure makes sense. In Michael's article, he listed some infrastructural differences between Azure runbook workers and Hybrid runbook workers. However, the issues that I faced and that made me change my design were caused by functional limitations in Azure runbook workers, so I thought I'd write a supplementary post to document my findings. Since these limitations are not documented on Microsoft's official documentation site, please DO NOT assume this is the complete list, or that it is still accurate by the time you read it. I will try my best to keep this post up to date over time. So, here are the limitations I found:

01. Windows Event Log Service is missing

A few months ago, I wrote a runbook that reads event log export .evt files and injects the records into OMS (http://blog.tyang.org/2016/12/05/injecting-event-log-export-from-evtx-files-to-oms-log-analytics/). This runbook uses the .NET class System.Diagnostics.Eventing.Reader.EventLogQuery to read events from the .evt files. A few days ago, when I tried to implement a version of this runbook for a solution that I had initially planned to run on Azure runbook workers, the runbook job failed when configured to run on Azure and I got an “RPC Server is unavailable” error.

After some troubleshooting, I found the cause of this error. The .NET class EventLogQuery relies on the Windows Event Log service to read the .evt files, but in the Azure runbook worker's sandbox environment the Windows Event Log service does not exist. Therefore, I had no choice but to change the design and have this runbook executed on a Hybrid runbook worker.
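For reference, the kind of code involved looks roughly like this (the file path is illustrative); it works on a Hybrid runbook worker but produces the RPC error above on an Azure runbook worker:

    # Reading events from an exported .evt/.evtx file with the EventLogQuery .NET class.
    # This class depends on the Windows Event Log service, which does not exist in the
    # Azure runbook worker sandbox - hence the "RPC Server is unavailable" error there.
    $evtFilePath = 'C:\Temp\EventLogExport.evtx'
    $query  = New-Object -TypeName System.Diagnostics.Eventing.Reader.EventLogQuery `
                  -ArgumentList $evtFilePath, ([System.Diagnostics.Eventing.Reader.PathType]::FilePath)
    $reader = New-Object -TypeName System.Diagnostics.Eventing.Reader.EventLogReader -ArgumentList $query
    while (($record = $reader.ReadEvent()) -ne $null)
    {
        # process each event record, e.g. convert it to XML before injecting into OMS
        $recordXml = $record.ToXml()
    }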

02. Unable to Map Network Drives

In a runbook, I have a code block that maps a network drive to an Azure Storage file share and reads some files. When I ran it on the Azure runbook worker, I found it was not possible to map the network drive. Luckily I could use the Azure.Storage PowerShell module to access the files in the Azure file share, so I had to update the runbook to accommodate this limitation.
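As an illustration (account, share and file names are placeholders), the Azure.Storage approach looks like this:

    # Access an Azure Files share without mapping a network drive, using the Azure.Storage module
    $ctx = New-AzureStorageContext -StorageAccountName '<storage account>' -StorageAccountKey '<account key>'

    # list the files in the share, then download one of them to the worker's temp folder
    Get-AzureStorageFile -ShareName 'myshare' -Context $ctx
    Get-AzureStorageFileContent -ShareName 'myshare' -Path 'configs/settings.json' `
        -Destination (Join-Path $env:TEMP 'settings.json') -Context $ctx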

03. Disk Size Limitation

In a runbook, I needed to extract a 2GB zip file. It failed to run on the Azure runbook worker because of insufficient disk space, and I couldn't even copy the 2GB file to the Azure runbook worker. I attempted to find the size of the system drive on the Azure runbook workers by using the Get-PSDrive cmdlet, but it did not return the disk size.

04. Unable to use Service Principals and Credentials to Login to Azure

In a solution that I was working on, I planned to use Service Principals (Azure AD applications) and credentials to log in to Azure. This method worked perfectly when the runbook was executed on the Hybrid runbook worker, but it does not work when running on Azure. After some research, I found someone who had the same issue and posted it on Stack Overflow: http://stackoverflow.com/questions/37619377/login-azurermaccount-credential-immediately-expiring. Basically, using credentials with service principals is not supported when the runbook is executed on Azure runbook workers.
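For clarity, this is the login pattern in question (IDs and the key are placeholders); it works on Hybrid runbook workers but fails on Azure runbook workers:

    # Log in to Azure with a service principal (Azure AD application) and a credential object
    $securePassword = ConvertTo-SecureString '<application key>' -AsPlainText -Force
    $cred = New-Object System.Management.Automation.PSCredential ('<application id>', $securePassword)
    Add-AzureRmAccount -ServicePrincipal -Credential $cred -TenantId '<tenant id>'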

Based on my experiences so far, I've come to the conclusion that I should not automatically assume everything is going to work on Azure runbook workers when designing my solutions. In the future, I will make sure to test early and at every step along the way if I am planning to have a runbook executed on Azure runbook workers.

If you have experienced limitations that are not listed here, please let me know and I’m happy to add them onto this post.

Using Azure Key Vault as the Password Repository For You and Your Team

Written by Tao Yang

Over the past decade, I have used several password management applications such as Password Safe, KeePass and LastPass. Out of these products, only LastPass is cloud based. I had been hesitant to use LastPass over the last few years and stayed with KeePass because of the LastPass data breach back in 2015. A few months ago, my friend Alex Verkinderen finally convinced me to start using LastPass again. But this time, in order to be more secure and able to use Multi-Factor Authentication (MFA), I purchased a premium account and also a YubiKey Neo for MFA. I understand not everyone is willing to spend money on password repository solutions (in my case, USD $12 per year for the LastPass Premium account and USD $50 + shipping for a YubiKey Neo from Amazon). Also, based on my personal experience, there are still many organisations that don't have a centralised password repository. Many engineers and consultants I have met still store passwords in clear text.

On the other hand, Azure Key Vault has drawn a lot of attention since it was released and it has become really popular. I have certainly used it a lot over the last few months and have managed to integrate it with many solutions that I have built.

AzureKeyVaultPasswordRepo PowerShell Module

I spent a few hours last night and today and developed a PowerShell CLI menu-based app based on a few existing scripts I wrote in the past. This app allows you to create and manage an Azure Key Vault and use it as your personal (or your team's) password repository. In order to simplify the process of deploying and using this app, I wrapped it in a PowerShell module. I named this module AzureKeyVaultPasswordRepo and it is now available on both the PowerShell Gallery and GitHub:

PowerShell Gallery: https://www.powershellgallery.com/packages/AzureKeyVaultPasswordRepo/

GitHub: https://github.com/tyconsulting/AzureKeyVaultPasswordRepo-PSModule/releases/tag/1.0.0

If you are running PowerShell version 5 or later, you can install this module using a one-liner:

Install-Module AzureKeyVaultPasswordRepo

Once it is installed, you can launch the app either by using the full name Invoke-AzureKeyVaultPasswordRepository or by using one of the two shorter aliases (ipr and Start-PasswordRepo).

Initial Setup

This module requires AzureRm.Profile, AzureRm.Resources and AzureRm.KeyVault modules, which you can also find from the PowerShell Gallery.

When launched, it will detect whether you are currently signed in to Azure and, if so, ask whether you want to keep using the same account.

You have the option to keep using the current account or sign in to Azure using another account.

Then the app will prompt you to use the current Azure subscription that’s set in the context, or select another subscription from the list.

When running it for the first time, you will need to create a new Key Vault from the menu. You can choose an existing resource group, or create a new resource group in the Azure region of your choice.

Once the key vault is created, you will need to assign full access to an Azure AD account. This is done by searching Azure AD using a search string and selecting a user account from the search results.

Once the permission is assigned, everything is ready to go. You will be presented with the main menu:

Note: It is by design that this app does not use any existing key vaults that you may already have in your subscription. You have to create a new one. Any existing key vaults that are not created by this app will not appear on the list for you to choose.

Creating Profile to store settings

To make accessing this key vault as fast as possible in the future, the first thing I'd suggest is to select option 4 and save the Azure subscription Id and Key Vault name in your profile. This profile is stored in the Windows Registry under HKEY_CURRENT_USER\SOFTWARE\TYConsulting\AzureKeyVaultPasswordRepo\Profiles\<your Azure account name>.

Once the profile is saved, when you launch this app next time, it will automatically use the Azure subscription and the Key Vault that’s stored in the profile.

Creating Credentials

From the main menu, you have the option to:

1. Create a new credential (user name and password). You also have the option to generate a random password by not entering a password. If you choose to let the app generate a random password, the password will be copied to the computer's clipboard once the credential is created (so you can use Ctrl-V to paste it wherever you need to).

List, Retrieve, Update and Delete Credentials

You can use Option 2 to list, retrieve, update and delete existing credentials. When option 2 is selected, the app will list all credentials stored in the key vault, and from there, you can choose the credential from the list that you are interested in. Once the credential is selected, you have the option to:

  1. Copy user name to clipboard
  2. Copy password to clipboard
  3. Update credential (username / password)
  4. Delete Credential

Search Credential

Instead of selecting a credential to manage from the list, you can also search credentials (based on credential name) using option 3.

Save / Delete Profile

As shown previously, for faster access you can use option 4 to store the Azure subscription Id and the Key Vault name in the registry. If you decide to delete the profile (e.g. when you decide to use another subscription or key vault), you can use option 5 to delete the existing profile from the registry.

Managing Key Vault Access

You can use option 6 to grant full access to the key vault to other Azure AD accounts, or use option 7 to remove access.

Conclusion

Personally I'm pretty happy with what I have produced in such a short period of time (only a few hours, based on some existing scripts I wrote a few weeks ago). I think this would fill a gap for people and organisations that do not have a commercial password management solution.

Azure Key Vault is a very inexpensive solution, and by using an Azure offering, you automatically inherit the MFA options that you have configured for Azure / Azure AD. For example, I'm not using Azure AD Premium for my lab, but for my Microsoft (@outlook.com) account I have enabled MFA using the Microsoft Authenticator app. Therefore, in order to access the Key Vault using this module, I need to use MFA during the sign-in process.

I've only spent a few hours on this PowerShell module, so there is still room for improvement. Consider this an MVP (Minimum Viable Product). I think the following additions would be beneficial in future releases (if I decide to develop it further):

  • A GUI interface
  • Support additional types of sensitive information, not just username and passwords
  • Support Service Principal (Azure AD Applications)
  • Support different levels of access (currently everyone has full access)

Lastly, please give it a try; I'd like to hear back from the community. If you are interested in learning how to interact with Key Vault using PowerShell, feel free to read the source code of this module. If you have any questions or suggestions, please feel free to contact me!

Managing Azure Automation Module Assets Using MyGet

Written by Tao Yang

Background

Managing the life cycle of PowerShell module assets in your Azure Automation accounts can be challenging. If  you are currently using Azure Automation, you may have already noticed the following behaviours when managing the module assets:

1. It is difficult to automate the module asset deployment process.

If you want to automate module deployment to your Automation account (e.g. using the PowerShell cmdlet New-AzureRmAutomationModule), you must ensure the module that you are trying to import is packaged into a zip file and placed in a public location that Azure Automation can read via HTTP (e.g. Azure Blob storage). In my opinion, this is overcomplicated.
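For example, importing a module today looks something like this (the zip must already be sitting at a URL that Azure Automation can reach; all names are placeholders):

    # The module must be zipped and hosted somewhere reachable over HTTP (e.g. blob storage)
    New-AzureRmAutomationModule -ResourceGroupName '<resource group>' `
        -AutomationAccountName '<automation account>' `
        -Name 'MyModule' `
        -ContentLink 'https://mystorageaccount.blob.core.windows.net/modules/MyModule.zip'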

2. Modules are not deployed to the Hybrid Workers automatically

If you are using Hybrid Workers, you must also manage the modules separately. Unlike Azure runbook workers, Azure Automation does not automatically deploy modules to Hybrid Workers. This means when you import a module to your Azure Automation account, you must also manually deploy it to your Hybrid Worker computers.

3. Difficult to maintain module version consistencies.

Since managing modules in your Azure Automation accounts and on your hybrid workers involves two separate processes, it is hard to make sure the module versions are consistent between your automation account and hybrid worker computers.

Over the past few months, I have invested a lot of my time in MyGet, looking for ways to close these gaps. A few months ago, I released a PowerShell DSC resource module called cPowerShellPackageManagement (http://blog.tyang.org/2016/09/15/powershell-dsc-resource-for-managing-repositories-and-modules/). By using this DSC resource module, we can easily develop DSC configurations for computers (such as Hybrid Workers) to automatically install modules from a PowerShell module repository (i.e. a MyGet feed). This approach closes the gap for managing Hybrid Worker computers (item #2 on the list above). Today, I am going to discuss how we can tackle items #1 and #3. Before I start talking about my solutions, let me quickly introduce MyGet first.

What is MyGet?

MyGet (www.myget.org) is a SaaS-based package repository hosted in the cloud. It supports all the popular package providers such as NuGet, npm, etc. It can host both private and public repositories (called feeds) for you or your organisation.

If you come from a developer or DevOps background, you may have already heard about MyGet in the past, or have used similar on-premises package repositories (such as ProGet). If you are an IT Pro, since you are reading this blog post right now, you must be familiar with PowerShell and therefore must have heard of or used the PowerShell Gallery (https://powershellgallery.com). You can use MyGet the same way as the PowerShell Gallery in PowerShell version 5 and later, except you have absolute control of the content in your feeds. Also, if you are using a paid MyGet account, you can have private feeds and you can control access by issuing API keys. You can also create multiple feeds that contain different packages (PowerShell modules in this case). For example, if you develop PowerShell modules, you can have a Dev feed to use during development, and also Test and Production feeds for testing and production use.

Why Do I Need MyGet?

You may be a little hesitant to use the PowerShell Gallery because it is 100% public. As a regular user like everyone else, you can do very little. For example, you can publish modules to the PowerShell Gallery, but you can't guarantee your modules will stay there forever. Microsoft may decide to unlist your modules if they find problems with them (e.g. failure to comply with the PSScriptAnalyzer rules). You also don't have the ability to delete your modules from the PowerShell Gallery; you can unlist them, but they are still hosted there. To me, the PowerShell Gallery is more like a community platform that allows everyone to share their work, but you should not use it in any production environment because you don't have any control over the content: how can you make sure the content you need is going to be there tomorrow?

MyGet allows you to create feeds that you have total control, and as I mentioned already, with a paid MyGet account, you can have private feeds to host your IPs that you don’t want to share with the rest of the world.

MyGet also ships with other awesome features, such as Webhook support.

Automating Module Deployment to Automation Account

I have developed a runbook that retrieves a list of modules from a repository (i.e. your MyGet feed) and imports each module into the Automation account where the runbook resides if the module does not exist or its version is lower than the latest version available from the module repository. Before importing, the runbook also tries to work out the module dependencies and imports the required modules in groups (the modules without dependencies are imported first). Here's the runbook source code:
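The full source is not reproduced in this archive; the following is a simplified, hypothetical sketch of the core sync logic only (it ignores the dependency grouping, and the feed query details may differ from the real runbook):

    # Simplified sketch - not the original runbook. It compares the latest version of each
    # package in the MyGet feed (a NuGet v2 feed) with the modules in this Automation
    # account and imports anything that is missing or out of date.
    $connection = Get-AutomationConnection -Name 'AzureRunAsConnection'
    Add-AzureRmAccount -ServicePrincipal -TenantId $connection.TenantId `
        -ApplicationId $connection.ApplicationId `
        -CertificateThumbprint $connection.CertificateThumbprint | Out-Null

    $feedUri       = Get-AutomationVariable -Name 'ModuleFeedLocation'
    $resourceGroup = '<automation account resource group>'   # placeholder values
    $accountName   = '<automation account name>'

    # latest version of every package in the feed (NuGet v2 OData query)
    $entries = Invoke-RestMethod -Uri "$feedUri/Packages?`$filter=IsLatestVersion"

    foreach ($entry in $entries)
    {
        $name     = $entry.title.'#text'
        $version  = $entry.properties.Version
        $existing = Get-AzureRmAutomationModule -ResourceGroupName $resourceGroup `
                        -AutomationAccountName $accountName -Name $name -ErrorAction SilentlyContinue

        if (-not $existing -or ([version]$existing.Version -lt [version]$version))
        {
            # direct download URI of the underlying NuGet package - no zipping required
            $contentLink = "$feedUri/package/$name/$version"
            New-AzureRmAutomationModule -ResourceGroupName $resourceGroup `
                -AutomationAccountName $accountName -Name $name -ContentLink $contentLink
        }
    }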

Note: this runbook does not download and zip up PowerShell modules from the repository feed. Instead, it constructs the URI to the underlying NuGet package and imports the package directly into your automation account.

In order to use the runbook, you will first need to create an automation variable:

Name: ModuleFeedLocation

Value: <the source location URI to your repository feed>
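You can create the variable in the portal, or with a one-liner like this (resource names are placeholders):

    New-AzureRmAutomationVariable -ResourceGroupName '<resource group>' -AutomationAccountName '<automation account>' `
        -Name 'ModuleFeedLocation' -Value 'https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2' -Encrypted $false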

Note: if you are not sure what the source location URI for your feed is, check out this help document from the MyGet website: http://docs.myget.org/docs/how-to/publish-a-powershell-module-to-myget. However, I don't believe the documentation is 100% accurate. Based on my experience, whether you are using a private or a public feed, the source location URI should be:

https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2

The API key is available on the MyGet portal:

If you have connected to the feed as a PowerShell repository, you can also check it using the Get-PSRepository cmdlet:

Other than the automation variable, you will also need to make sure you have the AzureRunAsConnection connection asset and the associated certificate asset created. These assets are created automatically by default when you create your Azure Automation account:

If you don’t have this connection asset, you can manually create it using PowerShell – this process is documented here: https://docs.microsoft.com/en-au/azure/automation/automation-sec-configure-azure-runas-account

Once the runbook and all required assets are in place, you will also need to create a webhook for the runbook. It is OK to configure the webhook to target Azure workers (although targeting hybrid worker group is also OK, but not necessary).

Once the webhook is created, go to the MyGet portal, open your feed, go to the Webhook section and add an HTTP POST webhook.

Then enter a description and paste the runbook webhook URL. For the webhook trigger, only tick “Package Added”:

Once the webhook trigger is created, everything is good to go. The next time you add a PowerShell module or update an existing module on your MyGet feed, the webhook will automatically trigger the Azure Automation runbook, which will find the modules that need to be imported or updated and attempt to import them one at a time.

Tips:

Once you have configured your MyGet feed as a PowerShell repository on a computer running PowerShell v5 or later, you can publish modules located on your local computer to the feed using the Publish-Module cmdlet, as shown below. You can also configure MyGet to get modules from another repository such as the PowerShell Gallery. I have blogged about this previously: http://blog.tyang.org/2016/09/20/pushing-powershell-modules-from-powershell-gallery-to-your-myget-feeds-directly/
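For example (feed name, path and API key are placeholders):

    # Register the MyGet feed as a PowerShell repository, then push a local module to it
    Register-PSRepository -Name 'MyPrivateFeed' `
        -SourceLocation  'https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2' `
        -PublishLocation 'https://www.myget.org/F/<feed-name>/api/v2/package' `
        -InstallationPolicy Trusted

    Publish-Module -Path 'C:\Source\MyModule' -Repository 'MyPrivateFeed' -NuGetApiKey '<api-key>'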

If you want to configure multiple Automation accounts to sync with a single MyGet feed, you can simply create the runbook and required assets in each automation account, and add a webhook trigger for each instance of the runbook within your MyGet feed.

Things to Watch Out For

There are a few things that you need to watch out for when using this solution:

1. Be aware of the limitations in Azure Automation

Some of these limitations may impact your module imports. You can find the official documentation here: https://docs.microsoft.com/en-us/azure/azure-subscription-service-limits#automation-limits

2. Unlike NuGet repositories such as the PowerShell Gallery and MyGet, Azure Automation does not support storing different versions of the same module

This may cause some module imports to fail. For example, say you have a module called ModuleA (version 1.0) that depends on ModuleB version 1.0, and you have ModuleA 1.0, ModuleB 1.0 and ModuleB 2.0 in your MyGet repository. The runbook will import ModuleB 2.0 into your automation account first; then, when it tries to import ModuleA 1.0, the import may fail because ModuleA 1.0 does not pass the validation test (performed by importing ModuleA 1.0 on the runbook worker computer, where only ModuleB 2.0 is available). So, prior to committing this kind of package to a feed that's being used by Azure Automation, test it first on another feed, and make sure you can successfully install and import the module on your local computer.

3. Do not load too many modules to the feed initially

Importing modules into an Azure Automation account takes a lot of time. When running a runbook job on Azure workers, the runbook can run for a maximum of 3 hours due to the fair share policy. So if you have a lot of modules to load in the beginning, you need to make sure the runbook job can complete within 3 hours, or you may have to rerun the runbook to pick up the modules that didn't get imported in the previous job. Alternatively, you can configure the runbook to run on a Hybrid Worker group, because the fair share policy does not apply when the job is executed on hybrid workers.

Conclusion

If you use a dedicated MyGet feed to host all required modules for Azure Automation, you can use the cPowerShellPackageManagement DSC resource module I mentioned earlier in this blog post to automate module deployment to Hybrid Workers. At the same time, by using the method described in this blog post, you also have the Automation account covered.

Therefore, if you have both DSC configured for Hybrid Workers (i.e using Azure Automation DSC), and have this runbook and webhook configured, by adding a new package to your MyGet feed, your entire Azure Automation infrastructure is updated automatically.

My MVP buddy Alex Verkinderen has also done some interesting integration between MyGet and the PowerShell Gallery. He is going to publish his innovation on his blog (http://www.mscloud.be/) soon, so make sure you subscribe to his blog.

Lastly, thanks to Alex for testing the runbook for me, and if anyone has any questions or suggestions, please feel free to contact me.

PowerShell Script to Import and Update Modules from PowerShell Repositories to Azure Automation

Written by Tao Yang

The PowerShell Gallery has a very cool feature that allows you to import modules directly to your Azure Automation account using the “Deploy to Azure Automation” button. However, if you want to automate the module deployment process, you most likely have to first download the module, zip it up and then upload it to a place that the Azure Automation account can access via HTTP. This is a very troublesome process.

I have written a PowerShell script that allows you to search for PowerShell modules in ANY PowerShell repository that has been registered on your computer and deploy the module DIRECTLY to an Azure Automation account without having to download it first. You can use this script to import new modules or update existing modules in your Automation account.

This script is designed to run interactively. You will be prompted to enter details such as module name, version, Azure credential, selecting Azure subscription and Azure Automation account etc.

The script works out the URI to the actual NuGet package for the module and imports it directly into the Azure Automation account. Other than the PowerShell Gallery, I have also registered a private repository hosted on MyGet.org, and I am able to deploy modules directly from my private MyGet feed to my Azure Automation account.

If you want to automate this process, you can easily make a non-interactive version of this script and parameterize all required inputs.

So, here’s the script, and feedback is welcome:

PowerShell Module for Managing Azure Table Storage Entities

Written by Tao Yang

Introduction

Firstly, apologies for not being able to blog for 6 weeks; I have been really busy lately. As part of a project that I'm working on, I have been dealing with Azure Table storage and its REST API over the last couple of weeks. I have written a few Azure Function apps in C# as well as some Azure Automation runbooks in PowerShell that involve inserting, querying and updating records (entities) in Azure tables. I was struggling a little during the development of these function apps and runbooks because I couldn't find many good code examples, and I personally believe this REST API is not well documented on Microsoft's documentation site (https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/table-service-rest-api). Therefore I have spent the last two days developing a PowerShell module for managing the lifecycle of Azure Table entities. This module can be used to perform CRUD (Create, Read, Update, Delete) operations on Azure Table entities.

AzureTableEntity PowerShell Module

This PowerShell module is named AzureTableEntity, and it can be found on both GitHub and the PowerShell Gallery:

This module offers the following 4 functions:

  • Get-AzureTableEntity – search Azure Table entities by specifying a search string.
  • New-AzureTableEntity – insert one or more entities into Azure table storage.
  • Update-AzureTableEntity – update one or more entities in Azure table storage.
  • Remove-AzureTableEntity – remove one or more entities from Azure table storage.

Note: All functions have been properly documented in the help file. You can use the Get-Help cmdlet to access the help content.

Get-AzureTableEntity

By default, when performing a query operation, the Azure Table REST API only returns up to 1,000 entities, or the entities retrieved within 5 seconds of the search. This function has a ‘-GetAll’ parameter that can be used to return all search results from a large table. The default value of this parameter is $true.

The search result returned by the search API is deserialised. As a result, complex data types such as datetime are returned as strings. If you want any datetime fields from the search result returned as the original datetime type, you can set the “-ConvertDateTimeFields” parameter to $true. Please note this can potentially increase the script execution time when dealing with a large set of search results.

Hint: You can easily build your search string using the Azure Storage Explorer.
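A typical query looks something like this (the storage account and query parameter names shown here are illustrative; check Get-Help Get-AzureTableEntity for the exact names):

    # Query a partition; -GetAll pages through large result sets and
    # -ConvertDateTimeFields converts datetime strings back into [datetime] values
    Get-AzureTableEntity -StorageAccountName '<storage account>' `
        -StorageAccountAccessKey '<access key>' `
        -TableName 'MyTable' `
        -QueryString "PartitionKey eq 'Dept1'" `
        -GetAll $true `
        -ConvertDateTimeFields $true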

New-AzureTableEntity

This function can be used to insert a single entity or bulk insert up to 100 entities (and the total payload size is less than 4MB).

Please make sure both “PartitionKey” and “RowKey” are included in the entity. The data type for these fields must be string.

For example, instead of setting RowKey = 1, you should set RowKey = “1” – because the value of both PartitionKey and RowKey must be a string.
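For example (parameter names other than the keys mentioned above are illustrative):

    # Insert two entities in one call; PartitionKey and RowKey must be strings
    New-AzureTableEntity -StorageAccountName '<storage account>' `
        -StorageAccountAccessKey '<access key>' `
        -TableName 'MyTable' `
        -Entities @(
            @{ PartitionKey = 'Dept1'; RowKey = '1'; Name = 'John'; Enabled = $true },
            @{ PartitionKey = 'Dept1'; RowKey = '2'; Name = 'Jane'; Enabled = $false }
        )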

Update-AzureTableEntity

This function can be used to update a single entity or bulk update up to 100 entities (and the total payload size is less than 4MB).

Please note when updating an entity, all fields (including the fields that do not need to be updated) must be specified. It is actually a merge operation. If you are modifying an existing entity returned from the search operation (Get-AzureTableEntity) and the entity contains datetime fields, please make sure you set “-ConvertDateTimeFields” parameter to $true when performing the search in the first place. Please also be aware that the built-in Timestamp field must not be included in the entity fields.

Remove-AzureTableEntity

This function can be used to remove a single entity or bulk remove up to 100 entities (and the total payload size is less than 4MB).

Support for Azure Automation and SMA

To simplify leveraging this module in Azure Automation or SMA, I have included a connection object in the module:

Once you have created the connection objects, instead of specifying the storage account, table name and storage account access key, you can simply specify the connection object using the ‘-TableConnection’ parameter for all four functions.

Sample Code

I have published some sample code I wrote when developing this module to GitHub Gist:

Summary

I wrote this module so I could simplify my Azure Automation runbooks and make IT Pros' lives easier when working with Azure Table storage. If you have to deal with Azure Table storage, I hope you find this module useful. If you are a developer looking for code samples, you can still use this module and simply translate the code to the language of your choice.

I purposely didn’t include any functions for managing the Azure table storage itself because you can manage the Table storage using the Azure.Storage module.

Lastly, feedback is always welcome, so please drop me an email if you have any.

Feeding Your Power BI Reports from Azure Functions

Written by Tao Yang

Background

A few days ago my good friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) had a requirement to produce a Power BI dashboard for Azure AD users, so Alex and I started discussing ways to produce such a report in Power BI. After exploring various possibilities, we decided to leverage Azure Functions to feed data into Power BI. You can check out the Power BI solution Alex has built on his blog here: http://www.mscloud.be/retrieve-azure-aad-user-information-with-azure-functions-and-publish-it-into-powerbi

In this blog post, I'm not going into the details of how the AAD Users Power BI report was built. Instead, I will focus on the Azure Functions component and briefly demonstrate how to build an Azure Functions web service that acts as a Power BI data source. As an example for this post, I'll build an Azure Functions web service in PowerShell that brings Azure VM information into Power BI. To set the stage, I have already written two blog posts yesterday on Azure Functions:

These two posts demonstrated two important steps that we need to prepare for the Azure Functions PowerShell code. We will need to follow these posts and prepare the following:

  • Upload the latest AzureRM.Profile and AzureRM.Compute PowerShell modules to Azure Functions
  • Encrypt the password for the service account to be used to access the Azure subscription.

Once done, we need to update the user name and the encrypted password in the code below (line 24 and 25)

I have configured the function authorization level to “Function”, which means I need to pass an API key when invoking the function. I also need to pass the Azure subscription Id via the URL. To test, I'm using the Invoke-WebRequest cmdlet to see if I can retrieve the Azure VM information:
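The test call looks something like this (the function app name, function key, subscription id and even the function name GetAzureVMs are placeholders):

    $uri = 'https://myfunctionapp.azurewebsites.net/api/GetAzureVMs?code=<function key>&SubscriptionId=<subscription id>'
    $request = Invoke-WebRequest -Uri $uri -UseBasicParsing
    $request.StatusCode   # expecting 200
    $request.Content      # the HTML table containing the Azure VM information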

As you can see, the response content is HTML output containing a table of the Azure VM information.

Now that I’ve confirmed the function is working, all I need to do is to use Power BI to get the data from the web.

Note: I'm not going to go too deep into Power BI in this post; I will only demonstrate how to do this in Power BI Desktop. However, Alex's post covers how to configure such reports in Power BI Online and how to ensure the data is always up to date by leveraging the On-Premises Data Gateway component. So please make sure you also read Alex's post when you are done with this one.

In Power BI Desktop, simply enter the URL with the basic setting:

and choose “Table 0”:

Once imported, you can see that all the properties I've defined in the Azure Functions PowerShell script have been imported into the dataset:

and I’ve used a table visual in the Power BI report and listed all the fields from the dataset:

Since the purpose of this post is only to demonstrate how to use Azure Functions as the data source for Power BI, I am only going to show how to get the data into Power BI. Creating fancy reports and dashboards for Azure VM data is not what I intend to cover.

Now that the data is available in Power BI, you can be creative and design fancy reports using different Power BI visuals.

Note: The method described in this post may not work when you want to refresh your data after publishing your report to Power BI Online. You may need to use this C# wrapper function: http://blog.tyang.org/2016/10/13/making-powershell-based-azure-functions-to-produce-html-outputs/. Alex has got this part covered in his post.

Lastly, make sure you check out Alex's post on how he created the AAD Users report using this method. As I mentioned, he has also covered two important aspects – how to make this report available online (so you can share it with other people) and how to make sure your data is always up to date by using the on-premises data gateway.

Making PowerShell Based Azure Functions to Produce HTML Outputs

Written by Tao Yang

Over the last few weeks, I've been working with my MVP buddy Alex Verkinderen (@AlexVerkinderen) on some Azure Functions related stuff. We have both written a few PowerShell-based functions that output an HTML page.

These functions use the ConvertTo-Html cmdlet to produce the HTML output. For example, here's a simple one that lists 2 cars in an HTML table:
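The sample function itself is not reproduced above; a minimal sketch of it (the car values are made up) looks like this:

    # PowerShell HTTP-triggered function: build two car objects, convert them to an HTML
    # table, and write the result to the output file ($res) that the PowerShell Functions
    # runtime expects.
    $cars = @(
        [pscustomobject]@{ Make = 'Toyota'; Model = 'Corolla'; Year = 2015 },
        [pscustomobject]@{ Make = 'Mazda';  Model = 'CX-5';    Year = 2016 }
    )
    $html = $cars | ConvertTo-Html -Title 'Cars' | Out-String
    Out-File -Encoding ascii -FilePath $res -InputObject $html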

Today we ran into an issue while preparing for our next blog posts. After some diagnostics, we realised the issue was caused by the HTML output returned from the PowerShell-based functions.

If I use the Invoke-WebRequest cmdlet in PowerShell to trigger this PowerShell function, I am able to get the HTML output in the response content and everything looks good:

However, if we simply invoke this function from a browser, although the output is in HTML format, the browser does not display the HTML page; it displays the HTML source code instead:

After some research, we found the cause of this issue – the content type returned by the PowerShell function is always set to “text/plain”:

I suspect this is because, for PowerShell-based functions, we have to write the output to a file (the $res variable by default). I tried to construct a proper HTTP response message (System.Net.Http.HttpResponseMessage), but it didn't work in the PowerShell functions. Based on my testing, it seems PowerShell functions cannot handle complex types.

Luckily I found this post and it pointed me in the right direction: http://anthonychu.ca/post/azure-functions-serve-html/. According to this post, we can certainly serve a proper HTML page from C#-based functions.

I don't really want to rewrite all my PowerShell functions in C#, not only because I don't want to reinvent the wheel, but also because I want to keep using the PowerShell modules in those existing functions. In the end, I came up with a C#-based “wrapper” function. I named this function HTTPTriggerProxy.

This C#-based HTTPTriggerProxy function simply takes the URL you have specified, gets the response and wraps it in a proper HttpResponseMessage object. All you need to do is specify the original URL that you want to request in the “RequestURL” parameter as part of the wrapper function URL:

https://<Your Azure Function Account>.azurewebsites.net/api/HttpTriggerProxy?code=<Access code for Http Trigger Proxy function>&RequestURL=<Your original request URL>.

Now if I use this wrapper to invoke the sample GetCars PowerShell function, the HTML page is displayed in the browser as expected:

and you can see the content type is now set as “text/html”:

Note:

  • This wrapper function only supports the GET HTTP method. The POST method is not supported, so you can only pass the RequestURL in the wrapper URL (as opposed to placing it in the request body). I didn't bother to cater for the POST method in this function because what we are going to use this for only supports the HTTP GET method.
  • If your original request requires authentication, then this is not going to work for you.
  • If your original URL contains the ampersand character (“&”), please replace it with “%26”. For example, if your original request is https://myazurefunction.azurewebsites.net/api/GetCars?code=rgpxmm0p87fh2z1wd0a6vargfxxogb6cf&colour=red, then you need to change it to https://myazurefunction.azurewebsites.net/api/GetCars?code=rgpxmm0p87fh2z1wd0a6vargfxxogb6cf%26colour=red

Lastly, this is just something we came up with today while working on another set of posts. Please stay tuned – our new posts will be published in the next day or two.

Securing Passwords in Azure Functions

Written by Tao Yang

09/10/2016 – Note: This post has been updated as per David O’Brien’s suggestion .

As I mentioned in my last post, I started playing with Azure Functions a few weeks ago and I've already built a few pretty cool solutions. One thing that I've spent a lot of time researching is how to secure credentials in Azure Functions.

Obviously, Azure Key Vault would be an ideal candidate for storing credentials for Azure services. If I were using another automation product that I'm quite familiar with – Azure Automation – I'd certainly go down the Key Vault path, because the Azure Automation account already creates a Service Principal for logging in to Azure and we can simply grant that Azure AD application access to the Key Vault. However, and please do point me in the right direction if I'm wrong, I don't think there is an easy way to access Key Vault from Azure Functions at this stage.

I came across two feature requests on GitHub and UserVoice suggesting a way to access Key Vault from Azure Functions, so I hope this capability will be added at a later stage. But for now, I've come up with a simple way to encrypt the password in the Azure Functions code so it is not stored in clear text. I purposely want to keep the solution as simple as possible because one of the big advantages of using Azure Functions is being really quick; therefore I believe the less code I have to write, the better. I'll use a PowerShell example to explain what I have done.

I needed to write a function to retrieve Azure VMs from a subscription – I'll blog the complete solution next time. Sticking with the language that I know best, I'm using PowerShell. I have already explained how to use custom PowerShell modules in my last post. In order to retrieve the Azure VM information, we need two modules:

  • AzureRM.Profile
  • AzureRM.Compute

I use the method explained in the previous post and uploaded the two modules to the function folder. Obviously, I also need to use a credential to sign in to my Azure subscription before retrieving the Azure VM information.

I'm using a key (a byte array) to encrypt the password secure string. If you are not familiar with this practice, I found a very detailed 2-part blog post on this topic; you can read it here:

Secure Password With PowerShell: Encrypting Credentials – Part 1

Secure Password With PowerShell: Encrypting Credentials – Part 2

So first, I'll need to create a key and store its content in a file:
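A minimal way to do this (the key length and file path are just examples; C:\Temp is used for illustration only, as noted at the end of this post) is:

    # Generate a random 32-byte (256-bit) AES key and save it to a file
    $keyFilePath = 'C:\Temp\PassEncryptKey.key'
    $aesKey = New-Object Byte[] 32
    $rng = New-Object -TypeName System.Security.Cryptography.RNGCryptoServiceProvider
    $rng.GetBytes($aesKey)
    $aesKey | Out-File -FilePath $keyFilePath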

I then uploaded the key to the Azure Functions folder. I had already uploaded the PowerShell modules to the “bin” folder, so I created a sub-folder under “bin” called Keys:

I wrote a little PowerShell function (that runs on my PC, where a copy of the key file is stored) to encrypt the password.

PowerShell function Get-EncryptedPassword:
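The function itself is not reproduced in this archive; a minimal hypothetical version would look like this:

    # Sketch of a helper that encrypts a password with the AES key file and copies the
    # resulting string to the clipboard so it can be pasted into the app settings.
    Function Get-EncryptedPassword
    {
        Param (
            [string]$KeyFilePath = 'C:\Temp\PassEncryptKey.key'   # path to the key created earlier
        )
        $aesKey = Get-Content -Path $KeyFilePath
        $securePassword = Read-Host -Prompt 'Enter the password to encrypt' -AsSecureString
        $encrypted = ConvertFrom-SecureString -SecureString $securePassword -Key $aesKey
        $encrypted | clip.exe
        $encrypted
    }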

I call this function to encrypt the password and copy the encrypted string to the clipboard:

I then created two app settings in Azure Functions Application settings:

  • AzureCredUserName
  • AzureCredPassword

The AzureCredUserName has the value of the user name of the service account and AzureCredPassword is the encrypted string that we prepared in the previous step.

I then paste the encrypted password string to my Azure Functions code (line 24):

The app settings are exposed to the Azure functions as environment variables, so we can reference them in the script as $env:AzureCredUserName and $env:AzureCredPassword (line 23 and 24)

As shown above, to decrypt the password from the encrypted string into a SecureString, the PowerShell code reads the content of the key file and uses it as the key to convert the encrypted password to a SecureString (lines 26-27). After the password has been converted to a SecureString, we can then create a PSCredential object and use it to log in to Azure (lines 28-29).
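Those lines are not reproduced above; a hypothetical sketch of them (the key file path is a placeholder that you would find via the Kudu console, as the note below explains) looks like this:

    # Read the service account name and the encrypted password from the app settings
    $userName          = $env:AzureCredUserName
    $encryptedPassword = $env:AzureCredPassword

    # Decrypt the password using the key file uploaded to the function's bin\Keys folder
    $aesKey = Get-Content -Path 'D:\home\site\wwwroot\<function name>\bin\Keys\PassEncryptKey.key'
    $securePassword = ConvertTo-SecureString -String $encryptedPassword -Key $aesKey

    # Build the credential object and sign in to Azure
    $cred = New-Object System.Management.Automation.PSCredential ($userName, $securePassword)
    Add-AzureRmAccount -Credential $cred | Out-Null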

Note: In my last post, I explained how to use the Kudu console to find the absolute path of a file; in this case, the file path of the key file is specified on line 26.

Needless to say, the key file you’ve created must be stored securely. For example, I’m using KeePass to store my passwords, and I’m storing this file in KeePass. Do not leave it in an unsecured location (such as C:\temp as I demonstrated in this example).

Also, since the app settings apply to all functions in your Azure Functions account, you may consider using different encryption keys in different functions if you want to limit which functions can access a particular encrypted password.

Lastly, as I stated earlier, I wanted to keep the solution as simple as possible. If you know better ways to secure passwords, please do contact me and I’d like to learn from you.