Author Archives: Tao Yang

My Meetup Recording–Developing Your OWN OMS Solutions

Written by Tao Yang

Last month, I presented at the Melbourne Microsoft Cloud and Datacenter Meetup on the topic “Developing Your OWN OMS Solutions”. I recorded the session but then realised the recording had some technical errors caused by changing the screen resolution without restarting Camtasia. This morning, I re-recorded the session and uploaded it to the Meetup’s YouTube channel.

If you are interested, you can watch the recording here.

And you can also download the slide deck HERE.

Feeding Your Power BI Reports from Azure Functions

Written by Tao Yang


A few days ago, my good friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) had a requirement to produce a Power BI dashboard for Azure AD users, so Alex and I started discussing ways to produce such a report in Power BI. After exploring various possibilities, we decided to leverage Azure Functions to feed data into Power BI. You can check out the Power BI solution Alex has built on his blog here.

In this blog post, I’m not going into the details of how the AAD Users Power BI report was built. Instead, I will focus on the Azure Functions component and briefly demonstrate how to build an Azure Functions web service that acts as a Power BI data source. As an example for this post, I’ll build an Azure Functions web service in PowerShell that brings Azure VM information into Power BI. To set the stage, I wrote two blog posts yesterday on Azure Functions:

These two posts demonstrate two important preparation steps for the Azure Functions PowerShell code. Following them, we need to prepare the following:

  • Upload the latest AzureRM.Profile and AzureRM.Compute PowerShell modules to Azure Functions
  • Encrypt the password for the service account to be used to access the Azure subscription.

Once done, we need to update the user name and the encrypted password in the code below (lines 24 and 25).
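The original code isn’t embedded here, but a rough sketch of the shape of that run.ps1 is shown below (module paths, the key file name and the exact line numbers are hypothetical – refer to the two posts above for the credential preparation):

# Not the original code – a sketch of a run.ps1 that returns Azure VM information as HTML
Import-Module 'D:\home\site\wwwroot\<Function Name>\bin\AzureRM.Profile\<version>\AzureRM.Profile.psd1'
Import-Module 'D:\home\site\wwwroot\<Function Name>\bin\AzureRM.Compute\<version>\AzureRM.Compute.psd1'

# Service account user name and encrypted password (the values referred to above)
$UserName          = '<service account user name>'
$EncryptedPassword = '<encrypted password string>'

# Decrypt the password using the key file uploaded to the function's bin\Keys folder
$Key            = Get-Content -Path 'D:\home\site\wwwroot\<Function Name>\bin\Keys\PassEncryptKey.key'
$SecurePassword = ConvertTo-SecureString -String $EncryptedPassword -Key $Key
$Cred           = New-Object System.Management.Automation.PSCredential ($UserName, $SecurePassword)

# Log in and retrieve the VMs from the subscription Id passed in via the URL
Add-AzureRmAccount -Credential $Cred -SubscriptionId '<subscription Id from the request>' | Out-Null
$HTML = Get-AzureRmVM |
    Select-Object Name, ResourceGroupName, Location |
    ConvertTo-Html -Title 'Azure VMs'

# PowerShell functions return their HTTP response by writing to the file path in $res
Out-File -Encoding ascii -FilePath $res -InputObject ($HTML -join "`n")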

I have configured the function authorization level to “Function”, which means I need to pass an API key when invoking the function. I also need to pass the Azure subscription Id via the URL. To test, I’m using the Invoke-WebRequest cmdlet to see if I can retrieve the Azure VM information:
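The test call looks something like this (the function name “GetAzureVMs”, the function app name and the key are placeholders):

$FunctionKey    = '<function API key>'
$SubscriptionId = '<Azure subscription Id>'
$URI = "https://<function app name>.azurewebsites.net/api/GetAzureVMs?code=$FunctionKey&SubscriptionId=$SubscriptionId"
$Request = Invoke-WebRequest -Uri $URI -Method Get -UseBasicParsing
$Request.Content   # the HTML produced by the function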

As you can see, the response content is an HTML page containing a table of the Azure VM information.


Now that I’ve confirmed the function is working, all I need to do is use Power BI to get the data from the web.

Note: I’m not going too deep into Power BI in this post, so I will only demonstrate how to do this in Power BI Desktop. However, Alex’s post covers how to configure such reports in Power BI Online and ensure the data is always up-to-date by leveraging the On-Prem Data Gateway component. So, please make sure you also read Alex’s post when you are done with this one.


In Power BI Desktop, simply enter the URL with the basic setting:


and choose “Table 0”:


Once imported, you can see that all the properties I’ve defined in the Azure Functions PowerShell script have been imported into the dataset:


and I’ve used a table visual in the Power BI report to list all the fields from the dataset:


Since the purpose of this post is only to demonstrate how to use Azure Functions as a data source for Power BI, I’ll stop at getting the data into Power BI. Creating fancy reports and dashboards for Azure VM data is not what I intend to cover.

Now that the data is available in Power BI, you can be creative and design fancy reports using different Power BI visuals.

Note: The method described in this post may not work when you want to refresh your data after publishing your report to Power BI Online. You may need to use the C# wrapper function (HTTPTriggerProxy) described in my post on making PowerShell-based Azure Functions produce HTML output; Alex has got this part covered in his post.

Lastly, make sure you check out Alex’s post on how he created the AAD Users report using this method. As I mentioned, he has also covered two important aspects – how to make this report available online (so you can share it with other people) and how to make sure your data is always up to date by using the On-Prem Data Gateway.

Making PowerShell Based Azure Functions to Produce HTML Outputs

Written by Tao Yang

Over the last few weeks, I’ve been working with my MVP buddy Alex Verkinderen (@AlexVerkinderen) on some Azure Functions related stuff. We have both written a few PowerShell-based functions that output an HTML page.

These functions use the ConvertTo-Html cmdlet to produce the HTML output. For example, here’s a simple one that lists 2 cars in an HTML table:
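The original snippet isn’t embedded here; a minimal sketch of such a run.ps1 might look like this (the car data is made up):

# run.ps1 – return two cars as an HTML table
$Cars = @(
    [PSCustomObject]@{ Make = 'Toyota'; Model = 'Corolla' }
    [PSCustomObject]@{ Make = 'Mazda';  Model = 'CX-5' }
)
$HTML = $Cars | ConvertTo-Html -Property Make, Model -Title 'Cars'
# PowerShell functions write their HTTP response to the file path in $res
Out-File -Encoding ascii -FilePath $res -InputObject ($HTML -join "`n")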

Today we ran into an issue while preparing our next blog posts. After some diagnostics, we realised the issue was caused by the HTML output returned from the PowerShell-based functions.

If I use the Invoke-WebRequest cmdlet in PowerShell to trigger this function, I get the HTML output in the response content and everything looks good:


However, if we simply invoke this function from a browser, although the output is in HTML format, the browser does not render the HTML page; it displays the HTML source code instead:


After some research, we found the cause of this issue – the content type returned by the PowerShell function is always set to “text/plain”:


I suspect this is because PowerShell-based functions have to write their output to a file (the path in the $res variable by default). I tried to construct a proper HTTP response message (System.Net.Http.HttpResponseMessage), but it didn’t work in the PowerShell functions. Based on my testing, it seems PowerShell functions cannot handle complex types.

Luckily, I found a post that pointed me in the right direction: according to it, we can certainly serve out a proper HTML page in C#-based functions.

I don’t really want to rewrite all my PowerShell functions in C#, not only because I don’t want to reinvent the wheel, but also because I want to keep using the PowerShell modules in those existing functions. In the end, I came up with a C#-based “wrapper” function. I named this function HTTPTriggerProxy:

This C#-based HTTPTriggerProxy function simply takes the URL you have specified, gets the response and wraps it in a proper HttpResponseMessage object. All you need to do is specify the original URL that you want to request in the “RequestURL” parameter as part of the wrapper function URL:

https://<Your Azure Function App>.azurewebsites.net/api/HTTPTriggerProxy?code=<Access code for the HTTPTriggerProxy function>&RequestURL=<Your original request URL>
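For example, building and opening the wrapper URL from PowerShell might look like this (the function app name, function names and keys are placeholders):

$OriginalURL = 'https://<function app name>.azurewebsites.net/api/GetCars?code=<GetCars function key>'
# '&' in the original URL must be URL-encoded as %26 (see the notes below)
$EncodedURL  = $OriginalURL -replace '&', '%26'
$ProxyURL    = "https://<function app name>.azurewebsites.net/api/HTTPTriggerProxy?code=<HTTPTriggerProxy function key>&RequestURL=$EncodedURL"
Start-Process $ProxyURL   # opens the page in the default browser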

Now if I use this wrapper to invoke the sample GetCars PowerShell function, the HTML page is displayed in the browser as expected:


and you can see the content type is now set as “text/html”:



A few things to note about this wrapper function:

  • It only supports the GET HTTP method. The POST method is not supported, so you can only pass the RequestURL in the wrapper URL (as opposed to placing it in the request body). I didn’t bother to cater for the POST method in this function because what we are going to use this for only supports HTTP GET.
  • If your original request requires authentication, then this wrapper is not going to work for you.
  • If your original URL contains the ampersand character (“&”), replace it with its URL-encoded form “%26” so it is not treated as a separator in the wrapper’s own query string.

Lastly, this is just something we came up with today while working on another set of posts. Please stay tuned – our new posts will be published in the next day or two.

Securing Passwords in Azure Functions

Written by Tao Yang

09/10/2016 – Note: This post has been updated as per David O’Brien’s suggestion.

As I mentioned in my last post, I started playing with Azure Functions a few weeks ago and I’ve already built a few pretty cool solutions. One thing that I’ve spent a lot of time researching is how to secure credentials in Azure Functions.

Obviously, Azure Key Vault would be an ideal candidate for storing credentials for Azure services. If I were using another automation product that I’m quite familiar with – Azure Automation – I’d certainly go down the Key Vault path, because an Azure Automation account already creates a Service Principal for logging into Azure and we can simply grant that Azure AD application access to the Key Vault. However – and please do point me in the right direction if I’m wrong – I don’t think there is an easy way to access the Key Vault from Azure Functions at this stage.

I came across two feature requests on both GitHub and UserVoice suggesting a way to access Key Vault from Azure Functions, so I hope this capability will be added at a later stage. For now, I’ve come up with a simple way to encrypt the password in the Azure Functions code so it is not stored in clear text. I purposely want to keep the solution as simple as possible, because one of the big advantages of Azure Functions is being able to build things really quickly, so I believe the less code I have to write, the better. I’ll use a PowerShell example to explain what I have done.

I needed to write a function to retrieve Azure VMs from a subscription – I’ll blog the complete solution next time. Sticking with the language I know best, I’m using PowerShell. I have already explained how to use custom PowerShell modules in my last post. In order to retrieve the Azure VM information, we need two modules:

  • AzureRM.Profile
  • AzureRM.Compute

I used the method explained in the previous post to upload the two modules to the function folder. Obviously, I also need a credential to sign in to my Azure subscription before retrieving the Azure VM information.

I’m using a key (a byte array) to encrypt the password secure string. If you are not familiar with this practice, I found a very detailed 2-part blog post on this topic; you can read it here:

Secure Password With PowerShell: Encrypting Credentials – Part 1

Secure Password With PowerShell: Encrypting Credentials – Part 2

So firstly, I need to create a key and store its content in a file:
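For example, based on the approach in those posts (the key length and the file path are just what I used here):

# Generate a random 32-byte (256-bit) key and save it to a file
$Key = New-Object 'System.Byte[]' 32
[System.Security.Cryptography.RNGCryptoServiceProvider]::Create().GetBytes($Key)
$Key | Out-File -FilePath 'C:\Temp\PassEncryptKey.key'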

I then uploaded the key file to the Azure Functions folder. Since I had already uploaded the PowerShell modules to the “bin” folder, I created a sub-folder under “bin” called “Keys”:


I wrote a little PowerShell function (that runs on my PC, where a copy of the key file is stored) to encrypt the password.

PowerShell function Get-EncryptedPassword:
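The original function isn’t embedded here; a minimal sketch of what it does (the parameter names are my assumptions):

Function Get-EncryptedPassword
{
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory = $true)][String]$KeyFilePath,
        [Parameter(Mandatory = $true)][SecureString]$Password
    )
    # Read the key created earlier and use it to produce an encrypted standard string
    $Key = Get-Content -Path $KeyFilePath
    ConvertFrom-SecureString -SecureString $Password -Key $Key
}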

I call this function to encrypt the password and copy the encrypted string to the clipboard:
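Something along these lines (again, the key file path is just the example location):

$SecurePassword = Read-Host -Prompt 'Enter the password' -AsSecureString
Get-EncryptedPassword -KeyFilePath 'C:\Temp\PassEncryptKey.key' -Password $SecurePassword | clip.exe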


I then created two app settings in Azure Functions Application settings:

  • AzureCredUserName
  • AzureCredPassword

AzureCredUserName holds the user name of the service account, and AzureCredPassword holds the encrypted string we prepared in the previous step.


I then paste the encrypted password string to my Azure Functions code (line 24):

The app settings are exposed to Azure Functions as environment variables, so we can reference them in the script as $env:AzureCredUserName and $env:AzureCredPassword (lines 23 and 24).


As shown above, to decrypt the password from the encrypted string to a SecureString, the PowerShell code reads the content of the key file and uses it as the key to convert the encrypted password to a SecureString (lines 26-27). After the password has been converted to a SecureString, we can create a PSCredential object and use it to log in to Azure (lines 28-29).

Note: If you read my last post, I explained how to use the Kudu console to find the absolute path of a file; in this case, the path of the key file is specified on line 26.
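Putting those pieces together, the relevant part of the function’s run.ps1 looks roughly like this (a sketch only – the key file name and the exact line numbers in the original code will differ):

# Read the credential app settings (exposed as environment variables)
$UserName          = $env:AzureCredUserName
$EncryptedPassword = $env:AzureCredPassword

# Decrypt the password using the key file uploaded to bin\Keys
$Key            = Get-Content -Path 'D:\home\site\wwwroot\<Function Name>\bin\Keys\PassEncryptKey.key'
$SecurePassword = ConvertTo-SecureString -String $EncryptedPassword -Key $Key

# Build the credential and log in to Azure
$Cred = New-Object System.Management.Automation.PSCredential ($UserName, $SecurePassword)
Add-AzureRmAccount -Credential $Cred | Out-Null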

Needless to say, the key file you’ve created must be stored securely. For example, I use KeePass to store my passwords, and I’m storing this file in KeePass as well. Do not leave it in an unsecured location (such as C:\temp, as I did in this example for demonstration purposes).

Also, since the app settings apply to all functions in your Azure Functions account, you may consider using different encryption keys in different functions if you want to limit which function can access a particular encrypted password.

Lastly, as I stated earlier, I wanted to keep the solution as simple as possible. If you know better ways to secure passwords, please do contact me – I’d love to learn from you.

Using Custom PowerShell Modules in Azure Functions

Written by Tao Yang

Like many fellow MVPs, I have started playing with Azure Functions over the last few weeks. Although Azure Functions is primarily designed for developers and supports languages such as C#, Node.js and PHP, PowerShell support is currently in preview. This opens up a lot of opportunities for IT Pros. My friend and fellow CDM MVP David O’Brien has written some really good posts on PowerShell in Azure Functions. Although the PowerShell runtime in Azure Functions comes with a lot of Azure PowerShell modules by default (refer to David’s post for details), these modules are outdated, and sometimes we need to leverage other custom modules that are not shipped by default.

While I was trying to figure out a way to import custom modules into my PowerShell Azure Functions, I came across a post showing how to upload 3rd-party assemblies for C# functions. Basically, to add assemblies for C#, you need to create a folder called “bin” under your function root folder and upload the DLL to the newly created folder using an FTP client. I thought I’d give this a try for PowerShell modules, and guess what? It worked! I’ll use one of my frequently used modules, Gac, as an example in this post and work through the process of preparing the module and using it in my PowerShell code.

01. Firstly, download the Gac module from the PowerShell Gallery.

02. Make sure the Azure Functions App Service has the deployment credential configured


03. FTP to the App Service using the deployment credential configured in the previous step, create a “bin” folder under the Azure Functions folder (“/site/wwwroot/<Azure Functions Name>”) and upload the module folder:


04. In Azure Functions, launch the Kudu console


05. Identify the PowerShell module file system path in Kudu. The path is D:\home\site\wwwroot\<Azure Function Name>\bin\<PS module name>\<PS module version>


06. By default, the PowerShell runtime is configured to run on a 32-bit platform. If the custom module requires a 64-bit platform, you will need to go to the app settings and set the Platform to 64-bit.



Now that the module is uploaded, and because it is not located in a folder listed in the PSModulePath environment variable, we have to explicitly import the module manifest (.psd1 file) before using it. For example, I have created a function with only 2 lines of code, as shown below:
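A sketch of those two lines (substitute the module version you uploaded):

Import-Module 'D:\home\site\wwwroot\<Azure Function Name>\bin\Gac\<module version>\Gac.psd1'
Get-GacAssembly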

The “Get-GacAssembly” cmdlet comes from the Gac PowerShell module. As the name suggests, it lists all the assemblies located in the GAC (Global Assembly Cache). When I call the HTTP trigger function using Invoke-WebRequest, you’ll see the assemblies listed in the logs window:



I have also tested stopping and restarting the Azure App Service, and based on my tests the module files stayed in their original location after the restart.

This concludes my topic for today. I have a few other really cool blog posts in the pipeline for using PowerShell in Azure Functions, so stay tuned.

Squared Up Upcoming V3 Dashboard with Distributed Application Discovery Feature

Written by Tao Yang

Squared Up is set to release version 3 of their dashboard next week at Ignite North America. One of the key features in the v3 release is called “Visual Application Discovery & Analysis” (aka VADA).

VADA utilises OpsMgr agent tasks and the netstat.exe command to discover the other TCP/IP endpoints the agents are communicating with. You can learn more about this feature from a short YouTube video Squared Up published recently:

I was given a trial copy of v3 for my lab. After I installed it and imported the required management pack, I was able to start discovering the endpoints that communicate with my OpsMgr agents in a matter of a few clicks:


As we all know, OpsMgr natively lacks the capability of automatic Distributed Application discovery, so customers used to integrate 3rd-party products such as BlueStripe FactFinder with OpsMgr for this capability. However, now that BlueStripe has been acquired by Microsoft and is being fitted under the OMS banner as the Application Dependency Monitor (ADM) solution, customers can no longer purchase it for OpsMgr. It is good to see that Squared Up has released something with similar capabilities, because at this very moment there seems to be a gap in the OpsMgr space.

Having said that, I don’t think the OMS ADM solution is too far away from the public preview release.


One of the biggest differences I can see (after spending a couple of hours with Squared Up v3) is that Squared Up VADA collects ad-hoc data at the time VADA is launched (which triggers the agent task), whereas OMS ADM has its own agents and collects data continuously.


Additionally, it looks like Squared Up VADA only supports Windows agents at this stage, whereas OMS ADM will also support Linux agents.

At this stage, since we don’t know if BlueStripe will be made available to OpsMgr in the future, and Squared Up is releasing this awesome addition to their already-popular OpsMgr web console / dashboard product, why not give it a try and see what you can produce? Since the data collection is ad-hoc, I guess it makes more sense to start the discovery in VADA during peak hours, when the system is fully loaded and the components are actively communicating with each other, so you don’t miss any components.

Lastly, if you are going to attend Ignite NA next week and want to learn more about this new feature in Squared Up v3, make sure you visit them at their booth.

Pushing PowerShell Modules From PowerShell Gallery to Your MyGet Feeds Directly

Written by Tao Yang


Recently I have started using a private MyGet feed and my cPowerShellPackageManagement DSC Resource module to manage PowerShell modules on my lab servers.

When new modules are released in the PowerShell Gallery (e.g. the Azure modules), I’d normally use Install-Module to install them on test machines, then publish the tested modules to my MyGet feed, and then my servers would pick up the new modules.

Although I can use the Publish-Module cmdlet to upload a module located locally on my PC to the MyGet feed, it can be really time consuming when the module size is big (e.g. some of the Azure modules). It only took me a few minutes to figure out how to push modules directly from the PowerShell Gallery (or any NuGet feed) to my MyGet feed.

To configure it, under the MyGet feed, go to “Package Sources” and click “Add package source…”


Then choose NuGet feed and fill out the name and source:

Name: PowerShellGallery
Source: https://www.powershellgallery.com/api/v2/



Once added, I can search PowerShell Gallery and add packages directly to MyGet.



Scripting Azure Automation Module Imports Directly from MyGet or PowerShell Gallery

Written by Tao Yang

There are a few ways to add PowerShell modules to an Azure Automation account:

1. Via the Azure Portal, by uploading the module zip file from your local computer.


2. If the module is located in the PowerShell Gallery, you can push it to your Automation account directly from the PowerShell Gallery.


3. Use the New-AzureRmAutomationModule cmdlet from the AzureRM.Automation PowerShell module.

One limitation of the New-AzureRmAutomationModule cmdlet is that the module must be zipped and located somewhere online that Azure can access. You specify the location using the -ContentLink parameter. In the past, in order to script module deployment, even when the module was located in the PowerShell Gallery, I had to save the module to a place my Automation account could access (such as an Azure blob storage account, or a release in a public GitHub repo).

Tonight, I was writing a script and wanted to see if I could deploy modules to my Automation account directly from a package repository of my choice – besides the PowerShell Gallery, I also have a private MyGet feed that I use for storing my PowerShell modules.

It turned out to be really easy to do – it only took me a few minutes to figure out how. I’ll use a module I wrote in the past called “SendEmail” as an example. It is published in both the PowerShell Gallery and my private MyGet feed.

Importing from PowerShell Gallery

The URL for this module in the PowerShell Gallery is:

The -ContentLink URI that we need to pass to the New-AzureRmAutomationModule cmdlet would be:

As you can see, all you need to do is insert “api/v2/” into the URI. The PowerShell command would be something like this:
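A sketch of that command (the account and resource group names are placeholders; substitute the module version you want to import):

New-AzureRmAutomationModule -AutomationAccountName '<automation account>' `
    -ResourceGroupName '<resource group>' `
    -Name 'SendEmail' `
    -ContentLink 'https://www.powershellgallery.com/api/v2/package/SendEmail/<module version>'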

Importing from a private MyGet feed

For a private MyGet feed, you can access it by embedding the API key into the URL:


The URL for my module would be: https://www.myget.org/F/<Your MyGet feed name>/auth/<MyGet API Key>/api/v2/package/<Module Name>/<Module Version>

e.g. for my SendEmail module, the PowerShell command would be something like this:
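A hedged example for the private feed (the feed name, API key and other placeholders need to be substituted with your own values):

New-AzureRmAutomationModule -AutomationAccountName '<automation account>' `
    -ResourceGroupName '<resource group>' `
    -Name 'SendEmail' `
    -ContentLink 'https://www.myget.org/F/<feed name>/auth/<MyGet API key>/api/v2/package/SendEmail/<module version>'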

Importing from a public MyGet feed

If the module is located in a public MyGet feed, then the API key is not required. The URI for the module is very similar to the PowerShell Gallery one; you just need to embed “api/v2/” into the original URI:

https://www.myget.org/F/<MyGet Public Feed Name>/api/v2/package/<Module Name>/<Module Version>

The PowerShell script would be something like this:

PowerShell DSC Resource for Managing Repositories and Modules

Written by Tao Yang


PowerShell version 5 introduced a new feature that allows you to install packages (such as PowerShell modules) from NuGet repositories. If you have used cmdlets such as Find-Module, Install-Module or Uninstall-Module, then you have already taken advantage of this awesome feature.

By default, a Microsoft-owned public repository, the PowerShell Gallery, is configured on all computers running PowerShell version 5, and when you use Find-Module or Install-Module, you are pulling the modules from the PowerShell Gallery.

Ever since I started using PowerShell v5, I’ve run into some challenges managing modules for machines in my environment:

  • Lack of a fully automated way to push modules to a group of computers
  • Module version inconsistency between computers
  • Need of a private repository

Let me elaborate on each of the points listed above.

Lack of a fully automated way to push modules to a group of computers

Back in the old days (pre WMF v5), I used to package PowerShell modules into MSIs and use ConfigMgr to deploy the MSI to target computers. Although it’s not too hard to package a module into an MSI, this method is really time consuming, not to mention it also requires ConfigMgr. In PowerShell v5, I can write a script that utilises PowerShell remoting to push modules to remote machines, but this is still a manual process, and it may not be a viable solution for a large group of computers.

Module version inconsistency between computers

Over time, modules get updated and new modules get released from various sources. I often find module versions become inconsistent among computers, and there is no automated way to update computers when a new version is released.

Need of a private repository

The PowerShell Gallery is public: everything you publish to it is available to the entire world. Organisations often write modules specifically for internal use, and may not want to share them with the rest of the world.

Before I dive into the main topic, I’d like to discuss what I have done to implement private repositories.

Private Repositories

PowerShell PackageManagement uses NuGet repositories. I found the following solutions available:

MyGet is a SaaS (Software as a Service) repository hosted in the cloud. Although you can create your own feeds, private feeds come with a price tag (free accounts only allow you to create public feeds that everyone can access).

ProGet is an on-premises solution. To install it, you will need a web server (and optionally a SQL server) within your network. It comes in free, basic and enterprise editions; the feature comparison is located here.

Since both MyGet and ProGet offer NFR (Not For Resale) licenses to Microsoft MVPs, I have tested both in my lab environment. They both work pretty well. I did not bother to set up a free private NuGet repository (the 3rd option).

These days, I find myself writing more and more PowerShell modules for different projects. During the development phase, I normally use a feed hosted on my ProGet server because it is located in my lab, so it’s faster to publish and download modules. Once a module is ready, I normally publish it to MyGet for general consumption because MyGet is a SaaS-based application, so both my lab machines and Azure IaaS machines have no problem accessing it.

DSC Resource cPowerShellPackageManagement

In order to overcome the other two challenges (automatic module deployment and version inconsistency), I have created a DSC resource called cPowerShellPackageManagement.

According to the DSC naming standard, the first letter ‘c’ indicates it is a community resource, and as the rest of the name suggests, it is used to manage PowerShell packages.

This DSC resource module contains 2 resources:

  • cPowerShellRepository – used to register or unregister specific NuGet feeds on computers running PowerShell v5 and above.
  • cPowerShellModuleManagement – used to install / uninstall modules on computers running PowerShell v5 and above



To register a feed, you will need to specify some basic information such as PublishLocation and SourceLocation. You can also set Ensure = Absent to unregister the feed with the name specified in the Name parameter.

When not specified, the InstallationPolicy field defaults to “Untrusted”. If you’d like to set the repository as a trusted repository, set this value to “Trusted”.

Note: since the repository registration is per user (as opposed to a machine-based setting) and the DSC configuration is executed under the LocalSystem context, you will not be able to see the repository added by this resource if you run the Get-PSRepository cmdlet under your own user account. If you start PowerShell under LocalSystem using PsExec (run psexec /i /s /d powershell.exe), you will be able to see the repository:




The cPowerShellModuleManagement resource takes the following parameters:

  • PSModuleName – PowerShell module name. When this is set to ‘all’, all modules from the specified repository will be installed. So please do not use ‘all’ against the PSGallery!
  • RepositoryName – Name of the repository where module will be installed from. This can be a public repository such as PowerShell Gallery, or your privately owned repository (i.e. your ProGet or MyGet feeds). You can use the cPowerShellRepository resource to configure the repository.
  • PSModuleVersion – This is an optional field. When used, only the specified version will be installed (or uninstalled). If not specified, the latest version of the module in the repository will be used. This field does not impact other versions that are already installed on the computer (i.e. when installing the latest version, earlier versions will not be uninstalled).
  • MaintenanceStartHour, MaintenanceStartMinute and MaintenanceLengthMinute – Since the LCM will run the DSC configuration on a pre-configured interval, you may not want to install / uninstall modules during business hours. Therefore, you can set the maintenance start hour (0-23) and start minute (0-59) to specify the start time of the maintenance window. MaintenanceLengthMinute represents the length of the maintenance window in minutes. These fields are optional, when specified, module installation and uninstallation will only take place when the LCM runs the configuration within the maintenance window. Note: Please make sure the MaintenanceLengthMinute is greater than the value configured for the LCM ConfigurationModeFrequencyMins property.


Sample Configuration

Here are some sample configurations to demonstrate the usage of these DSC resources.

1. Register an on-prem ProGet feed and install all modules from the feed

Using this configuration, I can manage modules at the repository feed level: if I add or update a module in the feed, the DSC LCM on each configured computer will automatically install the newly added (or updated) module the next time the configuration is refreshed.
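A sketch of what such a configuration might look like (the ProGet feed URL and node name are made up):

Configuration InstallAllModulesFromProGet
{
    Import-DscResource -ModuleName cPowerShellPackageManagement

    Node 'SERVER01'
    {
        cPowerShellRepository ProGetFeed
        {
            Name               = 'ProGet'
            SourceLocation     = 'http://proget.mylab.local/nuget/PowerShell/'
            PublishLocation    = 'http://proget.mylab.local/nuget/PowerShell/'
            InstallationPolicy = 'Trusted'
            Ensure             = 'Present'
        }

        cPowerShellModuleManagement AllModulesFromProGet
        {
            PSModuleName   = 'all'
            RepositoryName = 'ProGet'
            DependsOn      = '[cPowerShellRepository]ProGetFeed'
        }
    }
}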

2. Register a feed hosted on MyGet and install several specific modules

In this example, I’ve specified that one module (the Gac module) can be installed at any time, and that another module (the SharePointSDK module) can only be installed or updated within a specific time window, as sketched below.
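A sketch of the second scenario (it assumes the MyGet feed has already been registered, for example via cPowerShellRepository as above; the maintenance window values are examples):

Configuration InstallModulesFromMyGet
{
    Import-DscResource -ModuleName cPowerShellPackageManagement

    Node 'SERVER02'
    {
        # The Gac module can be installed or updated at any time
        cPowerShellModuleManagement GacModule
        {
            PSModuleName   = 'Gac'
            RepositoryName = 'MyGet'
        }

        # The SharePointSDK module is only installed / updated between 22:00 and 23:00
        cPowerShellModuleManagement SharePointSDKModule
        {
            PSModuleName            = 'SharePointSDK'
            RepositoryName          = 'MyGet'
            MaintenanceStartHour    = 22
            MaintenanceStartMinute  = 0
            MaintenanceLengthMinute = 60
        }
    }
}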

Download and Install Locations

This DSC resource module has been published to the PowerShell Gallery:

The project is also located on GitHub:

Special Thanks

I’d like to thank my MVP friends Jakob G Svendsen (@JakobGSvendsen), Pete Zerger (@pzerger), Daniele Grandini (@DanieleGrandini) and James Bannan (@JamesBannan), who provided feedback and helped me test the modules.

PowerShell Module for OMS HTTP Data Collector API

Written by Tao Yang


Earlier today, the OMS product group released the OMS HTTP Data Collector API to public preview. If you haven’t seen the announcement, you can first read the blog post written by Evan Hissey, the PM for this feature.

As a Cloud and Datacenter Management MVP, I’ve had private preview access to this feature for a few months now, and I even developed a solution using this API in a customer engagement with my friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) just over a month ago. I was really impressed with the potential opportunities this feature brings us, and I’ve been spamming Evan’s inbox asking for the release date so I can blog about it and present it at user group meetups.

Since most of us wouldn’t like having to deal with the HTTP headers, bodies, authorization and other overhead we have to put into our code in order to use this API, I have developed a PowerShell module to help us easily utilise it.

Introducing OMSDataInjection PowerShell Module

This module was developed about 2 months ago; I was waiting for the API to become public so I could release it. Now the wait is over and I can finally release it.

This module contains only one public function: New-OMSDataInjection. This function is well documented in a proper help file; you can access it via Get-Help New-OMSDataInjection -Full. I have added 2 examples to the help file too:

————————– EXAMPLE 1 ————————–

PS C:\> $PrimaryKey = Read-Host -Prompt 'Enter the primary key'
$ObjProperties = @{
    Computer = $env:COMPUTERNAME
    Username = $env:USERNAME
    Message  = 'This is a test message injected by the OMSDataInjection module. Input data type: PSObject'
    LogTime  = [Datetime]::UtcNow
}
$OMSDataObject = New-Object -TypeName PSObject -Property $ObjProperties
$InjectData = New-OMSDataInjection -OMSWorkSpaceId '8eb61d08-133c-401a-a45b-0e611194779f' -PrimaryKey $PrimaryKey -LogType 'OMSTestData' -UTCTimeStampField 'LogTime' -OMSDataObject $OMSDataObject

Injecting data using a PS object by specifying the OMS workspace Id and primary key
————————– EXAMPLE 2 ————————–

PS C:\> $OMSConnection = Get-AutomationConnection 'OMSConnection'
$OMSDataJSON = @"
{
    "Username": "administrator",
    "Message": "This is a test message injected by the OMSDataInjection module. Input data type: JSON",
    "LogTime": "Tuesday, 28 June 2016 9:08:15 PM",
    "Computer": "SERVER01"
}
"@
$InjectData = New-OMSDataInjection -OMSConnection $OMSConnection -LogType 'OMSTestData' -UTCTimeStampField 'LogTime' -OMSDataJSON $OMSDataJSON

Injecting data using JSON formatted string by specifying the OMSWorkspace Azure Automation / SMA connection object (to be used in a runbook)

This PS module comes with the following features:

01. A Connection object for using this module in Azure Automation and SMA.

Once imported into your Azure Automation account (or SMA for the ‘old skool’ folks), you will be able to create connection objects that contain your OMS workspace Id, primary key and, optionally, secondary key:


And as shown in Example 2 above, in your runbook you can retrieve this connection object and use it when calling the New-OMSDataInjection function.

02. Fall back to the secondary key if the primary key has failed

When the optional secondary key is specified, if the web request using the primary key fails, the module will fall back to the secondary key and retry the web request with it. This ensures your scripts / automation runbooks will not be interrupted while you are following best practice and cycling your keys.

03. Supports two types of input: JSON and PSObject

As you can see from Evan’s post, this API expects a JSON object as the HTTP body containing the data to be injected into OMS. When I started testing this API a few months ago, my good friend and fellow MVP Stanislav Zhelyazkov (@StanZhelyazkov) suggested that instead of writing plain JSON, it’s better to put everything into a PSObject and then convert it to JSON in PowerShell, so we don’t mess up the format and type of each field. I thought it was a good idea, so I coded the module to take either a JSON formatted string or a PSObject containing the data to be injected into OMS.

Sample Script and Runbook

I’ve created a sample script and a runbook to help you get started. They are also included in the GitHub repository for this module (link at the bottom of this article):

Sample Script: Test-OMSDataInjection.ps1

Sample Runbook: Test-OMSDataInjectionRunbook

Exploring Data in OMS

Once the data is injected into OMS, if you are using a new data type, it can take a while (a few hours) for all the fields to become available in OMS.

For example, here is the data injected by the sample script and the Azure Automation runbook (executed in Azure):


All the fields that you have defined are stored as custom fields in your OMS workspace:


Please keep in mind that since the Custom Fields feature is still in preview, there is a limit of 100 custom fields per workspace at this stage, so please be mindful of this limitation when you are building your custom solutions using the HTTP Data Collector API.

Where to Download This Module?

I have published this module to the PowerShell Gallery. If you are using PowerShell version 5 or above, you can install it directly from there: Install-Module -Name OMSDataInjection -Repository PSGallery

You can also download it from its GitHub repo.


In the past, we’ve had the OMS Custom View Designer to help us visualise the data we already have in OMS Log Analytics; what we were missing was a native way to inject data into OMS. Now, with the release of this API, that gap has been filled. As Evan mentioned in his blog post, by coupling this API with the OMS View Designer (and even throwing Power BI into the mix), you can develop some really fancy solutions.

On the 21st of September (3 weeks from now), I will be presenting at the Melbourne Microsoft Cloud and Datacenter Meetup; my topic is Developing Your OWN Custom OMS Solutions. I will be doing live demos of creating solutions using the HTTP Data Collector API as well as the Custom View Designer. If you are from Melbourne, I encourage you to attend. I am also planning to record the session and publish it on YouTube later.

Lastly, if you have any suggestions for this PowerShell module, please feel free to contact me!