OMSDataInjection Updated to Version 1.2.0

Written by Tao Yang

The OMSDataInjection module was only updated to v1.1.1 less than two weeks ago. I had to update it again to cater for changes in the OMS HTTP Data Collector API.

I only found out last night, after being made aware that people had started getting errors using this module, that the HTTP response code for a successful injection has changed from 202 to 200. The documentation for the API was updated a few days ago (as I can see from GitHub):

image

This is what’s been updated in this release:

  • Updated the injection result error handling to reflect the change to the OMS HTTP Data Collector API response code for a successful injection.
  • Changed the UTCTimeGenerated input parameter from mandatory to optional. When it is not specified, the injection time will be used for the TimeGenerated field in the OMS log entry (see the sketch below).
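
For example, an injection without a timestamp could now look like this. This is only a minimal sketch; the parameter names shown here are assumptions, so check Get-Help New-OMSDataInjection for the exact signature.

# Parameter names below are assumed - verify with Get-Help New-OMSDataInjection.
$entry = [PSCustomObject]@{
    Computer  = $env:COMPUTERNAME
    EventName = 'TestEvent'
    Detail    = 'Injected without specifying UTCTimeGenerated'
}
# UTCTimeGenerated is omitted, so the injection time is used for TimeGenerated.
New-OMSDataInjection -OMSWorkspaceId '<workspace id>' -OMSWorkspaceKey '<primary key>' `
    -LogType 'TestLog' -OMSDataObject $entry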

If you are using the OMSDataInjection module, I strongly recommend updating to this release.

PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection

GitHub: https://github.com/tyconsulting/OMSDataInjection-PSModule/releases/tag/v1.2.0

DSC Resource cPowerShellPackageManagement Module Updated to Version 1.0.0.1

Written by Tao Yang

Back in September this year, I published a PowerShell DSC resource called cPowerShellPackageManagement. This DSC resource allows you to manage PowerShell repositories and modules on any Windows machine running PowerShell version 5 or later. You can read more about this module in my previous post here: http://blog.tyang.org/2016/09/15/powershell-dsc-resource-for-managing-repositories-and-modules/

A couple of weeks ago my MVP buddy Alex Verkinderen ran into an issue using this DSC resource in Azure Automation DSC. After some investigation, I found a minor bug in the DSC resource. When you use this DSC resource to install modules, you may sometimes get an error like this:

image

Basically, it is complaining that a cmdlet from the module you are trying to install already exists. To fix it, I updated the DSC resource to add the -AllowClobber switch to the Install-Module call.
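
For reference, -AllowClobber simply tells Install-Module to go ahead even when a command with the same name already exists on the system. For example (the module name here is just an illustration):

# Without -AllowClobber, Install-Module throws an error if a command exposed by
# the module being installed clashes with one that already exists locally.
Install-Module -Name AzureRM.Compute -AllowClobber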

I have published the updated version to both PowerShell Gallery (https://www.powershellgallery.com/packages/cPowerShellPackageManagement/1.0.0.1) and GitHub (https://github.com/tyconsulting/PowerShellPackageManagementDSCResource/releases/tag/1.0.0.1)

If you are using this DSC resource at the moment, make sure you check out the update.

OMS Search Queries to Extract Rules from Various Assessment Solutions

Written by Tao Yang

Currently in OMS, there are 3 assessment solutions for various Microsoft products. They are:

  • Active Directory Assessment Solution
  • SQL Server Assessment Solution
  • SCOM Assessment Solution

A few days ago, I needed to export the assessment rules from each solution and hand them over to a customer (so they know exactly which areas are being assessed). I developed the following queries to extract the details of the assessment rules:

AD Assessment Solution query:

Type=ADAssessmentRecommendation | Dedup Recommendation | select FocusArea,AffectedObjectType,Recommendation,Description | Sort FocusArea

SQL Server Assessment Solution query:

Type=SQLAssessmentRecommendation | Dedup Recommendation | select FocusArea,AffectedObjectType,Recommendation,Description | Sort FocusArea

SCOM Assessment Solution query:

Type=SCOMAssessmentRecommendation | Dedup Recommendation | select FocusArea,AffectedObjectType,Recommendation,Description | Sort FocusArea

To use these queries, make sure these solutions are enabled and already collecting data. You may also need to change the search time window to at least the last 7 days because, by default, the assessment solutions only run once a week.

Once you get the result in the OMS portal, you can easily export it to a CSV file by hitting the Export button.
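
If you prefer to script the export instead of using the portal, something along these lines should also work. This is a rough sketch assuming the AzureRM.OperationalInsights module; the resource group and workspace names are placeholders, and the exact shape of the returned values can vary between module versions.

# Run one of the assessment queries against the workspace and export to CSV
$query = 'Type=ADAssessmentRecommendation | Dedup Recommendation | select FocusArea,AffectedObjectType,Recommendation,Description | Sort FocusArea'
$result = Get-AzureRmOperationalInsightsSearchResults -ResourceGroupName 'MyOMSResourceGroup' `
    -WorkspaceName 'MyOMSWorkspace' -Query $query `
    -Start (Get-Date).AddDays(-7) -End (Get-Date) -Top 5000
# Each item in the Value collection is a JSON record
$result.Value | ForEach-Object { $_.ToString() | ConvertFrom-Json } |
    Select-Object FocusArea, AffectedObjectType, Recommendation, Description |
    Export-Csv -Path .\ADAssessmentRules.csv -NoTypeInformation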

Injecting Event Log Export from .evtx Files to OMS Log Analytics

Written by Tao Yang

Over the last few days, I had a requirement to inject events from .evtx files into OMS Log Analytics. A typical .evtx file that I need to process contains over 140,000 events. Since Azure Automation runbooks have a maximum execution time of 3 hours, in order to make the runbook more efficient, I also had to update my OMSDataInjection PowerShell module to support bulk insert (http://blog.tyang.org/2016/12/05/omsdatainjection-powershell-module-updated/).

I have published the runbook on GitHub Gist:

Note: In order to use this runbook, you MUST use the latest OMSDataInjection module (version 1.1.1) because of the bulk insert.

You will need to specify the following parameters:

  • EvtExportPath – the file path (e.g. an SMB share) to the .evtx file.
  • OMSConnectionName – the name of the OMSWorkspace connection asset you have created previously. This connection type is defined in the OMSDataInjection module.
  • OMSLogTypeName – the OMS log type name that you wish to use for the injected events.
  • BatchLimit – the number of events to be injected in a single bulk request. This is an optional parameter; the default value is 1000.
  • OMSTimeStampFieldName – for the OMS HTTP Data Collector API, you need to tell the API which field in your log represents the timestamp. Since all events extracted from .evtx files have a “TimeCreated” field, the default value for this parameter is ‘TimeCreated’.

You can further customise the runbook and choose which fields from the .evtx events you wish to exclude. Add the fields you want to exclude to the $arrSkippedProperties array variable (lines 25 – 31). I have already pre-populated it with a few obvious ones; you can add and remove entries to suit your requirements.

Lastly, sometimes you will get events whose formatted description cannot be displayed, for example:

image

When the runbook cannot get the formatted description of an event, it will use the XML content as the event description instead.
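
To give you an idea of the overall approach, here is a very simplified sketch of the core logic. This is not the published runbook, and the New-OMSDataInjection parameter names shown are assumptions; check Get-Help New-OMSDataInjection for the exact signature.

# Simplified sketch of the core loop (not the published runbook).
$OMSConnection = Get-AutomationConnection -Name $OMSConnectionName
$events = Get-WinEvent -Path $EvtExportPath -Oldest
$batch = @()
foreach ($event in $events) {
    $entry = [PSCustomObject]@{
        TimeCreated  = $event.TimeCreated
        Id           = $event.Id
        Level        = $event.LevelDisplayName
        ProviderName = $event.ProviderName
        Computer     = $event.MachineName
        # Fall back to the raw XML when the formatted description is unavailable
        Description  = $(if ($event.Message) { $event.Message } else { $event.ToXml() })
    }
    $batch += $entry
    if ($batch.Count -ge $BatchLimit) {
        # Parameter names below are assumed - verify against the module help
        New-OMSDataInjection -OMSConnection $OMSConnection -LogType $OMSLogTypeName `
            -UTCTimeStampField $OMSTimeStampFieldName -OMSDataObject $batch
        $batch = @()
    }
}
if ($batch.Count -gt 0) {
    New-OMSDataInjection -OMSConnection $OMSConnection -LogType $OMSLogTypeName `
        -UTCTimeStampField $OMSTimeStampFieldName -OMSDataObject $batch
}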

Sample event injected by this runbook:

image

OMSDataInjection PowerShell Module Updated

Written by Tao Yang

I’ve updated the OMSDataInjection PowerShell module to version 1.1.1. I have added support for bulk insert into OMS.

Now you can pass in an array of PSObjects or a plain JSON payload with multiple log entries. The module will check the payload size and make sure it is below the supported limit of 30MB before inserting into OMS.

You can get the new version from both PowerShell Gallery and GitHub:

PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection/1.1.1

GitHub: https://github.com/tyconsulting/OMSDataInjection-PSModule/releases/tag/1.1.1

An Alternative Solution for OMS Capacity Planning Using Power BI Forecasting Feature

Written by Tao Yang

Introduction

Back in September, the Power BI team introduced the Forecasting preview feature in Power BI Desktop. I was really excited to see this highly demanded feature finally made available. However, it was only a preview feature in Power BI Desktop; it was not available in Power BI Online. A few days ago, with the Power BI November update, this feature came out of preview and also became available in Power BI Online.

In the cloud and data centre management context, forecasting plays a very important role in capacity planning. Earlier this year, before the OMS Capacity Planning solution V1 was taken off the shelf, I wrote a couple of posts: a comparison of the OMS Capacity Planning solution and the OpsLogix OpsMgr Capacity Report MP, and an overview of the OpsLogix Capacity Report MP. But since the OMS Capacity Planning solution was removed, we don’t currently have a capacity planning solution for OMS data sources – the OpsLogix Capacity Report MP is 100% based on OpsMgr.

Power BI Forecasting Feature

When I read the Power BI November update announcement a few days ago, I was really excited because the Forecasting feature is finally available in Power BI Online, which means I can use it on OMS data sources (such as performance data).

Since I had already configured OMS to pump data to Power BI, it only took me around 15 minutes to create an OMS Performance Forecasting report in Power BI:

image

I’m going to show you how to create this report in the remainder of this post.

Step-by-Step Guide

Pre-requisites

01. Make sure you have already configured OMS to inject performance data (Type=Perf) to Power BI.

02. Download required Power BI custom visuals

In this report, I’m using two Power BI custom visuals that are not available out of the box. You will need to download the following from the Power BI Visuals Gallery:

  • Hierarchy Slicer
  • Timeline

Creating the report

01. Click on the data source for the OMS perf data and you will see a blank canvas. Firstly, import the above-mentioned visuals into the report.

image

02. Add a text box at the top of the report page for the report title

image

03. Add a Hierarchy Slicer

image

Configure the slicer to filter on the following fields (in this specific order):

  • ObjectName
  • CounterName
  • Computer
  • InstanceName

image

and make sure Single Select is on (the default value). Optionally, give the visual a title:

image

04. Add a Line Chart visual to the centre of the report. Drag the TimeGenerated field to Axis and CounterValue to Values. For CounterValue, choose the average value.

image

Give the visual a title.

image

Note: DO NOT configure the “Legend” field for the line chart visual, otherwise the forecasting feature will be disabled.

05. In the Analytics pane of the Line Chart visual, configure forecast based on your requirements

image

06. Optionally, also add a Trend Line

image

07. Add a Timeline visual to the bottom of the report page and drag the TimeGenerated field from the dataset to the Time field of the visual.

image

To save screen space, turn off Labels, and give the Timeline visual a title:

image

08. Save the report. You can also pin this report page to a dashboard.

Using the Report

Now that the report is created, you can select a counter instance from the Hierarchy Slicer and choose the time window that you want the forecasting to be based on from the Timeline slicer. The data on the Line Chart visual will be updated automatically.

image

Summary

Compared to the old OMS Capacity Planning solution, what I demonstrated here only provides forecasting for individual performance counters. It does not analyse performance data to provide a high-level overview like the Capacity Planning solution did. However, since there are no forecasting capabilities in OMS at the moment, this provides a quick and easy way to get some basic forecasting capability.

PowerShell Module for Managing Azure Table Storage Entities

Written by Tao Yang

Introduction

Firstly, apologies for not being able to blog for 6 weeks; I have been really busy lately. As part of a project I’m working on, I have been dealing with Azure Table storage and its REST API over the last couple of weeks. I have written a few Azure Function apps in C# as well as some Azure Automation runbooks in PowerShell that involve inserting, querying and updating records (entities) in Azure tables. I struggled a little during development of these function apps and runbooks because I couldn’t find many good code examples, and I personally believe this REST API is not well documented on Microsoft’s documentation site (https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/table-service-rest-api). Therefore I have spent the last two days developing a PowerShell module for managing the lifecycle of Azure Table entities. This module can be used to perform CRUD (Create, Read, Update, Delete) operations on Azure Table entities.

AzureTableEntity PowerShell Module

This PowerShell module is named AzureTableEntity and it can be found on both GitHub and the PowerShell Gallery:

This module offers the following 4 functions:

  • Get-AzureTableEntity – search Azure Table entities by specifying a search string.
  • New-AzureTableEntity – insert one or more entities into Azure Table storage.
  • Update-AzureTableEntity – update one or more entities in Azure Table storage.
  • Remove-AzureTableEntity – remove one or more entities from Azure Table storage.

Note: All functions are properly documented in the module help. You can use the Get-Help cmdlet to access it.

Get-AzureTableEntity

By default, when performing a query operation, the Azure Table REST API only returns up to 1,000 entities, or the entities returned by a search within 5 seconds. This function has a ‘-GetAll’ parameter that can be used to return all search results from a large table; its default value is $true.

The search result returned by the search API is deserialised. As a result, complex data types such as datetime are returned as strings. If you want any datetime fields from the search result returned as genuine datetime values, you can set the “-ConvertDateTimeFields” parameter to $true. Please note this can increase the script execution time when dealing with a large set of search results.

Hint: You can easily build your search string using the Azure Storage Explorer.
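
For example, a quick query could look like this. This is only a sketch: -GetAll and -ConvertDateTimeFields are as described above, but the storage account, key, table and query parameter names shown here are assumptions, so check Get-Help Get-AzureTableEntity for the exact names.

# Return every entity in a partition, with datetime fields converted back to
# DateTime objects (parameter names other than -GetAll and -ConvertDateTimeFields
# are assumed - verify with Get-Help).
$entities = Get-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -StorageAccountAccessKey $storageKey -TableName 'MyTable' `
    -QueryString "PartitionKey eq 'Servers'" `
    -GetAll $true -ConvertDateTimeFields $true
$entities | Select-Object PartitionKey, RowKey, Timestamp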

New-AzureTableEntity

This function can be used to insert a single entity, or to bulk insert up to 100 entities (as long as the total payload size is less than 4MB).

Please make sure both “PartitionKey” and “RowKey” are included in the entity. The data type for these fields must be string.

I.e. instead of setting RowKey = 1, you should set RowKey = “1”, because the values of both PartitionKey and RowKey must be strings.
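
For example, inserting a couple of entities in one call might look like this. Again, this is a sketch and the parameter names for the storage account, key, table and entities are assumptions.

# PartitionKey and RowKey must be strings - note RowKey = '1', not RowKey = 1
$newEntities = @(
    [PSCustomObject]@{ PartitionKey = 'Servers'; RowKey = '1'; ComputerName = 'SERVER01'; CPUCount = 4 },
    [PSCustomObject]@{ PartitionKey = 'Servers'; RowKey = '2'; ComputerName = 'SERVER02'; CPUCount = 8 }
)
# Parameter names below are assumed - verify with Get-Help New-AzureTableEntity
New-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -StorageAccountAccessKey $storageKey -TableName 'MyTable' `
    -Entities $newEntities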

Update-AzureTableEntity

This function can be used to update a single entity, or to bulk update up to 100 entities (as long as the total payload size is less than 4MB).

Please note that when updating an entity, all fields (including the fields that do not need to be updated) must be specified. It is actually a merge operation. If you are modifying an existing entity returned from the search operation (Get-AzureTableEntity) and the entity contains datetime fields, make sure you set the “-ConvertDateTimeFields” parameter to $true when performing the search in the first place. Please also be aware that the built-in Timestamp field must not be included in the entity fields.
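
To illustrate the round trip, a rough sketch could look like this. Apart from -ConvertDateTimeFields and -GetAll, the parameter names are assumptions, so verify them with Get-Help.

# Retrieve the entity with datetime fields converted, modify a field,
# strip the built-in Timestamp field, then write the full entity back.
$entity = Get-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -StorageAccountAccessKey $storageKey -TableName 'MyTable' `
    -QueryString "PartitionKey eq 'Servers' and RowKey eq '1'" `
    -ConvertDateTimeFields $true
$entity.CPUCount = 8
# The built-in Timestamp field must not be sent back with the update
$entity.PSObject.Properties.Remove('Timestamp')
Update-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -StorageAccountAccessKey $storageKey -TableName 'MyTable' `
    -Entities $entity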

Remove-AzureTableEntity

This function can be used to remove a single entity, or to bulk remove up to 100 entities (as long as the total payload size is less than 4MB).

Support for Azure Automation and SMA

To simplify leveraging this module in Azure Automation or SMA, I have included a connection object in the module:

image

Once you have created the connection object, instead of specifying the storage account, table name and storage account access key, you can simply specify the connection object using the ‘-TableConnection’ parameter on all four functions.
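
For example, in an Azure Automation runbook this could look like the following sketch. The connection asset name is made up, and the query parameter name is assumed; -TableConnection is as described above.

# Retrieve the connection asset created from the module's connection type
$tableConnection = Get-AutomationConnection -Name 'MyAzureTableConnection'
# Pass it to any of the four functions instead of the individual account details
$entities = Get-AzureTableEntity -TableConnection $tableConnection `
    -QueryString "PartitionKey eq 'Servers'" -GetAll $true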

Sample Code

I have published some sample code I wrote when developing this module to GitHub Gist:

Summary

I wrote this module to simplify my Azure Automation runbooks and make IT Pros’ lives easier when working with Azure Table storage. If you have to deal with Azure Table storage, I hope you find this module useful. If you are a developer looking for code samples, you can still use this module and simply translate the code to the language of your choice.

I purposely didn’t include any functions for managing the Azure table storage itself because you can manage the Table storage using the Azure.Storage module.

Lastly, feedback is always welcome, so please drop me an email if you have any.

My Meetup Recording–Developing Your OWN OMS Solutions

Written by Tao Yang

Last month, I presented at the Melbourne Microsoft Cloud and Datacenter Meetup on the topic “Developing Your OWN OMS Solutions” (https://www.meetup.com/Melbourne-Microsoft-Cloud-and-Datacenter-Meetup/events/233154212/). I recorded the session but then realised the recording had some technical errors caused by changing the screen resolution without restarting Camtasia. This morning, I re-recorded the session and uploaded it to the Meetup’s YouTube channel.

If you are interested, you can watch the recording here (https://www.youtube.com/watch?v=zUzI31iIcTk):

And you can also download the slide deck HERE.

Feeding Your Power BI Reports from Azure Functions

Written by Tao Yang

Background

A few days ago my good friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) had a requirement to produce a Power BI dashboard for Azure AD users, so Alex and I started discussing ways to produce such a report in Power BI. After exploring various possibilities, we decided to leverage Azure Functions to feed data into Power BI. You can check out the Power BI solution Alex has built on his blog here: http://www.mscloud.be/retrieve-azure-aad-user-information-with-azure-functions-and-publish-it-into-powerbi

In this blog post, I’m not going into the details of how the AAD Users Power BI report was built. Instead, I will focus on the Azure Functions component and briefly demonstrate how to build an Azure Functions web service that acts as a Power BI data source. As an example for this post, I’ll build an Azure Functions web service in PowerShell that brings Azure VM information into Power BI. To set the stage, I have already written two blog posts yesterday on Azure Functions:

These two posts demonstrated two important preparation steps for the Azure Functions PowerShell code. We will need to follow these posts and prepare the following:

  • Upload the latest AzureRM.Profile and AzureRM.Compute PowerShell modules to Azure Functions
  • Encrypt the password for the service account to be used to access the Azure subscription.

Once done, we need to update the user name and the encrypted password in the code below (lines 24 and 25).

I have configured the function authorization level to “Function”, which means I need to pass an API key when invoking the function. I also need to pass the Azure subscription Id via the URL. To test, I’m using the Invoke-WebRequest cmdlet to see if I can retrieve the Azure VM information:
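
The test call looks something like this. The function name, the SubscriptionId parameter name and the key below are placeholders for illustration only.

# Invoke the function with the function-level API key and the subscription Id
# (the function name and SubscriptionId parameter name are illustrative).
$uri = 'https://myfunctionapp.azurewebsites.net/api/GetAzureVMs?code=<function key>&SubscriptionId=<subscription id>'
$response = Invoke-WebRequest -Uri $uri -Method Get -UseBasicParsing
# The content should be an HTML page containing a table of the Azure VMs
$response.Content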

As you can see, the response content contains HTML output which includes a table of the Azure VM information:

image

Now that I’ve confirmed the function is working, all I need to do is to use Power BI to get the data from the web.

Note: I’m not going too deep into Power BI in this post, therefore I will only demonstrate how to do this in Power BI Desktop. However, Alex’s post covers how to configure such reports in Power BI Online and how to ensure the data is always up to date by leveraging the On-Prem Data Gateway component. So please make sure you also read Alex’s post when you are done with this one.

image

In Power BI Desktop, simply enter the URL with the basic setting:

image

and choose “Table 0”:

image

Once imported, you can see that all the properties I’ve defined in the Azure Functions PowerShell script have been imported into the dataset:

image

and I’ve used a table visual in the Power BI report and listed all the fields from the dataset:

image

Since the purpose of this post is only to demonstrate how to use Azure Functions as the data source for Power BI, I am only going to demonstrate how to get the data into Power BI. Creating fancy reports and dashboards for Azure VM data is not what I intend to cover.

Now that the data is available in Power BI, you can be creative and design fancy reports using different Power BI visuals.

Note: The method described in this post may not work when you want to refresh your data after publishing your report to Power BI Online. You may need to use this C# wrapper function: http://blog.tyang.org/2016/10/13/making-powershell-based-azure-functions-to-produce-html-outputs/. Alex has got this part covered in his post.

Lastly, make sure you go check out Alex’s post on how he created the AAD Users report using this method. As I mentioned, he has also covered two important aspects – how to make this report available online (so you can share it with other people) and how to make sure your data is always up to date by using the on-prem data gateway.

Making PowerShell Based Azure Functions to Produce HTML Outputs

Written by Tao Yang

Over the last few weeks, I’ve been working with my MVP buddy Alex Verkinderen (@AlexVerkinderen) on some Azure Functions related stuff. We have both written a few PowerShell based functions that output an HTML page.

These functions use the ConvertTo-Html cmdlet to produce the HTML output. For example, here’s a simple one that lists 2 cars in an HTML table:
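
A minimal sketch of such a function is shown below (simplified, and the car details are made up for illustration). The output is written to the file referenced by $res, which is how PowerShell based functions return their response.

# A simple PowerShell function that returns two cars as an HTML table
$cars = @(
    [PSCustomObject]@{ Make = 'Toyota'; Model = 'Corolla'; Colour = 'Red' },
    [PSCustomObject]@{ Make = 'Mazda';  Model = 'CX-5';    Colour = 'Blue' }
)
$html = $cars | ConvertTo-Html -Title 'Cars'
# PowerShell based functions return their response by writing to the $res file
Out-File -Encoding Ascii -FilePath $res -InputObject ($html | Out-String)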

Today we ran into an issue while preparing for our next blog posts. After some diagnostics, we realised the issue was caused by the HTML output returned from the PowerShell based functions.

If I use the Invoke-WebRequest cmdlet in PowerShell to trigger this function, I am able to get the HTML output in the response content and everything looks good:

image

However, if we simply invoke this function from a browser, although the output is in HTML format, the browser does not display the HTML page; it displays the HTML source code instead:

image

After some research, we found the cause of this issue – the content type returned by the PowerShell function is always set to “text/plain”:

image

I suspect this is because, for PowerShell based functions, we have to output to a file (the $res variable by default). I tried to construct a proper HTTP response message (System.Net.Http.HttpResponseMessage), but it didn’t work in the PowerShell functions. Based on my testing, it seems PowerShell functions cannot handle complex types.

Luckily I found this post, which pointed me in the right direction: http://anthonychu.ca/post/azure-functions-serve-html/. According to this post, we can certainly serve a proper HTML page from C# based functions.

I don’t really want to rewrite all my PowerShell functions in C#, not only because I don’t want to reinvent the wheel, but also because I want to keep using the PowerShell modules in those existing functions. In the end, I came up with a C# based “wrapper” function. I named this function HTTPTriggerProxy:

This C# based HTTPTriggerProxy function simply takes the URL you have specified, gets the response and wraps it in a proper HttpResponseMessage object. All you need to do is specify the original URL that you want to request in the “RequestURL” parameter as part of the wrapper function URL:

https://<Your Azure Function Account>.azurewebsites.net/api/HttpTriggerProxy?code=<Access code for Http Trigger Proxy function>&RequestURL=<Your original request URL>.

Now if I use this wrapper to invoke the sample GetCars PowerShell function, the HTML page is displayed in the browser as expected:

image

and you can see the content type is now set to “text/html”:

image

Note:

  • This wrapper function only supports the GET HTTP method. The POST method is not supported, so you can only pass the RequestURL in the wrapper URL (as opposed to placing it in the request body). I didn’t bother to cater for the POST method in this function because what we are going to use this for only supports the HTTP GET method.
  • If your original request requires authentication, then this is not going to work for you.
  • If your original URL contains the ampersand character (“&”), please replace it with “%26”. For example, if your original request is https://myazurefunction.azurewebsites.net/api/GetCars?code=rgpxmm0p87fh2z1wd0a6vargfxxogb6cf&colour=red, then you need to change it to https://myazurefunction.azurewebsites.net/api/GetCars?code=rgpxmm0p87fh2z1wd0a6vargfxxogb6cf%26colour=red

Lastly, this is just something we came up with today while working on another set of posts. Please stay tuned; our new posts will be published in the next day or two.