Author Archives: Tao Yang
Microsoft’s PFE Wei Hao Lim has published an awesome blog post that maps OpsMgr ACS reports to OMS search queries (https://blogs.msdn.microsoft.com/wei_out_there_with_system_center/2016/07/25/mapping-acs-reports-to-oms-search-queries/)
There are 36 queries on Wei’s list, so it will take a while to manually create them all as saved searches via the OMS Portal. Since I can see that I will reuse these saved searches in many OMS engagements, I have created a script to automatically create them using the OMS PowerShell Module AzureRM.OperationalInsights.
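Creating a saved search with AzureRM.OperationalInsights essentially comes down to one cmdlet call per query. As a minimal sketch (the resource group, workspace name, IDs and query below are placeholders, not values from Wei's list):

# Sketch only: create one ACS saved search in an OMS (Log Analytics) workspace.
# Assumes you are already logged in with Add-AzureRmAccount and have selected the right subscription.
Import-Module AzureRM.OperationalInsights
New-AzureRmOperationalInsightsSavedSearch `
    -ResourceGroupName 'MyOMSResourceGroup' `
    -WorkspaceName 'MyOMSWorkspace' `
    -SavedSearchId 'ACS - Forensic - All Events For Specified User' `
    -DisplayName 'ACS: Forensic - All Events For Specified User' `
    -Category 'ACS Reports' `
    -Query 'Type=SecurityEvent TargetUserName="CONTOSO\jdoe"' `
    -Version 1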
So here’s the script:
You must run this script in PowerShell version 5 or later. Lastly, thanks Wei for sharing these valuable queries with the community!
The OMSDataInjection module was only updated to v1.1.1 less than 2 weeks ago. I had to update it again to cater for changes in the OMS HTTP Data Collector API.
I only found out last night, after people reported errors using this module, that the HTTP response code for a successful injection has changed from 202 to 200. The documentation for the API was updated a few days ago (as I can see on GitHub):
This is what’s been updated in this release:
- Updated injection result error handling to reflect the change of the OMS HTTP Data Collector API response code for successful injection.
- Changed the UTCTimeGenerated input parameter from mandatory to optional. When it is not specified, the injection time will be used for the TimeGenerated field in the OMS log entry.
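For those curious, the success check now boils down to something like the following simplified sketch (not the module's exact code); $Uri, $Headers and $Body are assumed to be built beforehand (workspace URI, authorization signature and JSON payload):

# Simplified sketch: post the payload to the OMS HTTP Data Collector API and
# treat HTTP 200 (previously 202) as the success code.
$Response = Invoke-WebRequest -Uri $Uri -Method Post -ContentType 'application/json' -Headers $Headers -Body $Body -UseBasicParsing
if ($Response.StatusCode -eq 200) {
    Write-Verbose 'Data successfully injected into OMS.'
} else {
    throw "Data injection failed. HTTP status code: $($Response.StatusCode)."
}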
If you are using the OMSDataInjection module, I strongly recommend you update to this release.
PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection
Back in September this year, I published a PowerShell DSC resource called cPowerShellPackageManagement. This DSC resource allows you to manage PowerShell repositories and modules on any Windows machine running PowerShell version 5 or later. You can read more about this module in my previous post here: http://blog.tyang.org/2016/09/15/powershell-dsc-resource-for-managing-repositories-and-modules/
A couple of weeks ago my MVP buddy Alex Verkinderen had some issues using this DSC resource in Azure Automation DSC. After some investigation, I found a minor bug in the DSC resource. When you use this DSC resource to install modules, sometimes you may get an error like this:
Basically, it is complaining that a cmdlet from the module you are trying to install already exists. To fix it, I had to update the DSC resource and add the –AllowClobber switch to the Install-Module cmdlet.
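The change itself is tiny; inside the resource, the module installation now looks roughly like this (simplified, with illustrative variable names):

# -AllowClobber lets Install-Module proceed even when a cmdlet with the same
# name is already available from another installed module.
Install-Module -Name $ModuleName -Repository $RepositoryName -Force -AllowClobber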
I have published the updated version to both the PowerShell Gallery (https://www.powershellgallery.com/packages/cPowerShellPackageManagement) and GitHub (https://github.com/tyconsulting/PowerShellPackageManagementDSCResource/releases)
If you are using this DSC resource at the moment, make sure you check out the update.
Currently in OMS, there are 3 assessment solutions for various Microsoft products. They are:
- Active Directory Assessment Solution
- SQL Server Assessment Solution
- SCOM Assessment Solution
A few days ago, I needed to export the assessment rules from each solution and hand them over to a customer (so they know exactly what areas are being assessed). So I developed the following queries to extract the details of the assessment rules:
AD Assessment Solution query:
Type=ADAssessmentRecommendation | Dedup Recommendation | select FocusArea,AffectedObjectType,Recommendation,Description | Sort FocusArea
SQL Server Assessment Solution query:
Type=SQLAssessmentRecommendation | Dedup Recommendation | select FocusArea,AffectedObjectType,Recommendation,Description | Sort FocusArea
SCOM Assessment Solution query:
Type=SCOMAssessmentRecommendation | Dedup Recommendation | select FocusArea,AffectedObjectType,Recommendation,Description | Sort FocusArea
In order to use these queries, you need to make sure these solutions are enabled and already collecting data. You may also need to change the search time window to at least the last 7 days because, by default, assessment solutions only run once a week.
Once you get the result in the OMS portal, you can easily export it to a CSV file by hitting the Export button.
Over the last few days, I had a requirement to inject events from .evtx files into OMS Log Analytics. A typical .evtx file that I need to process contains over 140,000 events. Since Azure Automation runbooks have a maximum execution time of 3 hours, in order to make the runbook more efficient, I also had to update my OMSDataInjection PowerShell module to support bulk insert (http://blog.tyang.org/2016/12/05/omsdatainjection-powershell-module-updated/).
I have published the runbook on GitHub Gist:
Note: In order to use this runbook, you MUST use the latest OMSDataInjection module (version 1.1.1) because of the bulk insert.
You will need to specify the following parameters:
- EvtExportPath – the file path (e.g. an SMB share) to the evtx file.
- OMSConnectionName – the name of the OMSWorkspace connection asset you have created previously. This connection is defined in the OMSDataInjection module.
- OMSLogTypeName – The OMS log type name that you wish to use for the injected events.
- BatchLimit – the number of events to be injected in a single bulk request. This is an optional parameter; the default value is 1000 if it is not specified.
- OMSTimeStampFieldName – for the OMS HTTP Data Collector API, you need to tell the API which field in your log represents the timestamp. Since events extracted from .evtx files all have a “TimeCreated” field, the default value for this parameter is ‘TimeCreated’.
You can further customise the runbook and choose which fields from the evtx events you wish to exclude. Add the fields you wish to exclude to the $arrSkippedProperties array variable (lines 25 – 31). I have already pre-populated it with a few obvious ones; you can add and remove entries to suit your requirements.
Lastly, sometimes you will get events whose formatted description cannot be displayed, i.e.
When the runbook cannot get the formatted description of an event, it will use the XML content as the event description instead.
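If you just want the overall approach without opening the gist, the core loop is conceptually similar to the sketch below. This is not the published runbook: the file path, log type and connection asset name are placeholders, and the New-OMSDataInjection call and its parameter names are assumptions based on the module's description above.

# Conceptual sketch only – see the gist above for the real runbook.
$EvtExportPath = '\\fileserver\share\security-export.evtx'    # placeholder path
$BatchLimit = 1000
$OMSConnection = Get-AutomationConnection -Name 'OMSWorkspace' # placeholder asset name
$Batch = New-Object System.Collections.Generic.List[Object]

foreach ($Event in (Get-WinEvent -Path $EvtExportPath -Oldest)) {
    # Flatten the event into a PSObject, falling back to the raw XML when the
    # formatted description is not available.
    $Entry = [PSCustomObject]@{
        TimeCreated      = $Event.TimeCreated.ToUniversalTime()
        Id               = $Event.Id
        LevelDisplayName = $Event.LevelDisplayName
        ProviderName     = $Event.ProviderName
        Description      = if ($Event.Message) { $Event.Message } else { $Event.ToXml() }
    }
    $Batch.Add($Entry)
    if ($Batch.Count -ge $BatchLimit) {
        # Parameter names below are assumptions, not the module's documented signature.
        New-OMSDataInjection -OMSConnection $OMSConnection -LogType 'EvtxImport' -UTCTimeStampField 'TimeCreated' -OMSDataObject $Batch.ToArray()
        $Batch.Clear()
    }
}
if ($Batch.Count -gt 0) {
    New-OMSDataInjection -OMSConnection $OMSConnection -LogType 'EvtxImport' -UTCTimeStampField 'TimeCreated' -OMSDataObject $Batch.ToArray()
}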
Sample event injected by this runbook:
I’ve updated the OMSDataInjection PowerShell module to version 1.1.1. I have added support for bulk insert into OMS.
Now you can pass in an array of PSObjects or a plain JSON payload with multiple log entries. The module will check the payload size and make sure it is below the supported limit of 30MB before inserting into OMS.
You can get the new version from both PowerShell Gallery and GitHub:
PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection/1.1.1
Back in September, the Power BI team introduced the Forecasting preview feature in Power BI Desktop. I was really excited to see this highly demanded feature finally being made available. However, it was only a preview feature in Power BI Desktop and was not available in Power BI Online. A few days ago, with the Power BI November update, this feature came out of preview and also became available on Power BI Online.
In the cloud and data centre management context, forecasting plays a very important role in capacity planning. Earlier this year, before the OMS Capacity Planning solution V1 was taken off the shelf, I wrote a couple of posts: a comparison of the OMS Capacity Planning solution and the OpsLogix OpsMgr Capacity Report MP, and an overview of the OpsLogix Capacity Report MP. But ever since the OMS Capacity Planning solution was removed, we don’t have a capacity planning solution for OMS data sources – the OpsLogix Capacity Report MP is 100% based on OpsMgr.
Power BI Forecasting Feature
When I read the Power BI November update announcement a few days ago, I was really excited because the Forecasting feature is finally available on Power BI Online, which means I can use this feature on OMS data sources (such as performance data).
Since I had already configured OMS to pump data to Power BI, it only took me around 15 minutes to create an OMS Performance Forecasting report in Power BI:
I’m going to show you how to create this report in the remainder of this post.
01. Make sure you have already configured OMS to inject performance data (Type=Perf) to Power BI.
02. Download required Power BI custom visuals
In this report, I’m using two Power BI custom visuals that are not available out of the box; you will need to download the following from the Power BI Visuals Gallery:
- Hierarchy Slicer (https://app.powerbi.com/visuals/show/HierarchySlicer1458836712039)
- Timeline (https://app.powerbi.com/visuals/show/Timeline1447991079100)
Creating the report
01. Click on the data source for OMS perf data and you will see a blank canvas. Firstly, import the above-mentioned visuals into the report.
02. Add a text box on the top of the report page for the report title
03. Add a Hierarchy Slicer
Configure the slicer to filter on the following fields (in this specific order):
and make sure Single Select is on (the default value). Optionally, give the visual a title:
04. Add a line chart to the centre of the report. Drag the TimeGenerated field to Axis and CounterValue to Values. For CounterValue, choose the average value.
Give the visual a title.
Note: DO NOT configure the “Legend” field for the line chart visual, otherwise the forecasting feature will be disabled.
05. In the Analytics pane of the Line Chart visual, configure forecast based on your requirements
06. Optionally, also add a Trend Line
07. Add a Timeline visual to the bottom of the report page and drag the TimeGenerated field from the dataset to the Time field of the visual.
To save screen space, turn off Labels and give the Timeline visual a title.
08. Save the report. You can also pin this report page to a dashboard.
Using the Report
Now that the report is created, you can select a counter instance from the Hierarchy Slicer and choose the time window that you want the forecasting to be based on from the Timeline slicer. The data on the Line Chart visual will be updated automatically.
Compared to the old OMS Capacity Planning solution, what I demonstrated here only provides forecasting for individual performance counters. It does not analyse performance data to provide a high-level overview like the Capacity Planning solution did. However, since there are no forecasting capabilities in OMS at the moment, this provides a quick and easy way to get some basic forecasting.
Firstly, apologies for not being able to blog for 6 weeks. I have been really busy lately. As part of a project that I’m working on, I have been dealing with Azure Table storage and its REST API over the last couple of weeks. I have written a few Azure Function apps in C# as well as some Azure Automation runbooks in PowerShell that involve inserting, querying and updating records (entities) in Azure tables. I struggled a little during development of these function apps and runbooks because I couldn’t find many good code examples, and I personally believe this REST API is not well documented on Microsoft’s documentation site (https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/table-service-rest-api). Therefore I have spent the last two days developing a PowerShell module for managing the lifecycle of Azure Table entities. This module can be used to perform CRUD (Create, Read, Update, Delete) operations on Azure Table entities.
AzureTableEntity PowerShell Module
This PowerShell module is named AzureTableEntity, and it can be found on both GitHub and the PowerShell Gallery:
- GitHub: https://github.com/tyconsulting/AzureTableEntity-PowerShell-Module
- PowerShell Gallery: https://www.powershellgallery.com/packages/AzureTableEntity
This module offers the following 4 functions:
- Get-AzureTableEntity – search Azure Table entities by specifying a search string.
- New-AzureTableEntity – insert one or more entities into Azure table storage.
- Update-AzureTableEntity – update one or more entities in Azure table storage.
- Remove-AzureTableEntity – remove one or more entities from Azure table storage.
Note: All functions have been properly documented in the help file. You can use the Get-Help cmdlet to access the help.
By default, when performing a query operation, the Azure Table REST API only returns up to 1,000 entities, or all entities returned from the search within 5 seconds. The Get-AzureTableEntity function has a ‘-GetAll’ parameter that can be used to return all search results from a large table. The default value of this parameter is $true.
The search result returned by the search API is deserialised. As a result, complex data types such as datetime are returned as strings. If you want any datetime fields from the search result returned as native datetime values, you can set the “-ConvertDateTimeFields” parameter to $true. Please note this can potentially increase the script execution time when dealing with a large set of search results.
Hint: You can easily build your search string using the Azure Storage Explorer.
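As an example of a simple query, a call might look like the sketch below; the storage account, key, table name and filter are placeholders, and the parameter names for the account details are my assumptions (check Get-Help Get-AzureTableEntity for the exact signature):

# Query one partition and convert datetime fields back to DateTime objects.
$Entities = Get-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -StorageAccountAccessKey $StorageAccountKey `
    -TableName 'ServerInventory' `
    -QueryString "PartitionKey eq 'Servers'" `
    -ConvertDateTimeFields $true
$Entities | Select-Object PartitionKey, RowKey, Timestamp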
The New-AzureTableEntity function can be used to insert a single entity or bulk insert up to 100 entities (as long as the total payload size is less than 4MB).
Please make sure both “PartitionKey” and “RowKey” are included in the entity. The data type for these fields must be string.
i.e. Instead of setting RowKey = 1, you should set RowKey = “1” – because the value for both PartitionKey and RowKey must be a string.
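For example, inserting a single entity might look like the sketch below (the ‘-Entities’ parameter name and the account parameters are my assumptions; the storage details are placeholders):

# PartitionKey and RowKey must both be present and must both be strings.
$Entity = New-Object psobject -Property @{
    PartitionKey = 'Servers'
    RowKey       = '1'        # "1" as a string, not the number 1
    ComputerName = 'SERVER01'
    CPUCount     = 4
}
New-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -StorageAccountAccessKey $StorageAccountKey `
    -TableName 'ServerInventory' `
    -Entities @($Entity)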
The Update-AzureTableEntity function can be used to update a single entity or bulk update up to 100 entities (as long as the total payload size is less than 4MB).
Please note that when updating an entity, all fields (including the fields that do not need to be updated) must be specified; it is actually a merge operation. If you are modifying an existing entity returned from the search operation (Get-AzureTableEntity) and the entity contains datetime fields, please make sure you set the “-ConvertDateTimeFields” parameter to $true when performing the search in the first place. Please also be aware that the built-in Timestamp field must not be included in the entity fields.
The Remove-AzureTableEntity function can be used to remove a single entity or bulk remove up to 100 entities (as long as the total payload size is less than 4MB).
Support for Azure Automation and SMA
To simplify leveraging this module in Azure Automation or SMA, I have included a connection object in the module:
Once you have created the connection objects, instead of specifying the storage account, table name and storage account access key, you can simply specify the connection object using the ‘-TableConnection’ parameter for all four functions.
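For example, in a runbook the call could be reduced to something like this (the connection asset name and query are hypothetical):

# Retrieve the AzureTableEntity connection asset and pass it straight to the function.
$TableConnection = Get-AutomationConnection -Name 'MyAzureTableConnection'
$Entities = Get-AzureTableEntity -TableConnection $TableConnection -QueryString "PartitionKey eq 'Servers'"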
I have published some sample code I wrote when developing this module to GitHub Gist:
I wrote this module so I can simplify my Azure Automation runbooks and make IT Pro’s life easier when working on Azure Table storage. If you have to deal with Azure Table storage, I hope you find this module useful. If you are a developer and looking for code samples, you can still use this module and simply translate the code to the language of your choice.
I purposely didn’t include any functions for managing the Azure table storage itself because you can manage the Table storage using the Azure.Storage module.
Lastly, feedback is always welcome, so please drop me an email if you have any.
Last month, I presented at the Melbourne Microsoft Cloud and Datacenter Meetup on the topic “Developing Your OWN OMS Solutions” (https://www.meetup.com/Melbourne-Microsoft-Cloud-and-Datacenter-Meetup/events/233154212/). I recorded the session but then realised the recording had some technical errors due to a change of screen resolution without restarting Camtasia. This morning, I re-recorded the session and uploaded it to the Meetup’s YouTube channel.
If you are interested, you can watch the recording here (https://www.youtube.com/watch?v=zUzI31iIcTk):
And you can also download the slide deck HERE.
A few days ago my good friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) had a requirement to produce a Power BI dashboard for Azure AD users. So Alex and I started discussing ways to produce such a report in Power BI. After exploring various possibilities, we decided to leverage Azure Functions to feed data into Power BI. You can check out the Power BI solution Alex has built on his blog here: http://www.mscloud.be/retrieve-azure-aad-user-information-with-azure-functions-and-publish-it-into-powerbi
In this blog post, I’m not going into the details of how the AAD Users Power BI report was built. Instead, I will focus on the Azure Functions component and briefly demonstrate how to build an Azure Functions web service that acts as a Power BI data source. As an example for this post, I’ll build an Azure Functions web service in PowerShell that brings Azure VM information into Power BI. To set the stage, I have already written two blog posts yesterday on Azure Functions:
These two posts demonstrated two important steps that we need to prepare for the Azure Functions PowerShell code. We will need to follow these posts and prepare the following:
- Upload the latest AzureRM.Profile and AzureRM.Compute PowerShell modules to Azure Functions
- Encrypt the password for the service account to be used to access the Azure subscription.
Once done, we need to update the user name and the encrypted password in the code below (lines 24 and 25).
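For context, a heavily simplified sketch of what such a PowerShell function does is shown below. This is not the actual script the line numbers above refer to: the account name, encryption key handling and output properties are placeholders, and in the real function the subscription Id comes from the request URL rather than a hard-coded value.

# Simplified sketch of a PowerShell HTTP-triggered function returning Azure VM data as HTML.
$UserName          = 'svc-powerbi@contoso.onmicrosoft.com'     # placeholder service account
$EncryptedPassword = '<encrypted password string>'             # placeholder, produced as per the earlier post
[Byte[]] $AESKey   = 1..32                                      # placeholder; must match the key used to encrypt the password
$SubscriptionId    = '<subscription id passed via the request URL>'

# Log in to Azure with the service account
$SecurePassword = ConvertTo-SecureString -String $EncryptedPassword -Key $AESKey
$Credential     = New-Object System.Management.Automation.PSCredential ($UserName, $SecurePassword)
Add-AzureRmAccount -Credential $Credential -SubscriptionId $SubscriptionId | Out-Null

# Build an HTML table of VM information for Power BI to consume
$VMs  = Get-AzureRmVM | Select-Object Name, ResourceGroupName, Location, @{N = 'VMSize'; E = { $_.HardwareProfile.VmSize }}
$Html = $VMs | ConvertTo-Html

# Write the HTML back to the HTTP response ($res is provided by the PowerShell function template)
Out-File -Encoding Ascii -FilePath $res -InputObject $Html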
I have configured the function authorization level to “Function”, which means I need to pass an API key when invoking the function. I also need to pass the Azure subscription Id via the URL. To test, I’m using the Invoke-WebRequest cmdlet to see if I can retrieve the Azure VM information:
$Request = (Invoke-WebRequest -Uri 'https://yourfunctionapp.azurewebsites.net/api/GetAzureVMs?code=xyzbe8da45lqedkh2fk31m4jep61aali&subscriptionId=2699bb49-076d-4f94-987e-a6a41ef17c3f' -UseBasicParsing -Method Get).content
As you can see, the request body content contains HTML output with a table of the Azure VM information.
Now that I’ve confirmed the function is working, all I need to do is to use Power BI to get the data from the web.
Note: I’m not going too deep into Power BI in this post, therefore I will only demonstrate how to do so in Power BI Desktop. However, Alex’s post covers how to configure such reports in Power BI Online and ensure the data is always up to date by leveraging the On-Prem Data Gateway component. So, please make sure you also read Alex’s post when you are done with this one.
In Power BI Desktop, simply enter the URL with the basic setting:
and choose “Table 0”:
Once imported, you can see that all the properties I’ve defined in the Azure Functions PowerShell script have been imported into the dataset:
and I’ve used a table visual in the Power BI report and listed all the fields from the dataset:
Since the purpose of this post is only to demonstrate how to use Azure Functions as the data source for Power BI, I am only going to show how to get the data into Power BI. Creating fancy reports and dashboards for Azure VM data is not what I intend to cover.
Now that the data is available in Power BI, you can be creative and design fancy reports using different Power BI visuals.
Note: The method described in this post may not work when you want to refresh your data after publishing your report to Power BI Online. You may need to use this C# wrapper function: http://blog.tyang.org/2016/10/13/making-powershell-based-azure-functions-to-produce-html-outputs/. Alex has got this part covered in his post.
Lastly, make sure you go check out Alex’s post on how he created the AAD Users report using this method. As I mentioned, he has also covered two important aspects – how to make this report available online (so you can share it with other people) and how to make sure your data is always up to date by using the on-prem data gateway.