Tag Archives: Power BI
Back in September, the Power BI team introduced the Forecasting preview feature in Power BI Desktop. I was really excited to see this highly demanded feature finally being made available. However, it was only a preview feature in Power BI Desktop and was not available in Power BI Online. A few days ago, when the Power BI November update was introduced, this feature came out of preview and also became available in Power BI Online.
In the cloud and data centre management context, forecasting plays a very important role in capacity planning. Earlier this year, before the OMS Capacity Planning solution V1 was taken off the shelf, I wrote a couple of posts comparing the OMS Capacity Planning solution with the OpsLogix OpsMgr Capacity Report MP, as well as an overview of the OpsLogix Capacity Report MP. But ever since the OMS Capacity Planning solution was removed, we haven't had a capacity planning solution for OMS data sources – the OpsLogix Capacity Report MP is 100% based on OpsMgr.
Power BI Forecasting Feature
When I read the Power BI November update announcement a few days ago, I was really excited because the Forecasting feature is finally available in Power BI Online, which means I can use this feature on OMS data sources (such as performance data).
Since I had already configured OMS to pump data to Power BI, it only took me around 15 minutes to create an OMS Performance Forecasting report in Power BI:
I'm going to show you how to create this report in the remainder of this post.
01. Make sure you have already configured OMS to inject performance data (Type=Perf) to Power BI.
02. Download required Power BI custom visuals
In this report, I'm using two Power BI custom visuals that are not available out of the box. You will need to download the following from the Power BI Visuals Gallery:
- Hierarchy Slicer (https://app.powerbi.com/visuals/show/HierarchySlicer1458836712039)
- Timeline (https://app.powerbi.com/visuals/show/Timeline1447991079100)
Creating the report
01. Click on the data source for OMS perf data and you will see a blank canvas. First, import the above-mentioned visuals into the report
02. Add a text box on the top of the report page for the report title
03. Add a Hierarchy Slicer
Configure the slicer to filter on the following fields (in this specific order):
and make sure Single Select is on (the default value). Optionally, give the visual a title:
04. Add a line chart to the centre of the report. Drag the TimeGenerated field to Axis and CounterValue to Values. For CounterValue, choose the average value.
Give the visual a title.
Note: DO NOT configure the “Legend” field for the line chart visual, otherwise the forecasting feature will be disabled.
05. In the Analytics pane of the Line Chart visual, configure forecast based on your requirements
06. Optionally, also add a Trend Line
07. Add a Timeline visual to the bottom of the report page and drag the TimeGenerated field from the dataset to the Time field of the visual.
In order to save screen space, turn off Labels, and give the Timeline visual a title
08. Save the report. You can also pin this report page to a dashboard.
Using the Report
Now that the report is created, you can select a counter instance from the Hierarchy Slicer, and choose the time window that you want the forecasting to be based on from the Timeline slicer. The data on the Line Chart visual will be updated automatically.
Compared to the old OMS Capacity Planning solution, what I demonstrated here only provides forecasting for individual performance counters. It does not analyse performance data in order to provide a high-level overview like the Capacity Planning solution did. However, since there are no forecasting capabilities in OMS at the moment, this provides a quick and easy way to get some basic forecasting capabilities.
A few days ago my good friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) had a requirement to produce a Power BI dashboard for Azure AD users, so Alex and I started discussing a way to produce such a report in Power BI. After exploring various possibilities, we decided to leverage Azure Functions to feed data into Power BI. You can check out the Power BI solution Alex has built on his blog here: http://www.mscloud.be/retrieve-azure-aad-user-information-with-azure-functions-and-publish-it-into-powerbi
In this blog post, I'm not going into the details of how the AAD Users Power BI report was built. Instead, I will focus on the Azure Functions component and briefly demonstrate how to build an Azure Functions web service that acts as a Power BI data source. As an example for this post, I'll build an Azure Functions web service in PowerShell that brings Azure VM information into Power BI. To set the stage, I have already written two blog posts yesterday on Azure Functions:
These two posts demonstrated two important steps that we need to prepare for the Azure Functions PowerShell code. We will need to follow these posts and prepare the following:
- Upload the latest AzureRM.Profile and AzureRM.Compute PowerShell modules to Azure Functions
- Encrypt the password for the service account to be used to access the Azure subscription.
Once done, we need to update the user name and the encrypted password in the code below (lines 24 and 25)
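The original script is not reproduced here, but as a rough, hypothetical sketch of what such a function can look like (the function name GetAzureVMs matches the test URL later in this post; the module paths, the query-parameter environment variable and the password decryption details are assumptions based on the classic experimental PowerShell runtime and the two posts linked above):

```powershell
# Hypothetical sketch only - adjust paths, names and decryption to your environment.
# Load the AzureRM modules previously uploaded to the Function App
Import-Module "D:\home\site\wwwroot\GetAzureVMs\bin\AzureRM.Profile\AzureRM.Profile.psd1"
Import-Module "D:\home\site\wwwroot\GetAzureVMs\bin\AzureRM.Compute\AzureRM.Compute.psd1"

# Lines 24 and 25: the service account user name and the encrypted password
$UserName = 'svc-powerbi@yourtenant.onmicrosoft.com'
$EncryptedPassword = '<your encrypted password>'

# Decrypt the password (add -Key <your AES key> if you encrypted it with a key,
# as described in the earlier post) and sign in to Azure
$SecurePassword = ConvertTo-SecureString $EncryptedPassword
$Cred = New-Object System.Management.Automation.PSCredential ($UserName, $SecurePassword)
Add-AzureRmAccount -Credential $Cred | Out-Null

# The subscription Id is passed in via the URL query string
Select-AzureRmSubscription -SubscriptionId $env:REQ_QUERY_SUBSCRIPTIONID | Out-Null

# Retrieve the VMs and return them as an HTML table
# (Power BI's "Get Data from Web" feature parses HTML tables)
$VMs = Get-AzureRmVM | Select-Object Name, ResourceGroupName, Location,
    @{N='VMSize'; E={$_.HardwareProfile.VmSize}}
$Html = $VMs | ConvertTo-Html -Fragment | Out-String
Out-File -Encoding Ascii -FilePath $res -InputObject $Html
```

The key point is the last few lines: the function returns an HTML fragment containing a table, which is what Power BI expects from a web data source.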
I have configured the function authorization level to “Function” which means I need to pass an API key when invoking the function. I also need to pass the Azure subscription Id via the URL. To test, I’m using the Invoke-WebRequest cmdlet and see if I can retrieve the Azure VMs information:
$Request = (Invoke-WebRequest -Uri 'https://yourfunctionapp.azurewebsites.net/api/GetAzureVMs?code=xyzbe8da45lqedkh2fk31m4jep61aali&subscriptionId=2699bb49-076d-4f94-987e-a6a41ef17c3f' -UseBasicParsing -Method Get).content
As you can see, the request body content is an HTML output containing a table with the Azure VM information
Now that I’ve confirmed the function is working, all I need to do is to use Power BI to get the data from the web.
Note: I'm not going too deep into Power BI in this post, therefore I will only demonstrate how to do this in Power BI Desktop. However, Alex's post has covered how to configure such reports in Power BI Online and how to ensure the data is always up-to-date by leveraging the On-Prem Data Gateway component. So, please make sure you also read Alex's post when you are done with this one.
In Power BI Desktop, simply enter the URL with the basic setting:
and choose “Table 0”:
Once imported, you can see that all the properties I've defined in the Azure Functions PowerShell script have been imported into the dataset:
and I’ve used a table visual in the Power BI report and listed all the fields from the dataset:
Since the purpose of this post is only to demonstrate how to use Azure Functions as the data source for Power BI, I am only going to demonstrate how to get the data into Power BI. Creating fancy reports and dashboards for Azure VM data is not what I intend to cover.
Now that the data is available in Power BI, you can be creative and design fancy reports using different Power BI visuals.
Note: The method described in this post may not work when you want to refresh your data after publishing your report to Power BI Online. You may need to use this C# wrapper function: http://blog.tyang.org/2016/10/13/making-powershell-based-azure-functions-to-produce-html-outputs/. Alex has got this part covered in his post.
Lastly, make sure you go check out Alex's post on how he created the AAD Users report using this method. As I mentioned, he has also covered two important aspects – how to make this report available online (so you can share it with other people) and how to make sure your data is always up to date by using the on-prem data gateway.
I've been playing with the OMS Network Performance Monitor (NPM) today. Earlier today, I released an OpsMgr MP that contains tasks to configure MMA agents for NPM. You can find the post here: http://blog.tyang.org/2016/08/22/opsmgr-agent-task-to-configure-oms-network-performance-monitor-agents/
The other thing I wanted to do is to create a Power BI dashboard for the data collected by OMS NPM solution. The data collected by NPM can be retrieved using OMS search query “Type=NetworkMonitoring”.
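For reference, the records this report is based on can be pulled up in OMS Log Search with a query along these lines (the SubType filter and the field names come straight from the report configuration later in this post):

```
Type=NetworkMonitoring SubType=NetworkPath
```

The fields of interest from these records are SourceNetworkNodeInterface, DestinationNetworkNodeInterface, MedianLatency, LossHealthState and TimeGenerated.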
To begin my experiment, I created a Power BI schedule in OMS using the above-mentioned query and waited a while for the data to populate in Power BI.
I then used 2 custom visuals from the Power BI Custom Visual Gallery:
01. Force-Directed Graph
02. Timeline
and I created an interactive report that displays the network topology based on the NPM data:
In this report, I’m using a built-in slicer (top left) visual to filter source computers and the timeline visual (bottom) to filter time windows. The main section (top right) consists of a Force-Directed Graph visual, which is used to draw the network topology diagram.
I can choose one or more source computers from the slicer, and choose a time window from the timeline visual located at the bottom.
On the network topology (Force-Directed Graph visual), the arrow represents the direction of the traffic, thickness represents the median network latency (thicker = higher latency), and the link colour represents the network loss health state determined by the OMS NPM solution (LossHealthState).
I will now explain the steps I’ve taken to create this Power BI report:
01. Create a blank report based on the OMS NPM dataset (that you’ve created from the OMS portal earlier).
02. Create a Page Level Filter based on the SubType Field, and only select “NetworkPath”.
03. Add the Slicer visual to the top left and configure it as shown below:
04. Add the Force-Directed Graph (ForceGraph) to the main section of the report (top right), and configure it as shown below:
- Source – SourceNetworkNodeInterface
- Target – DestinationNetworkNodeInterface
- Weight – Average of MedianLatency
- Link Type – LossHealthState
- Data labels – On
- Arrow – On
- Label – On
- Color – By Link Type
- Thickness – On
- Max name length – 15
- Size – change to a value that suits you the best
05. Add a timeline visual to the bottom of the report, then drag the TimeGenerated Field from the dataset to the Time field:
As you can see, as long as you understand what each field means in the OMS data type that you are interested in, and you are using appropriate visuals, it's really easy to create cool Power BI reports. This is all I have to share today. Until next time, have fun with OMS and Power BI!
A few days ago, the OMS product team announced the OMS Agent Heartbeat capability. If you haven't read about it, you can find the post here: https://blogs.technet.microsoft.com/msoms/2016/08/16/view-your-agent-health-in-oms-2/. In this post, Nini, the PM for the agent heartbeat feature, explained how to create custom views within the OMS portal to visualise the agent heartbeat data. Funnily enough, I also started working on something similar around the same time, but instead of creating visual presentations within OMS, I did it in Power BI. I managed to create a couple of Power BI reports for the OMS agent heartbeat, using both native and custom Power BI visuals:
01. Agent Locations Map Report:
Since the agent heartbeat data contains the geo location of the agent IP address, I've created this report to map the physical location of the agent on an interactive map.
02. Agent Statistics Report:
This report contains the following parts:
- A heat map based on the country where the agent is located (Agent Location by Country). The colour highlighting the country changes based on the agent count.
- An interactive “fish tank” visual. In this visual, each fish represents an OMS agent, and the size of the fish represents the number of heartbeats generated by the agent. So, the older the agent (fish) is, the more heartbeats will have been generated to the OMS workspace (fish tank), and the bigger the fish will become.
- A brick chart (containing 100 tiles) shows the percentage of agents by OS type (Linux vs Windows).
- A tornado chart shows agent distribution by country, with agent OS types separated into different colours.
- A Pie Chart shows agent distribution by management groups (SCOM attached vs direct attached vs Linux agents)
- Agent version Donut chart that separates agent counts by agent version numbers (both Windows agents and Linux agents).
The fish visual is called “Enlighten Aquarium”, as you can see below, it’s an animated visual.
In this blog post, I will walk through the steps of creating these reports.
Before we create these reports, you need to take care of the following:
01. Power BI account
You will need to have a Power BI account (either a free or pro account) so OMS can inject data into your Power BI workspace.
02. Power BI preview feature is enabled in OMS
At the time of writing this post, the Power BI integration feature in OMS is still under public preview. Therefore if you haven’t done so, you will need to manually enable this feature first. To do so, go to the “Preview Features” tab in the OMS settings page, and enable “Power BI Integration”:
03. Connect your OMS workspace to your Power BI workspace.
Once the Power BI Integration feature is enabled, you need to connect OMS to Power BI. This is achieved by providing the Power BI account credential in the “Accounts” tab of the OMS settings page:
04. Setting up Power BI injection schedules
We need to inject the OMS agent heartbeat data to Power BI. We can just use a simple query: “Type=Heartbeat”, and set the schedule to run every 15 minutes:
05. Wait 15 – 30 minutes
You will have to wait a while before you can see the data in Power BI.
06. Download Power BI custom visuals
Since these reports use a number of custom Power BI visuals, you will need to download them to your local computer first, and then import them into the reports when you start creating them. To download custom visuals, go to the Power BI Visuals Gallery (https://app.powerbi.com/visuals/) and download the following visuals:
- Brick Chart
- Hierarchy Slicer
- Donut Chart GMO
- Tornado Chart
- Enlighten Aquarium
To start creating the report, first log on to Power BI using the account you've used to make the connection in OMS, and then find the dataset you have specified. In this post, I've created a dataset called “OMS – Agent Heartbeat”. By clicking on the dataset, you will be presented with an empty report:
You will then need to import the custom visuals – by clicking on the “…” icon under Visualizations, and select “Import a custom visual”
You can only import one at a time, so please repeat this process and import all the custom visuals I have listed above.
Creating Agent Location Report
For the Agent location report, we will add 3 visuals:
- Hierarchy Slicer – for filtering IP addresses and computer names
- Map – for pinpointing the agent location
- Timeline – for filtering the time windows
Agent Filter (Hierarchy Slicer)
Add a Hierarchy Slicer and place it on the left side of the page, then drag the ComputerIP and Computer fields to the “Fields” section. Please make sure you place ComputerIP on top of Computer:
It's also a good idea to turn off single selection for the Hierarchy Slicer so you can select multiple items:
Agent Location Map
Add the Map visual to the report, configure it as listed below:
- Location – RemoteIPCountry
- Legend – Computer
- Latitude – Average of RemoteIPLatitude
- Longitude – Average of RemoteIPLongitude
- Size – Count of Computer (Distinct)
Note: since the latitude and longitude shouldn't change between different records for the same computer as long as the IP doesn't change, it doesn't matter if you use average, maximum or minimum – the result of each calculation should be the same.
Time Slicer (Timeline)
Add a timeline slicer to the bottom of the report page, configure it to use the TimeGenerated field:
To save some space on the report page, you may also turn off the labels for the Timeline slicer:
Lastly, add a text box at the top of the report page and give it a title. If you want to, you can also assign each visual a title by highlighting the visual, clicking on the Format icon, and updating the Title field:
To use this report, you can make your selections in the hierarchy slicer and the timeline slicer. The map will be automatically updated.
Create Agent Statistics Report
For the second report, you can create a new page in the existing report, or create a brand new report based on the same dataset. We will use the following visuals in this report:
- Filled Map
- Enlighten Aquarium
- Brick Chart
- Tornado Chart
- Pie Chart
- Donut Chart GMO
Agent Location By Country (Filled Map)
Configure the Filled Map visual as shown below:
OMS Agent By Heartbeat Count (Aquarium)
Configure the Aquarium visual as shown below:
Agent OS Type (Brick Chart)
Configure the Brick Chart visual as shown below:
Agent Distribution By Country (Tornado Chart)
Configure the Tornado Chart as shown below:
Agent Distribution By Management Groups (Pie Chart)
Configure the Pie Chart as shown below:
Agent Version (DonutChartGMO)
Configure the DonutChartGMO visual as shown below:
and change the Primary Measure under Legend to “Value” / “Percentage” / “Both”, whichever you prefer:
Most of the visuals used in this report are interactive, i.e. if I click on a section in the Agent Version DonutChartGMO visual, the other visuals will be updated automatically to reflect the selection I made in the DonutChartGMO visual.
Once you’ve configured all the visuals, please make sure you save your report.
There are many things you can do with the Power BI reports you've just created, e.g. you can share them with other people, pin individual visuals or the entire report to a dashboard, or create an IFrame link and embed the report in 3rd party systems that support IFrames (e.g. SharePoint sites). We are not going to get into the details of how to consume these reports today.
Please note that during my testing, the RemoteIPLatitude and RemoteIPLongitude data from the heartbeat events were not very accurate for the computers in my lab. I'm based in Melbourne, Australia, but the map coordinates pointed to a location in Sydney, which is over 1000km away from me.
Please also be aware that for SCOM-attached agents, each time the agent sends a heartbeat, it sends 2 heartbeats via different channels. This behaviour is by design – my good friend and fellow CDM MVP Stanislav Zhelyazkov (@StanZhelyazkov) has explained this in his blog post: https://cloudadministrator.wordpress.com/2016/08/17/double-heartbeat-events-in-oms-log-analytics/
This is all I have to share for today. Until next time, have fun with OMS and Power BI!
When working on a dashboard or a portal, sometimes it is good for the portal to be more interactive. I often find that more useful than just a static widget. Since I come from a monitoring background, I'll use performance data as an example.
In the good old SCOM, we have this awesome 3rd party web portal called Squared Up, which allows you to choose the time frame for the perf graph:
and you can also select the time frame by highlighting a section from the graph itself:
In OMS, when we are playing with the Near Real-Time (NRT) Performance data (Type=Perf), we also have the options to specify the time frame of our choice:
Additionally, if we have chosen a time scope that is 6 hours or less, we are able to see the raw NRT perf data coming in every few seconds (in light blue colour):
Both Squared Up (for SCOM) and the OMS portal provide very interactive ways to consume the perf data.
As we all know, OMS has the ability to send collected data to Power BI, therefore we are also able to create Power BI reports that contain performance data injected by OMS, for example:
As you can see, with the Power BI Line Chart visual, we can even add a trend line (the black dotted line), which is very nice in my opinion. However, when using native visuals, there are a few limitations with displaying performance data in Power BI:
- The time frame cannot be easily scoped
- The computer and performance counters cannot be easily scoped
What I mean is, you can absolutely create filters at the visual level, the page level or even the report level to create the desired scopes – just like what I did in the example above:
But these filters are rather static. You won't be able to alter them once you've saved the report. Obviously, as the report creator, you don't really want to create multiple almost identical visuals for different counters on different computers. In my opinion, reports like these become less interactive and user friendly because they are too static.
So, how do we make these Power BI reports more interactive? There are a few options:
1. Use a Slicer to filter the computers OR the counters
In Power BI, you can add a slicer to your page. Slicers make the report more interactive: users can choose one or more items from the slicer and the other visuals on the page will be updated automatically based on the selection.
In the above example, I've used a page level filter to only display the ‘Available MBytes' counter, and users can use the slicer to choose the computers they are interested in.
This solution is easy to implement, and it may satisfy your requirements if you are only interested in a specific counter from a long-term trend point of view – since we are not filtering the time window, it will display the entire period that is available in Power BI.
2. Use the Custom Visual ‘Hierarchy Slicer’ to filter the computers AND the counters
For Power BI, you can download custom visuals from https://app.powerbi.com/visuals/?WT.mc_id=Blog_CustomVisuals and then import into your reports.
One of the custom visuals you can download is called Hierarchy Slicer:
As the name suggests, compared to the original built-in slicer, this visual allows you to build a hierarchy for your slicers:
As you can see, I've added the computer name as the top level of the hierarchy slicer, followed by the counter name as the second level. As a result, I don't have to use the filters for this page. Users can simply click on a counter (2nd level) to view the graph for that counter on that specific computer, or select a computer (1st level) to see all the perf data for that particular computer. Obviously, you can make the counter name the top of the hierarchy and place the computer name at the second level if that suits your needs better.
Note: As per the introduction video for this visual, you can enable multi-select by configuring the visual and turning off the ‘Single Select' option:
However, based on my experience, this option is only available when you are using Power BI Desktop. It is not available in Power BI Online.
Therefore we won’t be able to use multi-select for the OMS injected data because we cannot use Power BI Desktop with OMS data.
3. Use the Brush Chart custom visual to scope the time frame
Another cool custom visual is called Brush Chart, it is also called ‘Advanced Time Slicer’ on the download page:
I am using this together with the hierarchy slicer, so I can scope both computers and counters, as well as the perf data time window.
As you can see, there are 2 graphs on this visual. I can use mouse (or other pointing devices) to select a time window from the bottom graph, and the top graph will be automatically zoomed into the selected time period.
4. Use the Time Brush Custom Visual to scope the time frame
The Time Brush custom visual is very similar to the Brush Chart (aka Advanced Time Slicer).
It cannot be used by itself; it acts as a control for other visuals. In the example below, I'm using it together with the Line Chart visual, as well as the hierarchy slicer:
As you can see, when I select a period from the Time Brush visual, the line chart gets updated automatically.
5. Use other custom visuals
There are a lot of other custom visuals that you can download. For example, there's another time slicer called Timeline that allows you to specify a precise, specific time frame.
By using the combination of various slicers, we can produce more interactive and user friendly reports in Power BI. In the examples listed above, I can quickly produce a single report for ALL the OMS performance data, and users can simply choose the computer, counter and the time frame from the report itself. There is no need to create separate reports for different counters or computers.
I hope you find these tips useful, and have fun with OMS and Power BI!
A few days ago, my friend and CDM MVP Cameron Fuller published a great article on how to build Power BI dashboards for OpsMgr. You can check out Cameron's post here: http://blogs.catapultsystems.com/cfuller/archive/2015/12/01/using-power-bi-for-disk-space-dashboards-and-reports-in-operations-manager/
The solution Cameron produced was based on Power BI Desktop and the OpsMgr Data Warehouse DB, both of which are located in your on-premises network. After Cameron showed us what he had produced, I spent some time and managed to extend the reports and dashboards that Cameron created in Power BI Desktop to Power BI sites, a cloud-based PaaS solution offered as part of Office 365.
In this post, I will go through the process of setting up this solution so you can move Cameron’s Power BI dashboard to the cloud.
In order to create a dataset in your cloud-based Power BI site that is based on the on-prem OpsMgr Data Warehouse DB, we will need to install a component called the Power BI Enterprise Gateway (currently in preview) on a server in your on-prem data centre. As shown in the diagram below, once the dataset is created for the OpsMgr Data Warehouse DB, the Power BI Site will query the OpsMgr DW DB through the Power BI Enterprise Gateway.
The solution we are going to implement requires the following pre-requisites:
- An on-prem server for hosting the Power BI Enterprise Gateway
- Power BI Desktop installed on an on-prem computer (i.e. your own PC)
- A Power BI Pro account
- A service account that has access to the OpsMgr data warehouse DB
Power BI Enterprise Gateway is a feature only available for Power BI Pro accounts. Please refer to this page for differences between Power BI Free and Pro accounts.
The install process for the Power BI Enterprise Gateway is very straightforward. It is documented here: https://powerbi.microsoft.com/en-us/documentation/powerbi-gateway-enterprise/
Once it is installed and connected to your Power BI account, you will be able to manage the gateway after you’ve logged on to the Power BI Site (https://app.powerbi.com/):
And you can use “Add Data Source” to establish a connection to a database via the gateway:
I have created a data source for my OpsMgr DW DB:
After the data source is created for the OpsMgr DW DB, I then need to create a dataset based on the OpsMgr DW DB. We must create this dataset in Power BI Desktop.
As Cameron already demonstrated in his post, in Power BI Desktop, we create a new document, and select SQL Server Database:
We then specify the SQL server name and database name for the OpsMgr DW DB:
Please make sure the server name and database name entered here are IDENTICAL to what you have entered for the OpsMgr DW DB data source you created under the Power BI Enterprise Gateway.
Then, we will select all the tables and views that we are interested in. I won't repeat what Cameron has already demonstrated; please refer to his post for more details on which tables and views to select.
Please make sure you have selected ALL the tables and views that you need. Once we have uploaded this configuration to the Power BI Site, we won't be able to modify this dataset in the Power BI Site.
Once we have selected all required tables and views, click “Load”:
Please make sure you choose “DirectQuery” when prompted:
Power BI will then create the connections to each table and view you have selected. Depending on the number of tables and views you have selected, this process may take a while:
You can now see all the tables and views you have selected in Power BI Desktop:
Now, we can save this empty Power BI report:
After it is saved, we can go back to the Power BI Site, and import the newly created report by clicking “Get Data”:
Then choose “Files”:
Select Local File, and choose the empty report we've just created in Power BI Desktop.
Then, you should be able to see the new dataset appear under “Datasets”:
And we can create a new report in the Power BI Site, the same way as Cameron demonstrated in his post:
In my lab, I created the dataset and imported the Power BI report file to the Power BI Site a few days ago. From the above screenshot, you can see the performance data collected today (as highlighted in red). This means the Power BI Site is directly querying my on-prem OpsMgr DW DB in real time via the Power BI Enterprise Gateway. And when I checked my usage, the OpsMgr DW DB dataset had only consumed 1MB of data:
In this post, I have demonstrated how to extend the Power BI reports and dashboards you have built in Power BI Desktop in your on-prem environment to the cloud-based version – Power BI Site – via the newly released Power BI Enterprise Gateway (still in preview at the time of writing).
Although we must have a Power BI Pro account in order to leverage the Power BI gateways, since we are using the direct query method when connecting to the OpsMgr DW database, this solution should not consume too much of the monthly data allowance for the Pro user.
The performance of the report is really fast. My OpsMgr management group and the Power BI Enterprise Gateway server are located in my home lab, which is connected to the Internet only via an ADSL 2+ connection.
This is certainly a cheaper alternative OpsMgr dashboarding solution (compared with other commercial products). Additionally, since it is hosted in the cloud, it is much easier to access the reports and dashboards no matter where you are. Power BI also provides mobile apps for all 3 platforms (Windows, iOS and Android), which you can use to access the data on your preferred mobile devices. You can find more information about the mobile apps here: https://powerbi.microsoft.com/en-us/mobile
You also have the ability to share the dashboards you have created with other people within your organisation.