Author Archives: Tao Yang

SMA Runbook: Update A SharePoint 2013 List Item

Written by Tao Yang

Background

This blog hasn’t been very active lately. I’ve been spending a lot of time learning the newest member of the System Center family: Service Management Automation (SMA).

Yesterday, I needed an SMA runbook to update SharePoint 2013 list items. I found a sample in a blog post by Christian Booth, which contains an SMA runbook written by Ryan Andorfer, a System Center Cloud and Datacenter Management MVP. It looks like Ryan’s code was written for SharePoint 2010 and does not work against SharePoint 2013, because the SharePoint REST API has been updated. So I spent some time learning a bit more about SharePoint 2013’s REST API and developed a new runbook for SharePoint 2013 based on Ryan’s code.

PowerShell Code

Here’s the finished work:

Unlike Ryan’s code, which also monitors the SP list, my runbook ONLY updates a specific list item.

Pre-Requisite and Parameters

Prior to using this runbook, you will need to save a credential in SMA that has access to the SharePoint site:

[Screenshot: the credential asset saved in SMA]

The runbook is expecting the following parameters:

  • SharePointSiteURL: the URL of the SharePoint site, e.g. http://SharepointServer/Sites/DemoSite
  • SavedCredentialName: the name of the saved credential used to connect to the SharePoint site
  • ListName: the name of the list, e.g. “Test List”
  • ListItemID: the ID of the list item that the runbook is going to update
  • PropertyName: the field / property of the item that is going to be updated
  • PropertyValue: the new value that is going to be set on that property

Note: the list item ID is the reference number of the item within the list. If you hover the mouse cursor over the item, you will find the list item ID in the URL.

[Screenshot: the list item URL showing the list item ID]
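
The embedded runbook script does not survive in this archive, so here is a minimal sketch of the approach rather than the original runbook: an SMA PowerShell workflow that retrieves the saved credential and uses the SharePoint 2013 REST API to MERGE a single field on the item. The workflow name is my own, and I'm assuming the site accepts Windows authentication.

workflow Update-SPListItem
{
    param(
        [string]$SharePointSiteURL,
        [string]$SavedCredentialName,
        [string]$ListName,
        [int]$ListItemID,
        [string]$PropertyName,
        [string]$PropertyValue
    )

    # Retrieve the credential previously saved in SMA
    $SPCred = Get-AutomationPSCredential -Name $SavedCredentialName

    InlineScript
    {
        $SiteURL  = $Using:SharePointSiteURL
        $List     = $Using:ListName
        $ItemID   = $Using:ListItemID
        $Property = $Using:PropertyName
        $Value    = $Using:PropertyValue
        $Cred     = $Using:SPCred

        $Headers = @{ "Accept" = "application/json;odata=verbose" }

        # The list item entity type is required in the body of the update request
        $ListInfo = Invoke-RestMethod -Uri "$SiteURL/_api/web/lists/getbytitle('$List')" -Headers $Headers -Credential $Cred
        $ItemType = $ListInfo.d.ListItemEntityTypeFullName

        # A form digest is required for any write operation against the REST API
        $Context = Invoke-RestMethod -Uri "$SiteURL/_api/contextinfo" -Method Post -Headers $Headers -Credential $Cred
        $Digest  = $Context.d.GetContextWebInformation.FormDigestValue

        # Update a single field on the specified list item using a MERGE request
        $Body = "{""__metadata"":{""type"":""$ItemType""},""$Property"":""$Value""}"
        $UpdateHeaders = @{
            "Accept"          = "application/json;odata=verbose"
            "X-RequestDigest" = $Digest
            "IF-MATCH"        = "*"
            "X-HTTP-Method"   = "MERGE"
        }
        Invoke-RestMethod -Uri "$SiteURL/_api/web/lists/getbytitle('$List')/items($ItemID)" -Method Post -Headers $UpdateHeaders -ContentType "application/json;odata=verbose" -Body $Body -Credential $Cred
    }
}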

Putting It to the Test

To test, I created a new list as shown in the above screenshot, then kicked off the runbook with the following parameters:

[Screenshots: starting the runbook with the test parameter values]

Here’s the result:

[Screenshots: the updated list item]

Using It Together With the Orchestrator SharePoint IP

Since this SMA runbook requires the List Item ID to locate the specific list item, when you design your solution, you will need to find a way to retrieve this parameter prior to calling this runbook.

If you are also using SC Orchestrator and have deployed the SharePoint IP, you can use the “Monitor List Items” activity, and the List Item ID is published by this activity:

[Screenshot: published data from the “Monitor List Items” activity]
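
For instance, a “Run .Net Script” activity in Orchestrator could hand the published List Item ID to the SMA runbook via the SMA PowerShell cmdlets. This is only a hedged sketch: the endpoint URL, runbook name and field values are placeholders, and “{List Item ID from Monitor List Items}” stands for the published data you would subscribe to from the data bus.

# Requires the SMA PowerShell module on the Runbook server
Start-SmaRunbook -WebServiceEndpoint "https://sma-server" -Name "Update-SPListItem" -Parameters @{
    SharePointSiteURL   = "http://SharepointServer/Sites/DemoSite"
    SavedCredentialName = "SharePointCred"
    ListName            = "Test List"
    ListItemID          = "{List Item ID from Monitor List Items}"
    PropertyName        = "Status"
    PropertyValue       = "Completed"
}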

Conclusion

Although I’m still a newbie when it comes to SMA, it has got me really excited. Back when I was designing Orchestrator runbooks, I often ended up writing the entire solution in PowerShell and then chopping my scripts up into many “Run .Net Script” activities. I thought, wouldn’t it be nice if there were an automation engine that only uses PowerShell? Well, it looks like SMA is exactly that. I wish I had started using it sooner.

If you are like me and want to learn more about this product, I highly recommend reading the Service Management Automation Whitepaper (currently version 1.0.4) from my fellow SCCDM MVP Michael Rueefli. I have read it page by page like a bible!

OpsMgr Dashboard Fun: Server Details Using SquaredUp

Written by Tao Yang

After my previous post on how to create a performance view using SquaredUp, the founder of SquaredUp, Richard Benwell, told me that I can also use the “&embed=true” parameter in the URL to get rid of the headers. I also managed to create another widget to display server details. Combined with the performance view, I created a dashboard like this:

[Screenshot: dashboard combining the performance view and the server details page]

The bottom left is the improved version of the performance view (using the embed parameter), and the right pane is the server details page:

[Screenshot: the server details page]

This server detail view contains the following information:

  • Alerts associated to the computer
  • Health states of the Distributed Apps that this computer is a part of.
  • Health states of its hosted components (roughly equivalent to Health Explorer)
  • Discovered properties of this computer

Combined with the performance view, it gives a good overview of the current state of the computer from different angles.

Here’s the script for this server detail view:

And here’s the script for the improved performance view (with “&embed=true” parameter):

I’d also like to clarify that my examples simply provide alternative ways to utilise SquaredUp and display useful information on a single pane of glass (dashboards). I don’t want to give readers of this article the impression that SquaredUp relies on native OpsMgr consoles and dashboards. In my opinion and experience, SquaredUp is a perfect replacement for the built-in OpsMgr web console.

OpsMgr Dashboard Fun: Performance Widget Using SquaredUp

Written by Tao Yang

I’m a big fan of the SquaredUp dashboard. I implemented it for my current “day-time” employer, Coles, over a year ago on their OpsMgr 2007 environments, and we have also included SquaredUp in the newly built 2012 R2 management groups. In my opinion, it is more flexible than the native web console because it uses HTML 5 rather than Silverlight, so it runs on any browser as well as on mobile devices.

One of my favourite features is that SquaredUp can read data directly from the OpsMgr Data Warehouse DB. Traditionally, OpsMgr operators would have to run or schedule reports in order to access aged performance data. In my experience, 9 times out of 10 this is a waste of my time; people don’t even open those reports when they arrive in their inboxes. With SquaredUp, you can access the performance data for any given period as long as it’s within the retention period, so I can direct users to access the data from SquaredUp whenever they want, without involving me.

I had some spare time today, so I installed the latest version in my home lab, and in less than 10 minutes I managed to create a dashboard using the PowerShell Web Browser widget:

[Screenshot: dashboard with a state widget and a PowerShell Web Browser widget]

This dashboard contains 2 widgets. The left one is a state widget targeting the Windows Server class. The widget on the right is a PowerShell Web Browser widget, which has been available since OpsMgr 2012 SP1 UR6 and R2 UR2.

The script behind this widget is very simple. When you access the performance data of a server in SquaredUp, the monitoring object ID and the timeframe are passed as variables in the URL, so all the script does is supply these 2 variables. In this sample I used the default timeframe of the last 12 hours; you can specify other values if you like.

[Screenshot: the performance view URL containing the object ID and timeframe]

And here’s the script:
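
The embedded script isn’t reproduced here, so below is a minimal sketch of the idea rather than the original: a PowerShell Web Browser widget script that builds a SquaredUp URL from the selected object’s Id and a timeframe. The SquaredUp server name and URL path are placeholders; copy the real performance drilldown URL from your own installation and substitute the two variables into it.

Param($globalSelectedItems)

foreach ($globalSelectedItem in $globalSelectedItems)
{
    # Id of the object selected in the state widget on the left
    $MonitoringObjectID = $globalSelectedItem["Id"]

    # Build the request object consumed by the PowerShell Web Browser widget
    $dataObject = $ScriptContext.CreateInstance("xsd://Microsoft.SystemCenter.Visualization.Component.Library!Microsoft.SystemCenter.Visualization.Component.Library.WebBrowser.Schema/Request")

    # Placeholder URL - the object Id and timeframe are passed as part of the URL
    $dataObject["BaseUrl"] = "http://squaredupserver/SquaredUp/drilldown/performance?id=$MonitoringObjectID&timeframe=12hours"

    $ScriptContext.ReturnCollection.Add($dataObject)
}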

Additionally, in order to make SquaredUp work in this dashboard, I had to configure the Data Warehouse DB connection and enable Single Sign-On according to the instructions below:


If you haven’t played with SquaredUp yet, please take a look at their website: www.squaredup.com. There’s an online demo you can access too.

Sparq Consulting

Written by Tao Yang

[Image: Sparq Consulting logo]

If you are actively involved in the System Center community, you may have heard of and subscribed to the Inside Podcast Network (http://ipn.tv). I have known the host of IPN, my fellow System Center Cloud and Datacenter Management MVP Dan Kregor, for many years now (7 to be precise).

Dan and I previously worked in the same team before he left Australia for the UK back in 2008. Since Dan moved back to Melbourne 2 years ago, we’ve been thinking about working together again. Now, after spending the last 18 months on a project that designed and implemented one of the largest System Center 2012 infrastructures in the country / region, I finally had time to sit down and plan the future of my professional career.

After some thorough consideration and conversations with Dan, we have decided to partner up and start our own consulting firm. We’ve named our new firm Sparq Consulting (http://sparqconsulting.com.au). We offer a range of services around Microsoft System Center technologies, such as professional / consulting services, training and management pack development.

If you are a regular visitor to my blog, hopefully you have a rough idea of my capabilities. Dan and I have very similar skillsets. He has worked for several very well-known consulting firms in the past; I won’t mention the names and details, but if you are interested, please look him up on LinkedIn.

We would love to hear about any potential opportunities your organisation may have, whatever they may be. If you think we could be of help, please feel free to contact us. My Sparq Consulting email address is Tao [dot] Yang [At] sparqconsulting.com.au.

Lastly, we have also started a new blog: http://sparqconsulting.com.au/blog/. From now on, I will also cross-post to this new blog.

How to Create a PowerShell Console Profile Baseline for the Entire Environment

Written by Tao Yang

Background

Often when I’m working in my lab, I get frustrated because the code in my PowerShell profiles varies between different computers and user accounts. The profile is also different between the normal PowerShell command console and the PowerShell ISE. I wanted to create a baseline for the PowerShell profiles across all computers and all users, no matter which PowerShell console is being used (normal command console vs PowerShell ISE).

For example, I would like to achieve the following whenever I start any 64-bit PowerShell console, on any computer in my lab, under any user account.

This is what I want the consoles to look like:

[Screenshots: the customised PowerShell console and PowerShell ISE]

Although I could manually copy the code into the profiles for each of my user accounts and enable roaming profiles for these users, I don’t want to take this approach because it’s too manual, and I am not a big fan of roaming profiles.

Instructions

My approach is incredibly simple: all I had to do was create a simple script and deploy it as a normal software package using ConfigMgr. I’ll now go through the steps.

All Users All Hosts Profile

Firstly, there are actually not one (1), but six (6) different PowerShell profiles (I have to admit, I didn’t know this until now). This article from the Scripting Guy explains it very well. Based on that article, I identified that I need to work on the All Users All Hosts profile, because I want the code to run regardless of which user account I’m using, and regardless of whether I’m using the normal command console or the PowerShell ISE.
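
If you want to see the profile paths on your own machine, the $PROFILE automatic variable exposes them (run this in both the normal console and the ISE to see how the host-specific paths differ):

# List all profile paths known to the current host, including the AllUsersAllHosts one
$PROFILE | Format-List * -Force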

Pre-Requisite

As I mentioned previously, because I want to use the PSConsole module I developed earlier, I need to make sure this module is deployed to all computers in my lab. To do so, I created a simple MSI that copies the module to the PowerShell modules folder and deployed it to all the computers using ConfigMgr. I won’t go through how I created the MSI here.

Code Inside the All Users All Hosts profile

The All Users All Hosts profile is located at $PsHome\profile.ps1

[Screenshot: profile.ps1 in the $PsHome folder]

Here’s the code I’ve added to this profile:

# Import the custom PSConsole module if it is installed on this computer
if (Get-Module -Name PSConsole -ListAvailable)
{
    Import-Module PSConsole
}

# Console colours and window title
$host.UI.RawUI.BackgroundColor = "Black"
$host.UI.RawUI.ForegroundColor = "Green"
$host.UI.RawUI.WindowTitle = $host.UI.RawUI.WindowTitle + "  - Tao Yang Test Lab"

# Only the ISE exposes $psISE; otherwise resize the normal console window
If ($psISE)
{
    $psISE.Options.ConsolePaneBackgroundColor = "Black"
} else {
    # Resize-Console comes from the PSConsole module
    Resize-Console -max -ErrorAction SilentlyContinue
}
Set-Location C:\
Clear-Host

Note: the $psISE variable only exists in the PowerShell ISE environment, therefore I’m using it to identify which console I’m currently in, and using an If… Else statement to control what gets executed in the PowerShell ISE versus the normal PowerShell console.

Script to Create the All Users All Hosts Profile

Next, I have created a PowerShell script to create the All Users All Hosts profile:

#=====================================================================
# Script Name:        CreateAllUsersAllHostsProfile.ps1
# DATE:               03/08/2014
# Version:            1.0
# COMMENT:            - Script to create All users All hosts PS profile
#=====================================================================

$ProfilePath = $profile.AllUsersAllHosts

#Create the profile if it doesn't exist
If (!(Test-Path $ProfilePath))
{
    New-Item -Path $ProfilePath -ItemType file -Force
}

#Content of the profile script
$ProfileContent = @"
if (Get-Module -Name PSConsole -ListAvailable)
{
    Import-Module PSConsole
}

`$host.UI.RawUI.BackgroundColor = "Black"
`$host.UI.RawUI.ForegroundColor = "Green"
`$host.UI.RawUI.WindowTitle = `$host.UI.RawUI.WindowTitle + "  - Tao Yang Test Lab"
If (`$psISE)
{
    `$psISE.Options.ConsolePaneBackgroundColor = "Black"
} else {
    Resize-Console -max -ErrorAction SilentlyContinue
}
Set-Location C:\
Clear-Host
"@

#Write the contents to the profile
if (Test-Path $ProfilePath)
{
    Set-Content -Path $ProfilePath -Value $ProfileContent -Force
} else {
    Write-Error "All Users All Hosts PS Profile does not exist and this script failed to create it."
}

As you can see, I have stored the content in a multi-line (here-string) variable. The only thing to pay attention to is that I have to add the PowerShell escape character, the backtick (`), in front of each variable’s dollar sign ($) so the variables are written to the profile literally instead of being expanded by this script.
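
As a quick illustration of what the backtick does inside a double-quoted string or here-string:

$name = "World"
"Hello $name"    # the variable is expanded:            Hello World
"Hello `$name"   # the backtick escapes the dollar sign: Hello $name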

This script will overwrite the profile if it already exists, which ensures the profile stays consistent across all computers.

Deploy the Profile Creation Script Using ConfigMgr

In ConfigMgr, I created a package with one program for this script:

[Screenshot: the ConfigMgr package and program]

Command Line: %windir%\Sysnative\WindowsPowerShell\v1.0\Powershell.exe .\CreateAllUsersAllHostsProfile.ps1

Note: I’m using ConfigMgr 2012 R2 in my lab. Even on a 64-bit OS, the ConfigMgr client executes this command in a 32-bit environment, so I have to use “Sysnative” instead of “System32” to overcome the file system redirection that 64-bit Windows applies to 32-bit processes.
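
To see the redirection in action, you can compare the two paths from a 32-bit PowerShell session on a 64-bit machine (this is just a quick demonstration, not part of the deployment):

# From a 32-bit process on 64-bit Windows, "System32" is silently redirected to SysWOW64,
# so the first command starts a 32-bit PowerShell; the virtual "Sysnative" folder bypasses
# the redirection and starts the native 64-bit PowerShell.
& "$env:windir\System32\WindowsPowerShell\v1.0\powershell.exe" -Command "[Environment]::Is64BitProcess"   # False
& "$env:windir\Sysnative\WindowsPowerShell\v1.0\powershell.exe" -Command "[Environment]::Is64BitProcess"  # True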

I created a recurring deployment for this program:

[Screenshot: the recurring deployment schedule]

I’ve set it to run once a day at 8:00 am and to always rerun.

Conclusion

This is an example of how we can standardise the baseline of PowerShell consoles within the environment. Individual users will still be able to add their own user-specific settings in the other profiles.

For example, on one of my computers, I have added one line to the default Current User Current Host profile:

[Screenshot: the one-line Current User Current Host profile]

In the All Users All Hosts profile I set the location to C:\, but in the Current User Current Host profile I set the location to “C:\Scripts\Backup Script”. The result: when I start the console, the location is set to “C:\Scripts\Backup Script”, because the Current User Current Host profile is executed after the All Users All Hosts profile. Therefore we can use the All Users All Hosts profile as a baseline and the Current User Current Host profile as a delta.

This Blog Gets A Major Facelift

Written by Tao Yang

I have kept the same theme on this WordPress blog since day 1. It has been 4 years and I have started getting sick of it, especially that picture of an old iPhone at the top of the page. I finally got around to updating the theme today.

I’ve also changed the site title to a more suitable one: “Tao Yang’s System Center Blog”.

Special thanks to my wife: the background picture was taken by her with her Nikon D90 in Fiji a few years ago.

Bye bye to the old look:

[Screenshot: the old blog theme]

Sometimes I wish there were more of an artistic gene in me. I’m still not 100% satisfied with the look, but this is the best I can do for now.

An Alternative for Surface Pro Docking Stations

Written by Tao Yang

I bought my Surface Pro 2 last November, in the third week after it was released in Australia. I only got it in the third week because I was on holiday in China when it was released, and all the resellers had run out of stock by the time I came back.

I also bought a Type Cover 2 at the same time. I really wanted the Power Cover and the docking station, but they hadn’t been released back then. I thought I’d get the Type Cover for now and get the Power Cover and the docking station when they became available in Australia.

Guess what: I was still waiting when Microsoft announced the Surface Pro 3 release date. I sort of got the idea that they will probably never come to Australia.

For me, the Power Cover is a nice-to-have, but I really want a docking station! Therefore, I had to look elsewhere, and I soon found 2 possible alternatives (USB 3.0 docking stations).

Toshiba Dynadock vs. Targus USB 3.0 Dual Video Dock

Toshiba Dynadock U3.0

[Photo: Toshiba Dynadock U3.0]

Targus USB3.0 SuperSpeed Dual Video Docking Station

[Photo: Targus USB3.0 SuperSpeed Dual Video Docking Station]

Both of them have similar specs. The local retail price for the Toshiba one is around AUD $160 and the Targus is around AUD $180 (currently $1 AUD = $0.94 USD). I decided to go for the Targus one simply because the Toshiba dock is vertical with a stand, so it would be harder to carry around (if I wanted to). The Targus dock seems more portable to me.

So instead of buying it in a retail shop, I found a seller on eBay U.S. who accepts “Best Offer”. After bargaining back and forth a few times, I managed to get a brand new one for USD $85. With international shipping, I paid AUD $118 in the end, a price I’m very happy with!

Targus Dock vs. Surface Dock

Here’s a specs comparison between the Targus dock and the Surface Pro 2 dock:

                          Targus USB3.0 Dual Video Dock      Surface Pro 2 Dock
Video                     1x DVI, 1x HDMI                    1x Mini DisplayPort
USB Ports                 2x USB 3.0, 4x USB 2.0             1x USB 3.0, 3x USB 2.0
NIC                       1x Gigabit NIC                     1x 10/100 NIC
Audio                     1x 3.5mm speaker, 1x 3.5mm mic     1x 3.5mm speaker, 1x 3.5mm mic
Power Supply for Surface  No                                 Yes
Security Lock             Yes                                No

The Targus dock also comes with a DVI-to-VGA adapter and an HDMI-to-DVI adapter to cater for different monitor connections. Based on the comparison above, the Targus dock is definitely more feature-rich. Since I had already bought a spare Surface Pro 2 power supply from eBay, I didn’t mind that I can’t power the Surface with this dock.

More Pictures

Here’s the back view:

[Photo: back view of the Targus dock]

Using it with my Surface Pro 2:

[Photo: the dock in use with my Surface Pro 2]

Physical size comparing with Surface Pro 2:

[Photo: size comparison with the Surface Pro 2]

Drivers

I had no problems with drivers; they were all installed automatically when I connected the dock for the first time.

Cameron Fuller wrote an article on his experience with the Surface 2 RT: Using the Surface 2 RT like a Pro-fessional. In the article, he lists all the hardware accessories he has purchased for the RT device. I’m guessing RT devices will always face compatibility issues because of drivers; I haven’t managed to find an RT device to test this dock with, so I’m not sure whether it supports Windows RT.

Replacement for Other Devices

Down here in Australia, I looked up prices for a USB 3.0 video adapter: it is around AUD $100 (around USD $94). Getting a docking station like this is the equivalent of getting:

  • 2x USB 3 video adapter
  • 1x USB 3 or USB 2 hub
  • 1x GB USB NIC

So it is definitely a cheaper option to get the dock instead, not to mention you end up with only one device on your desk.

So now, even if the Surface docking station became available in the Australian market, I’d still stick with this Targus dock, simply because I can connect 2 external monitors.

The only thing I haven’t tried is PXE boot through the NIC port on this dock. If someone has already tried it, please let me know.

OpsMgr 2012: A Trick to Drive Another Contextual Widget From PowerShell Grid Widget

Written by Tao Yang

The PowerShell Grid Widget and the PowerShell Web Browser Widget were released as part of OpsMgr 2012 SP1 UR6 and R2 UR2. To me, these two widgets have opened a window of opportunity, because by using PowerShell, they allow OpsMgr 2012 users to customise and present data on dashboards exactly the way they want.

Since the release, many people have shared their work. Recently, Microsoft started a new repository for the PowerShell widgets in the TechNet Gallery.

The best article on the PowerShell Grid Widget that I have seen so far is from Oleg Kapustin’s blog: SCOM Powershell Grid Widget for Mere Mortals. In Oleg’s article (and this seems to be a common practice), each item to be listed by the PowerShell Grid Widget is assigned a unique Id (an auto-incremented number):

[Screenshot: auto-incremented Id in the sample script from Oleg’s article]

Today, I want to share a small trick with you, something I only picked up a couple of days ago when I was writing the Location History dashboard for the 3rd part of my Location, Location, Location series. This is what the dashboard looks like:

[Screenshot: the Location History dashboard]

On this dashboard, users are supposed to make their way from section 1 (state widget) to section 2 (PowerShell Grid Widget) and finally to section 3 (PowerShell Web Browser Widget). The PowerShell script in section 2 uses the OpsMgr cmdlets to retrieve particular events generated by the object selected in section 1, then displays the data in this customised list. The script is listed below:

Param($globalSelectedItems)

$i = 1
foreach ($globalSelectedItem in $globalSelectedItems)
{
 $MonitoringObjectID = $globalSelectedItem["Id"]
 $MG = Get-SCOMManagementGroup
 $globalSelectedItemInstance = Get-SCOMClassInstance -Id $MonitoringObjectID
 $Computername = $globalSelectedItemInstance.DisplayName
 $strInstanceCriteria = "FullName='Microsoft.Windows.Computer:$Computername'"
 $InstanceCriteria = New-Object Microsoft.EnterpriseManagement.Monitoring.MonitoringObjectGenericCriteria($strInstanceCriteria)
 $Instance = $MG.GetMonitoringObjects($InstanceCriteria)[0]
 $Events = Get-SCOMEvent -instance $Instance -EventId 10001 -EventSource "LocationMonitoring" | Where-Object {$_.Parameters[1] -eq 4} |Sort-Object TimeAdded -Descending | Select -First 50
 foreach ($Event in $Events)
 {
 $EventID = $Event.Id.Tostring()
 $LocalTime = $Event.Parameters[0]
 $LocationStatus = $Event.Parameters[1]
 $Latitude = $Event.Parameters[2]
 $Longitude = $Event.Parameters[3]
 $Altitude = $Event.Parameters[4]
 $ErrorRadius = $Event.Parameters[5].trimend(".")
 
 $dataObject = $ScriptContext.CreateInstance("xsd://foo!bar/baz")
 $dataObject["Id"]=$EventID
 $dataObject["No"]=$i
 $dataObject["LocalTime"]=$LocalTime
 $dataObject["Latitude"]=$Latitude
 $dataObject["Longitude"]=$Longitude
 $dataObject["Altitude"]=$Altitude
 $dataObject["ErrorRadius (Metres)"]=$ErrorRadius
 $ScriptContext.ReturnCollection.Add($dataObject)
 $i++
 } 
}


Because I need to drive the contextual PowerShell Web Browser widget (section 3) from the PowerShell Grid Widget (section 2), the script used in section 3 needs to locate the exact event selected in section 2. As per Oleg’s article, and based on his experiments, the only property passed between widgets is the “Id” property of the data object. Therefore, instead of using an auto-incremented number as the value of the “Id” property as demonstrated in the screenshot from Oleg’s blog, I assigned the actual event Id as the data object Id, so the script in section 3 can use the event Id to retrieve the data from that particular event.

From Section 2:

[Screenshot: the section 2 script assigning the event Id to the data object]

From Section 3:

[Screenshot: the section 3 script retrieving the event by Id]
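
In text form, the key lines shown in the two screenshots boil down to this: section 2 assigns the real OpsMgr event Id to the data object, and section 3 uses that Id to retrieve the same event (both lines also appear in the full scripts in this post and in Part 3 of the Location series):

# Section 2 (PowerShell Grid Widget): use the actual event Id as the data object Id
$dataObject["Id"] = $Event.Id.ToString()

# Section 3 (PowerShell Web Browser Widget): retrieve the selected event by that Id
$EventID = $globalSelectedItem["Id"]
$Event = Get-SCOMEvent -Id $EventID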

Conclusion

Please keep in mind that the only property (and value) of $globalSelectedItems that travels between contextual widgets is the “Id” property. If you want to drive another contextual widget from the data passed by a PowerShell Grid Widget, make sure you use the actual Id of the OpsMgr object (monitoring object, class, event, alert, etc.) so the next widget can use this Id to retrieve the object from OpsMgr.

New OpsMgr 2012 Dashboards Repository in TechNet Gallery

Written by Tao Yang

With the recent release of OpsMgr 2012 SP1 UR6 and R2 UR2, a number of new dashboard widgets have been made available. The PowerShell Grid Widget and the PowerShell Web Browser Widget are 2 of my favourites.

Microsoft has just created a new repository for the community to share their scripts and dashboards. This repository is located in the TechNet Gallery Script Center. You can access it via this direct link: http://bit.ly/Wy168U, or go to the Script Center and browse to System Center > Operations Manager Dashboards:

[Screenshot: the Operations Manager Dashboards category in the TechNet Gallery]

It looks like the product team has already posted 4 samples on the first day. In the next few days, I will also post a few of mine there. I encourage everyone to keep an eye on this repository from now on, and please do not hesitate to share your work with the community!

Location, Location, Location. Part 3

Written by Tao Yang

This is the 3rd and final part of the 3-part series. In this post, I will demonstrate how I track the physical location history of Windows 8 location-aware computers (tablets and laptops), as well as how to visually present the collected data on an OpsMgr 2012 dashboard.

I often see people post on Facebook or Twitter that they have checked in at <some place> on Foursquare. I haven’t used Foursquare before (and don’t intend to in the future) and I’m not sure what its purpose is, but please think of this as Foursquare in OpsMgr for your tablets. I will now go through the management pack elements I created to achieve this goal.

Event Collection Rule: Collect Location Aware Device Coordinate Rule

Firstly, I need to collect the location data periodically. Therefore, I created an event collection rule targeting the “Location Aware Windows Client Computer” class I created (explained in Part 2 of this series). This rule uses the same data source module as the “Location Aware Device Missing In Action Monitor”, which I also explained in Part 2. I have configured this rule to pass exactly the same data to the data source module as the monitor does, so we can utilise Cook Down (basically, the data source only executes once and feeds the output data to both the rule and the monitor).

[Screenshots: the event collection rule configuration]

Note: although this rule does not require the home latitude and longitude, and these 2 inputs are optional for the data source module, I still pass these 2 values in. In order to use Cook Down, both workflows need to pass exactly the same data to the data source module; otherwise, the same script will run twice in each scheduling cycle.

This rule maps the data collected by the data source module to event data, and stores the data in both the Ops DB and the DW DB. I’ve created an event view in the management pack where you can see the generated events:

[Screenshot: event view showing the collected location events]

Location History Dashboard

Now that the data has been captured and stored in the OpsMgr databases as event data, we can consume it in a dashboard:

[Screenshot: the Location History dashboard]

As shown above, there are 3 widgets in this Location History dashboard:

  • Top Left: State Widget for Location Aware Windows Client Computer class.
  • Bottom Left: Using PowerShell Grid widget to display the last 50 known locations of the selected device from the state widget.
  • Right: Using PowerShell Web Browser widget to display the selected historical location from bottom left PowerShell Grid Widget.

The last 50 known locations for the selected device are listed in the bottom left section. Users can click on the first column (Number) to sort it based on the timestamp. When a previous location is selected, that location gets pinned on the map, so we know exactly where the device was at that point in time. From now on, I need to make sure my wife doesn’t have access to OpsMgr in my lab so she can’t track me down.

Note: the location shown in the above screenshot is my office. I took my Surface to work, powered it on and connected it to a 4G device, and it automatically connected to my lab network using DirectAccess.

[Photo: the Surface in my car]

Since this event was collected over 2 days ago, for demonstration purposes I had to modify the PowerShell Grid Widget to list a lot more than 50 previous locations.

The script below is what’s used in the bottom left PowerShell Grid widget:

Param($globalSelectedItems)

$i = 1
foreach ($globalSelectedItem in $globalSelectedItems)
{
$MonitoringObjectID = $globalSelectedItem["Id"]
$MG = Get-SCOMManagementGroup
$globalSelectedItemInstance = Get-SCOMClassInstance -Id $MonitoringObjectID
$Computername = $globalSelectedItemInstance.DisplayName
$strInstanceCriteria = "FullName='Microsoft.Windows.Computer:$Computername'"
$InstanceCriteria = New-Object Microsoft.EnterpriseManagement.Monitoring.MonitoringObjectGenericCriteria($strInstanceCriteria)
$Instance = $MG.GetMonitoringObjects($InstanceCriteria)[0]
$Events = Get-SCOMEvent -instance $Instance -EventId 10001 -EventSource "LocationMonitoring" | Where-Object {$_.Parameters[1] -eq 4} |Sort-Object TimeAdded -Descending | Select -First 50
foreach ($Event in $Events)
{
$EventID = $Event.Id.Tostring()
$LocalTime = $Event.Parameters[0]
$LocationStatus = $Event.Parameters[1]
$Latitude = $Event.Parameters[2]
$Longitude = $Event.Parameters[3]
$Altitude = $Event.Parameters[4]
$ErrorRadius = $Event.Parameters[5].trimend(".")

$dataObject = $ScriptContext.CreateInstance("xsd://foo!bar/baz")
$dataObject["Id"]=$EventID
$dataObject["No"]=$i
$dataObject["LocalTime"]=$LocalTime
$dataObject["Latitude"]=$Latitude
$dataObject["Longitude"]=$Longitude
$dataObject["Altitude"]=$Altitude
$dataObject["ErrorRadius (Metres)"]=$ErrorRadius
$ScriptContext.ReturnCollection.Add($dataObject)
$i++
}
}


And here’s the script for the PowerShell Web Browser Widget:

Param($globalSelectedItems)

$dataObject = $ScriptContext.CreateInstance("xsd://Microsoft.SystemCenter.Visualization.Component.Library!Microsoft.SystemCenter.Visualization.Component.Library.WebBrowser.Schema/Request")
$dataObject["BaseUrl"]="<a href="http://maps.google.com/maps&quot;">http://maps.google.com/maps"</a>
$parameterCollection = $ScriptContext.CreateCollection("xsd://Microsoft.SystemCenter.Visualization.Component.Library!Microsoft.SystemCenter.Visualization.Component.Library.WebBrowser.Schema/UrlParameter[]")
foreach ($globalSelectedItem in $globalSelectedItems)
{
$EventID = $globalSelectedItem["Id"]
$Event = Get-SCOMEvent -Id $EventID
If ($Event)
{
$bIsEvent = $true
$Latitude = $Event.Parameters[2]
$Longitude = $Event.Parameters[3]

$parameter = $ScriptContext.CreateInstance("xsd://Microsoft.SystemCenter.Visualization.Component.Library!Microsoft.SystemCenter.Visualization.Component.Library.WebBrowser.Schema/UrlParameter")
$parameter["Name"] = "q"
$parameter["Value"] = "loc:" + $Latitude + "+" + $Longitude
$parameterCollection.Add($parameter)
} else {
$bIsEvent = $false
}
}
If ($bIsEvent)
{
$dataObject["Parameters"]= $parameterCollection
$ScriptContext.ReturnCollection.Add($dataObject)
}

Conclusion

This concludes the 3rd and final part of the series. I know it is only a proof of concept, and I’m not sure how practical it would be to implement in a corporate environment. For example, since most current Windows tablets don’t have a built-in GPS receiver, I’m not sure, and haven’t been able to test, how well the Windows Location Provider calculates locations when a device is connected to a corporate Wi-Fi.

I have also noticed what seems to be a known issue with the Windows Location Provider COM object LocationDisp.LatLongReportFactory: it doesn’t always return a valid location report. To work around the issue, I had to code all the scripts to retry and wait between attempts. I managed to get the scripts to work on all my devices; however, you may need to tweak them if you don’t always get valid location reports.
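
For reference, a minimal sketch of that retry logic looks like the snippet below. The retry count and wait interval are my own choices rather than what the MP scripts actually use; status 4 is the value the provider reports when it is running and a report is available (the same value the collection rule filters on).

# Hedged sketch: poll the Windows Location Provider until it returns a usable report
$Factory = New-Object -ComObject LocationDisp.LatLongReportFactory

$Report = $null
$Attempt = 0
$MaxAttempts = 10
do {
    $Attempt++
    if ($Factory.Status -eq 4)
    {
        # Status 4 = the location provider is running; grab the current report
        $Report = $Factory.LatLongReport
    } else {
        Start-Sleep -Seconds 10
    }
} while ($Report -eq $null -and $Attempt -lt $MaxAttempts)

if ($Report -ne $null)
{
    "{0}, {1} (+/- {2} metres)" -f $Report.Latitude, $Report.Longitude, $Report.ErrorRadius
} else {
    Write-Warning "No valid location report was returned after $MaxAttempts attempts."
}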

Credit

Other than the VBScript I mentioned in Part 2, I was lucky enough to find this PowerShell script. I used this script as the starting point for all my scripts.

Also, when I was trying to set up DirectAccess to get my lab ready for this experiment, I got a lot of help from Enterprise Security MVP Richard Hicks’ blog: http://directaccess.richardhicks.com. So thanks to Richard.

Download

You can download the actual monitoring MP and dashboard MP, as well as all the scripts I used in the MP and dashboards HERE.

Note: for the monitoring MP (Location.Aware.Devices.Monitoring), I’ve also included the unsealed version in the zip file for your convenience (so you don’t have to unseal it if you want to look inside). Please do not import the unsealed version into your management group: because the dashboard MP references the monitoring MP, the sealed version has to be used.

Lastly, as always, I’d like to hear from the community. Please feel free to share your thoughts by leaving a comment on the post or contacting me via email. Until next time, happy SCOMming.