Author Archives: Tao Yang

My Experience of Using Silect MP Studio During Management Pack Update Process

Written by Tao Yang

Thanks to Silect’s generosity, last November I was given an NFR (Not For Resale) license for MP Studio to use in my lab. When I received the license, I created a VM and installed it straightaway. However, due to my workload and other commitments, I haven’t been able to spend much time exploring the product. In the meantime, I’ve been collecting all the past and current Microsoft management packs so I can load them into MP Studio to build my repository.

Today, one of my colleagues came to me seeking help on an error logged in the Operations Manager log on all our DPM 2012 R2 servers (where SQL is locally installed):

image

It’s obvious the offending script (GetSQL2012SPNState.vbs) is from the SQL 2012 MP, and we can tell the error is that the FQDN of the computer that WMI is trying to connect to is incorrect. In the pixelated area, the FQDN contains the NetBIOS computer name plus two domain names from the same forest.

I knew the SQL MP in our production environment was two versions behind (currently on version 6.4.1.0), so I wanted to find out whether the latest one (6.5.4.0) has fixed this issue.

Therefore, as I always do, I first went through the change log in the MP guide. The only thing I could find that might be related to SPN monitoring is this line:

SPN monitor now has overridable ‘search scope’ which allows the end user to choose between LDAP and Global Catalog

image

I’m not really sure if the new MP is going to fix the issue, and no, I don’t have time to unseal and read the raw XML code to figure it out because this version of the SQL 2012 monitoring MP has 49,723 lines of code!

At this stage, I thought MP Studio might be able to help (by comparing 2 MPs). So I remoted back to my home lab and quickly loaded all versions of SQL MP that I have into MP Studio.

image

I then chose to compare version 6.5.4.0 (the latest version) with version 6.4.1.0 (the version loaded in my production environment):

image

It took MP Studio a few seconds to generate the comparison result, and I was surprised by how many items had been updated!

image

Unfortunately, there is no search function in the comparison result window, but fortunately I was able to export the result to Excel. The exported spreadsheet contains 655 rows! When I searched for the script name mentioned in the error log (GetSQL2012SPNState.vbs), I found the script had indeed been updated:

image

Because the script is long and gets truncated in the Excel spreadsheet, I had to go back to MP Studio and find this entry (luckily, entries are sorted alphabetically).

Once the change is located, I can copy both the parent value and the child value to the clipboard:

image

I pasted the value into Notepad++. Since it contained some XML headers / footers and both versions of the script, I removed the headers and footers and separated the scripts into two files.

Lastly, I used the Compare plugin for Notepad++ to compare the two scripts, and I found an additional section in the new MP (6.5.4.0) that may be related to the error we are getting (as it has something to do with generating the domain FQDN):

image

After seeing this, I took an educated guess that this could be the fix for our issue and asked my colleague to load MP version 6.5.4.0 into our Test management group to see if it fixes the issue. When we went to load the MP, we found out that I had already loaded it in Test (I’ve been extremely busy lately and forgot I’d done it!). So my colleague checked a couple of DPM servers in our test environment and confirmed the error does not exist there. It seems we have nailed this issue.

Conclusion

Updating management packs has always been a challenging task (for everyone, I believe). In my opinion, we all face challenges because we don’t know EXACTLY what has been changed. This is because:

  • It is impossible to read and compare each MP file (e.g. the SQL 2012 monitoring MP alone has around 50,000 lines of code, plus the 2008 and 2005 MPs, plus the library MP, etc.); they are just too big to read!
  • The MP guide normally only provides a vague description in the change log (if there is a change log at all).
  • Any bugs caused by human error would not be captured in the change logs.
  • Sometimes it is harder to test an MP in a test environment because test environments normally don’t carry the same load as production, which makes some workflows (e.g. performance monitors) harder to test.

And we normally rely on the following sources to make our judgement:

  • The MP guide – only if the changes are captured in the guide, and the descriptions are normally very vague.
  • Social media (tweets and blogs) – but this is only based on the author’s experience; the bug you have encountered may not have been seen in other people’s environments (e.g. the particular error mentioned in this post probably wouldn’t happen in my lab because I only have a single domain in the forest).

Normally, you’d wait for someone else to be the guinea pig, test it out, and let you know if there are any issues before you start updating your environment (e.g. the recent bug in Server OS MP 6.0.7294.0 was first identified by SCCDM MVP Daniele Grandini, and the MP was soon removed from the download site by Microsoft).

In MP Studio, the feature that I wanted to explore the most is the MP compare function. It gives OpsMgr administrators a detailed view of what has changed in an MP, and you (as the OpsMgr admin) can use this information to make better decisions (e.g. whether to upgrade, and whether any additional overrides are required). Based on today’s experience, counting from before I loaded the MPs into the repository, it took me less than 15 minutes to identify that this MP update was well worth trying (in order to fix my production issue).

Lastly, MP Studio provides many other features; I have only spent a little bit of time on it today (and the result is positive). In my opinion, sometimes the best way to describe something is with an example, so I’m sharing today’s experience with you. I hope you’ve found it informative and useful.

P.S. Coming back to the bug in the Server OS MP 6.0.7294 that I mentioned above: I ran a comparison between 6.0.7294 and the previous version, 6.0.7292, and I can see a lot of perf collection rules have changed:

image

and if I export the result to Excel, I can actually see the issue described by Daniele (highlighted in yellow):

image

Oh, one last word before I call it a day, to Silect: would it be possible to provide a search function in the comparison result window (so I don’t have to rely on the Excel export)?

Writing PowerShell Modules That Interact With Various SDK Assemblies

Written by Tao Yang

Over the last few months, there have been a few occasions where I needed to develop PowerShell scripts that leverage SDK DLLs from various products, such as OpsMgr 2012 R2, SCSM 2012 R2 and the SharePoint Client Component SDK.

In order to leverage these SDK DLLs, they obviously must be installed on the computers where the scripts are going to be executed. However, this may not always be possible, for example:

  • Version conflicts (e.g. OpsMgr): The OpsMgr SDK DLLs are installed into the computer’s Global Assembly Cache (GAC) as part of the installation of the Management Server, Operations Console or Web Console. However, you cannot install components from multiple OpsMgr versions on the same computer (e.g. the Operations Consoles from both OpsMgr 2012 and 2007).
  • Not being able to install or copy SDK DLLs (e.g. Azure Automation): If the script is a runbook in Azure Automation, you will not be able to pre-install the SDK assemblies on the runbook servers.


In order to overcome these constraints, I have developed a little trick: develop a simple PowerShell module, place the required DLLs in the PS module folder, and use a function in the module to load the DLLs from the PS module base folder. I’ll now explain how to develop such a PS module, using the custom module I created for the Service Manager 2012 R2 SDK last week as an example. In this example, I named my customised module “SMSDK”.

01. Firstly, create a module folder and then create a new PowerShell module manifest using the “New-ModuleManifest” cmdlet.
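For example (a minimal sketch – the module path, version and description below are placeholders; adjust them to suit your environment):

  # Create the module folder and generate the manifest
  New-Item -Path 'C:\Program Files\WindowsPowerShell\Modules\SMSDK' -ItemType Directory -Force
  New-ModuleManifest -Path 'C:\Program Files\WindowsPowerShell\Modules\SMSDK\SMSDK.psd1' `
      -RootModule 'SMSDK.psm1' -ModuleVersion '1.0.0.0' -Author 'Tao Yang' -Description 'Service Manager SDK module'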

02. Copy the required SDK DLLs into the PowerShell module folder. The module folder will also contain the manifest file (.psd1) and a module script file (.psm1).

image

03. Create a function to load the DLLs. In the “SMSDK” module that I’ve written, the function looks like this:
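Below is a simplified sketch of what the function does (the DLL name, assembly version and public key token shown here are examples only – verify the actual values for your SDK):

  Function Import-SMSDK
  {
      [CmdletBinding()]
      Param ([Parameter(Mandatory=$false)][String]$DLLPath = $PSScriptRoot)

      # DLL names, version and public key token are hardcoded (example values shown)
      $DLLNames = @('Microsoft.EnterpriseManagement.Core')
      $Version = '7.0.5000.0'
      $PublicKeyToken = '31bf3856ad364e35'
      Foreach ($DLLName in $DLLNames)
      {
          $AssemblyFullName = "$DLLName, Version=$Version, Culture=neutral, PublicKeyToken=$PublicKeyToken"
          Try
          {
              # Try the Global Assembly Cache first
              [System.Reflection.Assembly]::Load($AssemblyFullName) | Out-Null
          }
          Catch
          {
              # Fall back to the DLLs shipped in the PS module folder
              [System.Reflection.Assembly]::LoadFrom((Join-Path $DLLPath "$DLLName.dll")) | Out-Null
          }
      }
  }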

As you can see, in this function I have hardcoded the DLL file names, assembly version and public key token. The script first tries to load the assemblies (with the specified names, version and public key token) from the Global Assembly Cache; if the assemblies are not located in the GAC, it loads them from the DLLs located in the PS module folder.

The key to this PS function is that you must first identify the assemblies’ version and public key token. There are two ways you can do this:

  • Using the PowerShell GAC module on a machine where the assemblies have already been loaded into the Global Assembly Cache (in my example, the Service Manager management server):

image

  • Load the assemblies from the DLLs and then get the assembly details from the current app domain:

image
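E.g. something along these lines (a sketch; the DLL path is a placeholder – the full names returned include the version and public key token):

  # Load the DLL from a file, then inspect the loaded assemblies in the current app domain
  [System.Reflection.Assembly]::LoadFrom('C:\Temp\SDK\Microsoft.EnterpriseManagement.Core.dll') | Out-Null
  [AppDomain]::CurrentDomain.GetAssemblies() |
      Where-Object {$_.FullName -like 'Microsoft.EnterpriseManagement*'} |
      Select-Object -ExpandProperty FullName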

Note: although you can load assemblies from the GAC without specifying the version number, in this scenario you MUST specify the version to ensure the correct one is loaded. This happened to me before: I developed a script that uses the OpsMgr SDK, and it worked on all computers but one. It took me a while to find out why – the computer had both the OpsMgr and Service Manager SDKs loaded in the GAC, and the wrong assembly was loaded because I did not specify the version number in the script.

Now, once the Import SDK function is finalised, you may call it from scripts or other module functions. For example, in my “SMSDK” module I’ve also created a function to establish a connection to the Service Manager management group, called Connect-SMManagementGroup. This function calls the Import SDK function (Import-SMSDK) to load the assemblies before connecting to the Service Manager management group:
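A trimmed-down sketch of that function (error handling and credential support omitted; the EnterpriseManagementGroup class is from the Service Manager SDK):

  Function Connect-SMManagementGroup
  {
      [CmdletBinding()]
      Param ([Parameter(Mandatory=$true)][String]$SMServer)

      # Load the SDK assemblies first (from the GAC, or from the module folder)
      Import-SMSDK
      # Then connect to the Service Manager management group
      New-Object Microsoft.EnterpriseManagement.EnterpriseManagementGroup($SMServer)
  }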

For your reference, you can download the sample module (SMSDK) HERE. However, the SDK DLLs are not included in this zip file. For Service Manager, you can find these DLLs on the Service Manager 2012 R2 management server, in the <SCSM Management Server Install Dir>\SDK Binaries folder, and manually copy them to the PS module folder:

image

Lastly, this blog is purely based on my recent experiences. Other than the Service Manager module that I’ve used in this post, I’ve also used this technique in a few of my previous works, e.g. the “SharePointSDK” module and the upcoming “OpsMgrExtended” module that will soon be published (you can check out the previews from HERE and HERE). I’d like to hear your thoughts, so please feel free to email me if you’d like to discuss further.

Updated Management Pack for Windows Server Logical Disk Auto Defragmentation

Written by Tao Yang

image

Background

I have been asked to automate Hyper-V logical disk defragmentation to address a widespread production issue at work. Without a second thought, I went for the famous AutoDefrag MP authored by my friend and SCCDM MVP Cameron Fuller.

Cameron’s MP was released in October 2013, around 1.5 years ago. When I looked into it, I realised that unfortunately it does not meet my requirements.

I had the following issues with Cameron’s MP:

  • The MP schema is based on version 2 (the OpsMgr 2012 MP schema), which prevents it from being used in OpsMgr 2007. This is a show-stopper for me, as I need to use it on both 2007 and 2012 management groups.
  • The monitor reset PowerShell script used in the AutoDefrag MP uses the OpsMgr 2012 PowerShell module, which won’t work with OpsMgr 2007.
  • The AutoDefrag MP was based on Windows Server OS MP version 6.0.7026. In that version, the fragmentation monitors are enabled by default. However, since version 6.0.7230, these fragmentation monitors have been disabled by default. Therefore, the overrides in the AutoDefrag MP that disable these monitors became obsolete, since the monitors are already disabled.

In the end, I decided to rewrite this MP, though it is still based on Cameron’s original logic.

New MP: Windows Server Auto Defragment

I’ve given the MP a new name: Windows Server Auto Defragment (ID: Windows.Server.Auto.Defragment).

The MP includes the following components:

Diagnostic Tasks: Log defragmentation to the Operations Manager Log

There are three identical diagnostic tasks (for the Windows Server 2003, 2008 and 2012 logical disk fragmentation monitors). These tasks write an event to the agent’s Operations Manager log before the defrag recovery task starts.

Group: Drives to Enable Fragmentation Monitoring

This is an empty instance group. Users can place logical disks into this group to enable the “Logical Disk Fragmentation Level” monitors from the Microsoft Windows Server OS MPs.

You may add any instances of the following classes into this group:

  • Windows Server 2003 Logical Disk
  • Windows Server 2008 Logical Disk
  • Windows Server 2012 Logical Disk


Group: Drives to Enable Auto Defrag

This is an empty instance group. Users can place logical disks into this group to enable the diagnostic and recovery tasks for auto defrag.

You may add any instances of the following classes into this group:

  • Windows Server 2003 Logical Disk
  • Windows Server 2008 Logical Disk
  • Windows Server 2012 Logical Disk


Group: Drives to Enable Fragmentation Level Performance Collection

This is an empty instance group. Users can place logical disks into this group to enable the Windows Server Fragmentation Level Performance Collection Rule.

Note: since this performance collection rule targets the “Logical Disk (Server)” class, which is the parent class of the OS-specific logical disk classes, you can simply add any instances of the “Logical Disk (Server)” class into this group.

Event Collection Rule: Collect autodefragmentation event information

This rule collects the event logged by the “Log defragmentation to the Operations Manager Log” diagnostic tasks.

Reset Disk Fragmentation Health Rule

This rule targets the RMS / RMS Emulator. It runs every Monday at 12:00 and resets any unhealthy instances of the disk fragmentation monitors back to healthy (so the monitors’ regular detection and recovery will run again the next weekend).

Auto Defragmentation Event Report

This report lists all auto defragmentation events collected by the event collection rule within a specified time period.

image

Windows Server Fragmentation Level Performance Collection Rule

This rule collects the File Percent Fragmentation counter via WMI for Windows server logical disks. This rule is disabled by default.

If a logical drive has been placed into all three groups mentioned above, you’d probably see a performance graph similar to this:

image

As shown in the figure above, number 1 indicates the monitor has just run and the defrag recovery task was executed, so the drive has been defragmented. Numbers 2, 3 and 4 indicate the fragmentation level slowly building up over the week; hopefully you’ll see a similar pattern at a weekly interval (because the fragmentation level monitor runs once a week by default).

Various views

The MP also contains various views under the “Windows Server Logical Drive Auto Defragment” folder:

image

What’s Changed from the Original AutoDefrag MP?

Compared with Cameron’s original MP, I have made the following changes in the new version:

  • The MP is based on MP schema version 1, which works with OpsMgr 2007 (as well as OpsMgr 2012).
  • Changed the minimum version of all the referencing Windows Server MPs to 6.0.7230.0 (where the fragmentation monitors became disabled by default).
  • Sealed the Windows Server Auto Defragment MP. However, in order to allow users to manually populate the groups, I have placed the group discoveries into an unsealed MP, “Windows Server Auto Defragment Group Population”. By doing so, all MP elements are protected (in the sealed MP), while users can still use the groups defined in the MP to manage auto defrag behaviours.
  • Changed the monitor overrides from disabled to enabled because these monitors are now disabled by default. This means the users will now need to manually INCLUDE the logical disks to be monitored rather than excluding the ones they don’t want.
  • Replaced the Linked Report with a report to list auto defrag events.
  • Added a performance collection rule to collect the File Percent Fragmentation counter via WMI. This rule is also disabled by default; it is enabled (via an override) for a group called “Drives to Enable Fragmentation Level Performance Collection”.
  • Updated the monitor reset script to use the SDK directly. This change is necessary to make it work with both OpsMgr 2007 and 2012. The original script would reset the monitor on every instance; the updated script only resets the monitors on unhealthy instances. Additionally, the monitor reset results are written to the RMS / RMSE’s Operations Manager log.
  • Updated the LogDefragmentation.vbs script for the diagnostic tasks to use MOM.ScriptAPI to log the event to the Operations Manager log instead of the Application log.
  • Updated the message in LogDefragmentation.vbs from “Operations Manager has performed an automated defragmentation on this system” to “Operations Manager will perform an automated defragmentation for <Drive Letter> drive on <Server Name>”. Because this diagnostic task runs at the same time as the recovery task, the defrag is just about to start (not finished yet), so I don’t believe the message should use the past tense.
  • Updated the diagnostic tasks to be disabled by default.
  • Created overrides to enable the diagnostics for the “Drives to Enable Auto Defrag” group (same group where the recovery tasks are enabled).
  • Updated the data source module of the event collection rule to use “Windows!Microsoft.Windows.ScriptGenerated.EventProvider”, looking only for event ID 4 generated by the specific script (LogDefragmentation.vbs). By using this data source module, we can filter by script name, which gives more accurate detection.


How do I configure the management pack?

Cameron suggested I use the five common scenarios from his original post when explaining the different monitoring requirements. In his post, he listed the following five scenarios:

01. We do not want to automate defragmentation, but we want to be alerted to when drives are highly fragmented.

In this case, you will need to place the drives that you want to monitor in the “Drives to Enable Fragmentation Monitoring” group.

02. We want to ignore disk fragmentation levels completely.

In this case, you don’t need to import this management pack at all. Since the fragmentation monitors are now disabled by default, this is the default configuration.

03. We want to auto defragment all drives.

In this case, you will need to place all the drives that you want to auto defrag into 2 groups:

  • Drives to Enable Fragmentation Monitoring
  • Drives to Enable Auto Defrag

04. We want to auto defragment all drives but disable monitoring for fragmentation on specific drives.

Previously, when Cameron released the original version, he needed to build an exclusion logic because the fragmentation monitors were enabled by default. With the recent releases of the Windows Server OS management packs, we need to work with an inclusion logic instead. So, in this case, you will need to add all the drives on which you want to monitor the fragmentation level to the “Drives to Enable Fragmentation Monitoring” group, and put a subset of those drives into the “Drives to Enable Auto Defrag” group.

05. We want to auto defragment all drives but disable automated defragmentation on specific drives.

This case is similar to scenario #3: you will need to place only the drives that you are interested in into these two groups:

  • Drives to Enable Fragmentation Monitoring
  • Drives to Enable Auto Defrag

In addition to these five scenarios, another scenario this MP caters for is:

06. We want to collect drive fragmentation level as performance data

In this case, if you simply want to collect the fragmentation level as perf data (with or without fragmentation monitoring), you will need to add the drives that you are interested in to the “Drives to Enable Fragmentation Level Performance Collection” group.

So, how do I configure these groups?

By default, I have deliberately configured each of these groups with a discovery rule that discovers nothing:

image

As you can see, the default group discoveries look for any logical drives whose device name (drive letter) matches the regular expression ^$, which represents a blank / null value. Since every discovered logical device has a device name, these groups will be empty. You will need to modify the group memberships to suit your needs.

For example, if you want to include the C: drives of all the physical servers, the group membership could be something like this:

image

Note: in SCOM, only Hyper-V VMs are discovered as virtual machines. If you are running other hypervisors, the “virtual machine” property probably won’t work.

MP Download

There are 2 management pack files included in this solution. You can download them HERE.

image

Credit

Thanks to Cameron for sharing the original MP with the community and for providing guidance, review and testing on this version. I’d also like to thank all the other OpsMgr-focused MVP folks who have been involved in this discussion.

Lastly, as always, please feel free to contact me if you have questions / issues with this MP.

Session Recording for My Presentation in Microsoft MVP Community Camp Melbourne Event

Written by Tao Yang

image

Last Friday, I presented at the Melbourne MVP Community Camp day on the topic of “Automating SCOM Tasks Using SMA”.

I have uploaded the session recording to YouTube. You can watch it here: https://www.youtube.com/watch?v=QW99bVFKg80

You can also download the presentation deck from HERE.

And here’s the sample script I used in my presentation when I explained how to connect to a SCOM management group via the SDK:
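In essence, it loads the OpsMgr SDK assemblies and then instantiates a ManagementGroup object. A minimal sketch (the assembly version shown is typical for OpsMgr 2012 and the server name is a placeholder – adjust both for your environment):

  # Load the OpsMgr 2012 SDK assemblies from the GAC
  Add-Type -AssemblyName 'Microsoft.EnterpriseManagement.Core, Version=7.0.5000.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'
  Add-Type -AssemblyName 'Microsoft.EnterpriseManagement.OperationsManager, Version=7.0.5000.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'

  # Connect to the management group via the SDK
  $MG = New-Object Microsoft.EnterpriseManagement.ManagementGroup('MS01.yourdomain.com')
  $MG.Name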

Overall, I think I could have done better – I wasn’t in the best shape that day. I had been sick for the previous three weeks (a dry cough passed on to me by my daughter). The night before the presentation, I was coughing non-stop and couldn’t get to sleep. I got up, searched the Internet, and found a suggestion that sleeping upright might help. I ended up sleeping on the couch for 2.5 hours before getting up and driving to Microsoft’s office. So I was really exhausted even before I got on stage. Secondly, the external USB microphone didn’t work on my Surface, so the sound was recorded from the internal mic – not the best quality, for sure.

Anyway, for those watching the recording online, I’m really interested in hearing from you if you have any suggestions or feedback regarding the session itself, or the OpsMgrExtended module that I’m about to release. So please feel free to drop me an email if you like Smile.

Microsoft MVP Community Camp 2015 and My Session for SMA Integration Module: OpsMgrExtended

Written by Tao Yang

141215_comcamp2015_Melbourne_01

Next Friday (30th Jan 2015), I will be speaking at the Microsoft MVP Community Camp Day in Melbourne. I am pretty excited about this event, as it is going to be my first presentation since I became a System Center MVP in July 2014.

My session is titled “Automating SCOM tasks using SMA”. Although the name sounds a little bit boring, let me assure you, it won’t be boring at all! The stuff I’m going to demonstrate is something I’ve been working on in my spare time over the last six months, and so far I’ve already written over 6,000 lines of PowerShell code. Basically, I have created a module called “OpsMgrExtended”. This module can be used as an SMA integration module as well as a standalone PowerShell module. It interacts directly with the OpsMgr SDK, and can be used by SMA runbooks or PowerShell scripts to perform advanced tasks in OpsMgr such as configuring management groups and creating rules, monitors, groups, overrides, etc.

If you have heard of or used my OpsMgr Self Maintenance MP, you’d know that I have already automated many maintenance / administrative tasks in that MP, using nothing but OpsMgr itself. In this presentation, I will not be showing you anything that’s already been done by the Self Maintenance MP; I will focus heavily on automating management pack authoring tasks.

To date, I haven’t really discussed this piece of work in detail with anyone other than a few SCOM-focused MVPs (and my wife, of course). This is going to be the first time I demonstrate this project in public.

In order to help promote this event, and to “lure” you to come to my session if you are based in Melbourne, I’ve recorded a short video demonstrating how I’ve automated the creation of a blank MP and then a performance monitor rule (with an override) using SharePoint, Orchestrator and SMA. I will include this same demo in my presentation, and it is probably going to be one of the easier ones Smile.

I’ve uploaded the recording to YouTube; you can watch it at https://www.youtube.com/watch?v=aX9oSj_eKeY or below:

Please watch it on YouTube and switch to full screen mode.


If you like what you saw, would like to see more, and want to find out what’s under the hood, please come to this free event next Friday. You can register here.

image

Creating OpsMgr Instance Group for All Computers Running an Application and Their Health Service Watchers

Written by Tao Yang

OK, the title of this blog is pretty long, but please let me explain what I’m trying to do here. In OpsMgr, it’s quite common to create an instance group that contains some computer objects as well as the Health Service Watchers for those computers. This kind of group can be used for alert subscriptions, overrides, and also as maintenance mode targets.

There are many good posts around this topic, e.g.:

From Tim McFadden: Dynamic Computer groups that send heartbeat alerts

From Kevin Holman: Creating Groups of Health Service Watcher Objects based on other Groups

Yesterday, I needed to create several groups that contain computer and health service watcher objects for:

  • All Hyper-V servers
  • All SQL servers
  • All Domain Controllers
  • All ConfigMgr servers

Because all the existing samples I could find on the web are based on computer names, I thought I’d post how I created the groups for the above-mentioned servers. In this post, I will not go through the step-by-step details of how to create these groups, because the steps differ completely depending on the authoring tool you are using. But I will go through what the actual XML looks like in the management pack.

Step 1, create the group class

This is straightforward. Because this group will contain not only computer objects but also the health service watcher objects, we must create an instance group.

Using SQL servers as an example, the group definition looks like this:

  <TypeDefinitions>
    <EntityTypes>
      <ClassTypes>
        <ClassType ID="TYANG.SQL.Server.Computer.And.Health.Service.Watcher.Group" Accessibility="Public" Abstract="false" Base="MSIL!Microsoft.SystemCenter.InstanceGroup" Hosted="false" Singleton="true" />
      </ClassTypes>
    </EntityTypes>
  </TypeDefinitions>

Note: the MP alias “MSIL” is referencing “Microsoft.SystemCenter.InstanceGroup.Library” management pack.

Step 2, Find the Root / Seed Class from the MP for the specific application

Most likely, the application that you are working on (for instance, SQL Server) is already defined and monitored by another set of management packs. Therefore, you do not have to define and discover these servers yourself. The group discovery for the group you’ve just created needs to include:

  • All computers running any components of the application (in this instance, SQL Server).
  • And all Health Service Watcher objects for the computers listed above.

In any decent management pack, when multiple application components are defined and discovered, the management pack author will most likely define a root (seed) class representing a computer that runs any of the application components (in this instance, we refer to this as the “SQL server”). Once an instance of this seed class is discovered on a computer, subsequent discoveries targeting this seed class discover any other application components (using SQL as the example again, these components would be the DB Engine, SSRS, SSAS, SSIS, etc.).

So in this step, we need to find the root / seed class for this application. Based on what I needed to do, the seed classes for the 4 applications I needed are listed below:

  • SQL Server:
    • Source MP: Microsoft.SQLServer.Library
    • Class Name: Microsoft.SQLServer.ServerRole
    • Alias in my MP: SQL
  • HyperV Server:
    • Source MP: Microsoft.Windows.HyperV.Library
    • Class Name: Microsoft.Windows.HyperV.ServerRole
    • Alias in my MP: HYPERV
  • Domain Controller:
    • Source MP: Microsoft.Windows.Server.AD.Library
    • Class Name: Microsoft.Windows.Server.AD.DomainControllerRole
    • Alias in my MP: AD
  • ConfigMgr Server
    • Source MP: Microsoft.SystemCenter2012.ConfigurationManager.Library
    • Class Name: Microsoft.SystemCenter2012.ConfigurationManager.Server
    • Alias in my MP: SCCM

Tip: you can use MPViewer to easily check which classes are defined in a sealed MP. Using SQL as the example again, in Microsoft.SQLServer.Library:

image

You can easily identify that “SQL Role” is the seed class, because it is based on Microsoft.Windows.ComputerRole and the other classes use it as their base class. You can get the actual name (rather than the display name) from the “Raw XML” tab.

Step 3 Create MP References

Your MP will need to reference the instance group library, as well as the MP in which the application seed class is defined (e.g. the SQL library):

image

Step 4 Create the group discovery

The last component we need to create is the group discovery. The data source module for the group discovery is Microsoft.SystemCenter.GroupPopulator, and there will be two <MembershipRule> sections. For the SQL group:


image

As shown above, I’ve translated each membership rule into plain English, and the XML is listed below. If you want to reuse my code, simply change the line highlighted in the screenshot above to suit your needs.

  <Monitoring>
    <Discoveries>
      <Discovery ID="TYANG.SQL.Server.Computer.And.Health.Service.Watcher.Group.Discovery" Enabled="true" Target="TYANG.SQL.Server.Computer.And.Health.Service.Watcher.Group" ConfirmDelivery="false" Remotable="true" Priority="Normal">
        <Category>Discovery</Category>
        <DiscoveryTypes>
          <DiscoveryRelationship TypeID="MSIL!Microsoft.SystemCenter.InstanceGroupContainsEntities" />
        </DiscoveryTypes>
        <DataSource ID="DS" TypeID="SC!Microsoft.SystemCenter.GroupPopulator">
          <RuleId>$MPElement$</RuleId>
          <GroupInstanceId>$MPElement[Name="TYANG.SQL.Server.Computer.And.Health.Service.Watcher.Group"]$</GroupInstanceId>
          <MembershipRules>
            <MembershipRule>
              <MonitoringClass>$MPElement[Name="Windows!Microsoft.Windows.Computer"]$</MonitoringClass>
              <RelationshipClass>$MPElement[Name="MSIL!Microsoft.SystemCenter.InstanceGroupContainsEntities"]$</RelationshipClass>
              <Expression>
                <Contains>
                  <MonitoringClass>$MPElement[Name="SQL!Microsoft.SQLServer.ServerRole"]$</MonitoringClass>
                </Contains>
              </Expression>
            </MembershipRule>
            <MembershipRule>
              <MonitoringClass>$MPElement[Name="SC!Microsoft.SystemCenter.HealthServiceWatcher"]$</MonitoringClass>
              <RelationshipClass>$MPElement[Name="MSIL!Microsoft.SystemCenter.InstanceGroupContainsEntities"]$</RelationshipClass>
              <Expression>
                <Contains>
                  <MonitoringClass>$MPElement[Name="SC!Microsoft.SystemCenter.HealthService"]$</MonitoringClass>
                  <Expression>
                    <Contained>
                      <MonitoringClass>$MPElement[Name="Windows!Microsoft.Windows.Computer"]$</MonitoringClass>
                      <Expression>
                        <Contained>
                          <MonitoringClass>$Target/Id$</MonitoringClass>
                        </Contained>
                      </Expression>
                    </Contained>
                  </Expression>
                </Contains>
              </Expression>
            </MembershipRule>
          </MembershipRules>
        </DataSource>
      </Discovery>
    </Discoveries>
  </Monitoring>

Result

After I imported the MP into my lab management group, all the SQL computer and Health Service Watcher objects are listed as members of this group:

image

Detecting Windows License Activation Status Using ConfigMgr DCM and OpsMgr

Written by Tao Yang

Hello and Happy New Year. You are reading my first post of 2015! This is going to be a quick post about something I did this week.

Recently, during a ConfigMgr 2012 RAP (Risk and Health Assessment Program) engagement with Microsoft, it was identified that a small number of ConfigMgr Windows client computers did not have their Windows license activated. The recommendation from the Microsoft ConfigMgr PFE running the RAP was to create a compliance (DCM) baseline to detect whether the Windows license is activated on client computers.

To respond to the recommendation from Microsoft, I quickly created a DCM baseline with one Configuration Item (CI). The CI uses a simple PowerShell script to detect the Windows license status.

image
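The detection logic boils down to a few lines (a simplified sketch of such a script – the GUID is the well-known ApplicationID of the Windows OS product, and LicenseStatus 1 means licensed):

  # Query the activation status of the Windows OS product
  $WQL = "SELECT LicenseStatus FROM SoftwareLicensingProduct WHERE " +
         "ApplicationID = '55c92734-d682-4d71-983e-d6ec3f16059f' AND PartialProductKey IS NOT NULL"
  $Product = Get-WmiObject -Query $WQL
  If ($Product.LicenseStatus -eq 1) {Write-Output 'Activated'} Else {Write-Output 'NotActivated'}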

I configured the CI to only support computers running Windows 7 / Server 2008 R2 and above (as per the minimum supported OS for the SoftwareLicensingProduct WMI class documented on MSDN: http://msdn.microsoft.com/en-us/library/cc534596(v=vs.85).aspx):

image

The CI is configured with one compliance rule:

image

Next, I created a compliance baseline and assigned this CI to it. I then deployed the baseline to an appropriate collection. After a few hours, the clients started receiving the baseline and completed the first evaluation:

image

Additionally, since I have implemented and configured the latest ConfigMgr 2012 Client MP (version 1.2.0.0), the DCM baseline assignments on SCOM-managed computers are also discovered in SCOM, and any non-compliant status is alerted in SCOM as well.

image

That’s all for today. This is just another example of how to use ConfigMgr DCM, OpsMgr and the ConfigMgr 2012 Client MP to quickly implement a monitoring requirement.

This Concludes My Year 2014

Written by Tao Yang

It is one day away from the holiday season, and I have worked HARD over the last few days so I could publish my last technical post of 2014 before the holidays.

First of all, I’d like to wish everyone a Merry Christmas and Happy New Year!

image

2014 has been a fantastic year for me. Here are some of the highlights for me in 2014:

I was awarded Microsoft System Center Cloud and Data Center Management MVP for the first time, on 1st July 2014.

This is truly my biggest accomplishment of the year. Not to mention that being nominated by one of the most well-known community leaders in System Center is an accomplishment in itself.

As part of a project team, I helped successfully implement one of the largest System Center 2012 infrastructures in the country (based on the number of seats and the number of System Center components implemented).

For those who know me well, you probably know which one I am talking about Smile.

Had the privilege and opportunity to attend the Microsoft Global MVP Summit held in Redmond, WA in November.

I am so glad that I had the opportunity to attend such a wonderful event. Since pretty much everything is under NDA, I can’t really talk about the content of the sessions, but I think I can share some pictures here (some taken with the camera on my phone, some from the SLRs of other SCCDM MVPs).

image

image

image

image

image

image

image

Had the opportunity to meet many big names (MVPs and Microsoft employees) in System Center during the MVP Summit. Many of them have become good friends too.

I brought a lot of Tim Tams and kangaroo jerky to the summit. I didn’t expect the Tim Tams to be so popular Smile. If I get awarded again in July 2015, I will make sure I use a bigger suitcase for Tim Tams for MVP Summit 2015.

image

image

Released a few new and updated Management Packs (ConfigMgr client MP, OpsMgr Self Maintenance MP, SCOM Maintenance Mode Scheduler MP, etc.), OpsMgr dashboards, PowerShell Scripts, SMA Modules etc. to the community.

I’ve lost count, but they should all be on this blog Smile.

Have written 63 blog posts (including this one) in total.

I don’t think the number is very high (only about five posts per month), but I’m trying my best Smile. Some of these posts published MPs, scripts, etc. that I had spent a very long time on. Based on the content, I personally think this is quite an achievement!

Clocked up over 170,000 hits on this blog in 2014 (to date).

Well, I think I still have a long way to go compared with some other popular System Center blogs (not that I want to turn this into a pissing contest). However, it is a steady increase from 2013, and I’m sure I’ll do better next year.

What’s Next?

If everything goes as planned, this post will be my final words for 2014. I am taking some time off during the holiday season (well, not too long – I’m going back to work on 5th Jan).

During my time off, I will probably spend a few days working on an automation solution for OpsMgr – something I’ve been working on in my spare time since August this year. This leads to the next point.

MVP ComCamp 2015

image

I have been chosen to speak at the MVP Community Camp 2015 in Melbourne on Friday, 30th Jan 2015. I will be presenting the topic “Automating SCOM tasks using SMA” – something I’ve been working on since August this year. Personally, I think what I have done so far is really cool. This event is going to be held at the Microsoft Melbourne office at Freshwater Place, South Bank, Melbourne. Besides myself, James Bannan, a veteran MVP in System Center ConfigMgr, will also deliver a session on the Enterprise Mobility Suite at this event. If you are based in Melbourne, please check out the details of this event and its sessions HERE. I am looking forward to speaking to the Melbourne-based System Center folks Smile.

Lastly, I wish everyone a wonderful time during this holiday season. I will be back in 2015 Smile.

A SMA Integration Module For SharePoint List Operations

Written by Tao Yang

Background

Many Microsoft System Center Orchestrator and Service Management Automation (SMA) users may agree with me that these two automation platforms do not natively have feature-rich end user portals. Although System Center Service Manager can be used as a user portal for triggering SCORCH / SMA runbooks, Microsoft SharePoint is also a very good candidate for this purpose.

Integrating SharePoint with Orchestrator and SMA is nothing new; many people have done it already, e.g.:

System Center Universe America 2014 – Orchestrating Daily Tasks Like a Pro (by Pete Zerger and Anders Bengtsson)

Service Management Automation and SharePoint (by Christian Booth and Ryan Andorfer)

In my opinion, SharePoint (especially SharePoint lists) provides a quick and easy way to set up a web-based end user portal for orchestration runbooks. I have also blogged about my experiences in the past:

My Experience Manipulating MDT Database Using SMA, SCORCH and SharePoint

SMA Runbook: Update A SharePoint 2013 List Item

As for me, not only am I using SharePoint 2013 in my lab and SharePoint Online from my Office 365 subscription, I also have no choice but to use SharePoint 2010 in real life.

In my opinion, it is complicated to write SMA runbooks that interact with SharePoint (using SharePoint’s web-based APIs), not to mention that the version of SharePoint also dictates how the runbook must be written. It is easier to use Orchestrator as a middleman between SMA and SharePoint so we can use Orchestrator’s SharePoint integration pack.

Earlier this month, I was developing solutions using SMA and Azure Automation to create OpsMgr management pack catalogs on SharePoint 2013 / SharePoint Online sites. I blogged the two solutions here:

On-Premise Solution (SMA + SharePoint 2013)

Cloud Based Solution (Azure Automation + SharePoint Online)

As I mentioned in the previous posts, I had to write a separate SMA module to be used in Azure Automation to interact with SharePoint Online, because SharePoint Online sites require a different type of credential (SharePointOnlineCredential) that the PowerShell cmdlet Invoke-RestMethod does not support. I called that module SharePointOnline in the previous post, and it utilises assemblies from the SharePoint Client Component SDK. I believe the SharePoint people also refer to this SDK as the Client-Side Object Model (CSOM).

After the MP catalog posts were published, I decided to spend a bit more time on the SharePoint Client Component SDK to see if it could help me simplify the activities between SMA and SharePoint. I was really happy to find out that the SharePoint Client Component SDK works with SharePoint 2013, SharePoint Online and SharePoint 2010 (with limitations). So I decided to update and extend the original module, making it a generic module for all three flavours of SharePoint.

After a couple of weeks of coding and testing, I’m pleased to announce that the new module is now ready to be released. I have renamed this module SharePointSDK (sorry, I’m not really creative with names Smile with tongue out).


SharePointSDK Module Introduction

The SharePointSDK module contains the following functions:

image

CRUD Operations for SharePoint List items:

  • Add-SPListItem – Add an item to a SharePoint list (2010, 2013 and SP Online)
  • Get-SPListFields – Get all fields of a SharePoint list (2010, 2013 and SP Online)
  • Get-SPListItem – Get all list items of a SharePoint list, or a specific item by specifying the list item ID (2010, 2013 and SP Online)
  • Remove-SPListItem – Delete an item from a SharePoint list (2010, 2013 and SP Online)
  • Update-SPListItem – Update one or more field values of a SharePoint list item (2010, 2013 and SP Online)

The functions listed above are the core functionality this module provides: simplified ways to manipulate SharePoint list items (Create, Read, Update, Delete).
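E.g. adding a list item from an SMA runbook takes just a couple of lines. A sketch only – the parameter names shown here are indicative, so check the module’s built-in help for the exact syntax:

  workflow Demo-AddSPListItem
  {
      # Retrieve the connection asset created for the SharePoint site
      $SPConnection = Get-AutomationConnection -Name 'RequestsSPSite'

      # Add an item to the list (field names come from the sample list used later in this post)
      Add-SPListItem -SPConnection $SPConnection -ListName 'New Users OnBoarding Requests' `
          -ListFieldsValues @{'FirstName' = 'John'; 'LastName' = 'Smith'; 'Gender' = 'Male'}
  }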

Miscellaneous Functions

  • Import-SPClientSDK – Load the SharePoint Client Component SDK DLLs (2010, 2013 and SP Online)
  • New-SPCredential – Based on the type of SharePoint site (on-prem vs SP Online), create an appropriate credential object to authenticate to the SharePoint site (2010, 2013 and SP Online)
  • Get-SPServerVersion – Get the SharePoint server version (2010, 2013 and SP Online)

These functions are called by other functions in the module. It is unlikely that runbook authors will need to use them directly.

SharePoint List Attachments Operations

  • Add-SPListItemAttachment – Add an attachment to a SharePoint list item (2013 and SP Online)
  • Get-SPListItemAttachments – Download all attached files from a SharePoint list item (2013 and SP Online)
  • Remove-SPListItemAttachment – Delete an attached file (based on file name) from a SharePoint list item (2013 and SP Online)

As the names suggest, these functions can be used to manage attachments for SharePoint list items.

I’d like to point out that the Add-SPListItemAttachment function not only supports uploading an existing file to a SharePoint list item; it can also create an attachment file directly from a byte array. This function can be used in three scenarios (see the sketch after this list):

  • Uploading an existing file from the file system
  • Directly creating a text-based file with some content as a list item attachment
  • Reading the content of an existing binary (or text) file and saving it as an attachment with a different name
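E.g. the second scenario – creating a text attachment directly from a byte array – could look like this (a sketch; the parameter names are indicative):

  # Retrieve the connection asset, build the file content in memory,
  # and attach it without an intermediate file on disk
  $SPConnection = Get-AutomationConnection -Name 'RequestsSPSite'
  $Content = 'User account has been created.'
  $ByteArray = [System.Text.Encoding]::UTF8.GetBytes($Content)
  Add-SPListItemAttachment -SPConnection $SPConnection -ListName 'New Users OnBoarding Requests' `
      -ListItemID 1 -AttachmentFileName 'notes.txt' -ContentByteArray $ByteArray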


Configuration Requirements

Download and Prepare the module

The module zip file should consist of the following five files:

image

  • Microsoft.SharePoint.Client.dll – One of required DLLs from the SDK
  • Microsoft.SharePoint.Client.Runtime.dll – One of required DLLs from the SDK
  • SharePointSDK.psd1 – Module Manifest file
  • SharePointSDK.psm1 – PowerShell module file
  • SharePointSDK-Automation.json – SMA Integration Module Meta File (where the connection asset is defined).

Download SharePointSDK Module

Note:

The zip file you’ve downloaded from the link above DOES NOT contain the two DLL files. I am not sure if Microsoft is OK with me redistributing their software / intellectual property, so, just to cover myself, you will need to download the SDK (64-bit version) from Microsoft directly (https://www.microsoft.com/en-us/download/details.aspx?id=35585), install it on a 64-bit computer, and copy the two above-mentioned DLLs into the SharePointSDK module folder.

Once the SDK is installed, you can find these 2 files in “C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\” folder.

Once the DLLs are placed into the folder, zip the SharePointSDK folder into SharePointSDK.zip again, and the integration module is ready.

image

Import Module

Once the DLLs are zipped into the module zip file, import the module into SMA using the Import Module button under the Assets tab:

image

Create a Connection object to the SharePoint site

After the module has been successfully imported, a connection to the SharePoint site must be created. The connection type is “SharePointSDK”:

image

The following fields must be filled out:

  • Name: The name of the connection.
  • SharePointSiteURL: The URL of your SharePoint site.
  • UserName: A user who should be part of the site members role (the members group has contribute access).
    • If the site is a SharePoint Online site, this user name MUST be in email address format (e.g. yourname@yourcompany.com). I believe this account must be an account created in the Office 365 subscription; I tried using an outlook.com account (added as a SharePoint site member) and it didn’t work.
    • When connecting to an on-prem SharePoint site, you can use the Domain\UserName format (as shown in the screenshot below).
  • Password: The password for the user name you’ve specified.
  • IsSharePointOnlineSite: A Boolean field (TRUE or FALSE) specifying whether it is a SharePoint Online site.

E.g. the connection to a SharePoint site in my lab:

image

Sample Runbooks

In order to better demonstrate this module, I have also created 10 sample runbooks:

image

Download Sample runbooks

I’ll now go through each sample runbook.

Runbook: Sample-SPNewUserRequestList

This sample runbook creates a brand new dummy new user requests list on your SharePoint site. The list created by this runbook is then used by the other sample runbooks (for demonstration purposes).

This runbook expects two input parameters:

  • ListName: The display name that you’d like to give the new user requests list (e.g. New Users OnBoarding Requests).
  • SPConnection: The name of the SharePointSDK connection that you created previously (based on the connection created in my lab as shown earlier, it is “RequestsSPSite”).

image

This runbook creates a list with the following fields:

image

Runbook: Sample-SPGetListFields

This runbook demonstrates how to retrieve all the fields of a particular list.

image

Runbook: Sample-SPAddListItem

This runbook adds an item to the new user requests list created by the previous runbook. It also demonstrates how to create a text file attachment directly on the list item (without the need for an existing file on the file system).

It expects the following inputs:

  • Title (new user’s title, i.e. Mr, Dr, Ms, etc.)
  • FirstName (new user’s first name)
  • LastName (new user’s last name)
  • Gender (new user’s gender: Male / Female)
  • UserName (new user’s user name)
  • AttachmentFileName (file name of the text-based attachment)
  • TextAttachmentContent (content of the text file attachment)
  • NewUserListName (display name of the new user requests list, e.g. New Users OnBoarding Requests)
  • SPConnection (the name of the SharePointSDK connection created previously; based on the connection created in my lab as shown earlier, it is “RequestsSPSite”)

E.g.:

image

The list item is created on SharePoint:

image

Attachment content:

image

Runbook: Sample-SPUpdateListItem

This runbook can be used to update fields of an existing list item on the New Users Requests list.

Runbook: Sample-SPGetAllListItems

This runbook can be used to retrieve ALL items from a list. Each list item is presented as a hash table.

image

Runbook: Sample-SPGetListItem

This runbook can be used to retrieve a single item from a list.

image

Runbook: Sample-SPDeleteListItem

This runbook deletes a single list item by specifying the List Item ID.

Runbook: Sample-SPAddListItemAttachment

This runbook demonstrates two scenarios:

  • Directly attaching a file to a list item
  • Attaching and renaming a file to a list item

image

image

Runbook: Sample-SPDeleteListItemAttachments

This runbook demonstrates how to delete an attachment from a list item (by specifying the file name).

Runbook: Sample-SPDownloadListItemAttachments

This runbook demonstrates how to download all files attached to a list item:

image

Files downloaded to the destination folder:

image

Benefit of Using the SharePointSDK Module

Using as a Regular PowerShell Module

As we all know, SMA modules are simply PowerShell modules (sometimes with an optional SMA module meta file (.json) for creating connections). Although this module was primarily written for SMA, it can also be used in other environments, e.g. as a regular PowerShell module or in Azure Automation. When using it as a normal PowerShell module, instead of passing the SMA connection name into the functions inside the module, you may provide each individual value separately (user name, password, SharePoint site URL, IsSharePointOnlineSite).

Simplified scripts to interact with SharePoint

When using this module, most list item operations take only a few lines of code.

E.g. retrieving a list item:

Using PowerShell:
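Something along these lines, passing the individual values (site URL, credentials) directly – a sketch only; the parameter names are indicative:

  # Retrieve a single list item by ID, passing individual values instead of an SMA connection
  $Item = Get-SPListItem -SPSiteURL 'http://sharepoint01/sites/requests' -Username 'domain\svc.sma' `
      -Password 'P@ssw0rd' -IsSharePointOnlineSite $false -ListName 'New Users OnBoarding Requests' -ListItemID 1
  $Item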

Using PowerShell Workflow (in SMA):
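And the equivalent in an SMA runbook, using the connection asset (again, a sketch with indicative parameter names):

  workflow Get-NewUserRequest
  {
      # The connection asset supplies the site URL, user name, password and IsSharePointOnlineSite flag
      $SPConnection = Get-AutomationConnection -Name 'RequestsSPSite'
      $Item = Get-SPListItem -SPConnection $SPConnection -ListName 'New Users OnBoarding Requests' -ListItemID 1
      $Item
  }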

If you use SharePoint 2013’s REST API, the script will be much longer than what I’ve shown above.

Same Code for Different SharePoint Versions

The SharePoint REST API was updated in SharePoint 2013. Therefore, if we were to use the REST API, the code for SharePoint 2013 would look different from that for SharePoint 2010. Additionally, throwing SharePoint Online into the mix further complicates the situation, since, as I mentioned previously, it requires a different type of credential for authentication. This makes our scripts and runbooks less generic.

By using this SharePointSDK module, I am able to use the same runbooks on SharePoint 2010, 2013 and SharePoint Online sites.

Limitations

During testing, I noticed the three attachment-related functions in the SharePointSDK module would not work on SharePoint 2010 sites. These functions are:

  • Add-SPListItemAttachment
  • Remove-SPListItemAttachment
  • Get-SPListItemAttachments

After a bit of research, it looks like this is a known issue. I don’t think it is too big a deal, because all the core functions (CRUD operations for the list items) work with SharePoint 2010. Therefore, in these three functions, I’ve coded a validation step to exit if the SharePoint server version is below version 15 (SharePoint 2013):

image

Conclusion

If you are using SMA and SharePoint together, I strongly recommend you download this module and the sample runbooks and give them a try. If you have a look at the sample runbooks, I’m sure you will realise how easy it is to write PowerShell code that interacts with SharePoint.

In case you didn’t see the download links, you can download them here:

Download SharePointSDK Module

Download Sample Runbooks

Lastly, I’m not a SharePoint specialist. If you believe I’ve made any mistakes in my code, or there is room for improvement, I’d like to hear from you. Please feel free to drop me an email Smile.

Using Royal TS for PowerShell Remote Sessions

Written by Tao Yang

Background

I have used many remote desktop applications in the past, and I have to say Royal TS is the one that I like the most! Recently, I showed it to one of my colleagues, and after a bit of playing around, he purchased a license for himself too.

Today, my colleague asked me if I knew that Royal TS is also able to run external commands, and he thought it was pretty cool that he could launch PowerShell in the Royal TS window. Then I thought: if you can run PowerShell in Royal TS, we should be able to establish PS remote sessions in Royal TS too. Within 10 minutes, we managed to create a few connections in Royal TS like these:

image

image

image

image

In this post, I’ll go through the steps I took to set them up.

Connections to Individual Servers

To create a connection to an individual server:

01. Choose Add -> External Application:

image

02. Enter the following details:

Display Name: The name of the server you want to connect to.

Command: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Arguments: -NoExit -Command “Enter-PSSession $CustomField1$”

Working Directory: C:\Windows\System32\WindowsPowerShell\v1.0

On the icon button next to the display name, choose “Use Application Icon” if you want to.

image

image

03. Choose a credential if you want to connect using an alternative credential:

image

If you choose to use an alternative credential, you must also tick the “Use Credentials” box under the Advanced tab:

image

04. Enter the remote server name in Custom Field 1:

image

Note: in the Arguments field from step 02, I’ve used the Royal TS variable $CustomField1$ as the computer name in the Enter-PSSession command. It is more user-friendly to use the custom field for the computer name than to modify the argument string for each connection that you wish to create.

Create An Ad-Hoc Connection

You can also create a connection in Royal TS for ad-hoc connections. In this scenario, you will need to enter the name of the remote computer that you wish to connect to:

image

After the computer name has been entered, the connection is established:

image

To create this connection in Royal TS, instead of using the Custom Field 1 for the computer name, I’ve added an additional PowerShell command in the Arguments:

Arguments: -NoExit -Command “$Computer = Read-Host ‘Please enter the Computer Name'; Enter-PSSession $Computer”

image

The Custom Field 1 is no longer required in this scenario. Everything else is the same as the previous sample (for individual computers).

Other Considerations

Maximised PowerShell Window

You may have noticed from the screenshots above that the PowerShell windows fit perfectly in the Royal TS frame. This is because I am also using a customised PS module that I wrote in the past to resize the PowerShell window. Without this module, the PowerShell console would not automatically fit into the Royal TS frame:

image VS image

If you’d like your console to look like the one on the left rather than the one on the right, please follow the instructions below.

01. Download the PSConsole Module and place it under C:\windows\system32\WindowsPowerShell\v1.0\Modules

image

02. Modify the “All Users Current Host” profile from a normal PowerShell window (NOT within PowerShell ISE). If you are not sure whether this profile has been created, run the command below:

image
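I.e. something like this (a sketch that creates the profile file if it doesn’t exist yet):

  # Create the "All Users Current Host" profile if it does not already exist
  If (!(Test-Path $Profile.AllUsersCurrentHost))
  {
      New-Item -Path $Profile.AllUsersCurrentHost -ItemType File -Force
  }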

After the profile is created, open it in Notepad (in the PowerShell window, type: notepad $Profile.AllUsersCurrentHost) and add two lines of code:

image
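Based on the connection arguments shown later in this post, the two lines are:

  Import-Module PSConsole
  Resize -Max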

After saving the changes, the next time you initiate a connection in Royal TS, the console will automatically maximise to use all the usable space.

Note: because you will most likely be using an alternative (privileged) credential for these PS remote sessions, the resize console commands cannot be placed into the default profile (Current User, Current Host); they must be placed into an All Users profile. And because the resize command only works in a normal PowerShell console (not in PowerShell ISE), the only profile you can use is the “All Users Current Host” profile of the normal PowerShell console.

Alternatively, if you do not wish to make changes to the All Users Current Host profile, you can also add the above-mentioned lines to the Royal TS connection Arguments field:

i.e.

Arguments: -NoExit -Command “import-module psconsole; resize -max; Enter-PSSession $CustomField1$”

image

Duplicating Royal TS Connections

If you want to create multiple connections, all you need to do is to create the first one manually, and then duplicate it multiple times:

image

When duplicating connections, the only fields you need to change are the Display Name and CustomField1.

WinRM configuration

Needless to say, WinRM must be enabled and properly configured for PS remoting to work; this is a prerequisite. I won’t go through how to configure WinRM here – someone has actually written a whole book on this topic.

Conclusion

I’d like to thank Stefan Koell (blog, twitter), the developer of Royal TS (and also my fellow SCCDM MVP), for such an awesome tool. This is now probably THE most-used application on all my computers Smile.

If you haven’t tried Royal TS, please give it a try. Besides the obvious Windows version, there are also Mac, iOS and Android versions.