Managing Azure Automation Module Assets Using MyGet

Written by Tao Yang


Background

Managing the life cycle of PowerShell module assets in your Azure Automation accounts can be challenging. If you are currently using Azure Automation, you may have already noticed the following behaviours when managing the module assets:

1. It is difficult to automate the module asset deployment process.

If you want to automate module deployment to your Automation account (i.e. using the PowerShell cmdlet New-AzureRmAutomationModule), you must ensure the module you are trying to import is packaged into a zip file and placed in a publicly accessible location that Azure Automation can read from via HTTP (e.g. Azure Blob storage). In my opinion, this is over-complicated. A sketch of this process is shown after this list.

2. Modules are not deployed to the Hybrid Workers automatically

If you are using Hybrid Workers, you must also manage the modules separately. Unlike Azure runbook workers, Azure Automation does not automatically deploy modules to Hybrid Workers. This means when you import a module to your Azure Automation account, you must also manually deploy it to your Hybrid Worker computers.

3. It is difficult to maintain module version consistency.

Since modules in your Azure Automation accounts and on your Hybrid Workers are managed through two separate processes, it is hard to keep the versions of your module assets consistent between your Automation account and your Hybrid Worker computers.
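To illustrate item #1, here is a minimal sketch of that manual process. The resource group, storage account, container and module names are placeholders, and it assumes the AzureRM modules are installed and you are already logged in with Add-AzureRmAccount:

# Sketch of the manual import process - all resource names below are placeholders
# 1. Zip the module folder (the zip file name must match the module name)
Compress-Archive -Path 'C:\Modules\MyModule' -DestinationPath 'C:\Temp\MyModule.zip' -Force

# 2. Upload the zip to a location Azure Automation can read via HTTP (e.g. Azure Blob storage)
$ctx = (Get-AzureRmStorageAccount -ResourceGroupName 'MyRG' -Name 'mystorageaccount').Context
Set-AzureStorageBlobContent -File 'C:\Temp\MyModule.zip' -Container 'modules' -Blob 'MyModule.zip' -Context $ctx

# 3. Import the module into the Automation account from the blob URI
New-AzureRmAutomationModule -ResourceGroupName 'MyRG' -AutomationAccountName 'MyAutomationAccount' `
  -Name 'MyModule' -ContentLink 'https://mystorageaccount.blob.core.windows.net/modules/MyModule.zip'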

Over the past few months, I have invested a lot of my time in MyGet, looking for ways to close these gaps. A few months ago, I released a PowerShell DSC resource module called cPowerShellPackageManagement (http://blog.tyang.org/2016/09/15/powershell-dsc-resource-for-managing-repositories-and-modules/). By using this DSC resource module, we can easily develop DSC configurations for computers (such as Hybrid Workers) to automatically install modules from a PowerShell module repository (i.e. a MyGet feed). This approach closes the gap around managing Hybrid Worker computers (item #2 on the list above). Today, I am going to discuss how we can tackle items #1 and #3. Before I start talking about my solutions, let me quickly introduce MyGet first.

What is MyGet?

MyGet (www.myget.org) is a SaaS-based package repository hosted in the cloud. It supports all the popular package providers such as NuGet, npm, etc. It can host both private and public repositories (called feeds) for you or your organisation.

If you come from a developer or DevOps background, you may have already heard about MyGet in the past, or have used similar on-premises package repositories (such as ProGet). If you are an IT Pro, since you are reading this blog post right now, you must be familiar with PowerShell, and therefore must have heard of or used the PowerShell Gallery (https://powershellgallery.com). In PowerShell version 5 and later, you can use MyGet the same way as the PowerShell Gallery, except that you have absolute control of the content in your feeds. Also, if you are using a paid MyGet account, you can have private feeds and control access to them by issuing API keys. You can also create multiple feeds that contain different packages (PowerShell modules in this case). For example, if you develop PowerShell modules, you can have a Dev feed to use during development, and also Test and Production feeds for testing and production use.
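As an illustration, registering a MyGet feed as a PowerShell repository looks roughly like this (the repository name, feed name and API key below are placeholders; the exact source location URI is discussed later in this post):

# Register a MyGet feed as a PowerShell repository (feed name and API key are placeholders)
$feedUri = 'https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2'
Register-PSRepository -Name 'MyGetFeed' -SourceLocation $feedUri -InstallationPolicy Trusted

# The feed can then be used just like the PowerShell Gallery
Find-Module -Repository 'MyGetFeed'
Install-Module -Name 'SomeModule' -Repository 'MyGetFeed'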

Why Do I Need MyGet?

You may be a little hesitant to use the PowerShell Gallery because it is 100% public. As a regular user like everyone else, there is very little you can do. For example, you can publish modules to the PowerShell Gallery, but you can't guarantee your modules will stay there forever. Microsoft may decide to un-list your modules if they find problems with them (e.g. failing to comply with the PSScriptAnalyzer rules). You also don't have the ability to delete your modules from the PowerShell Gallery; you can un-list them, but they are still hosted there. To me, the PowerShell Gallery is more like a community platform that allows everyone to share their work, but you should not use it in any production environment because you have no control over the content. How can you make sure the content you need is going to be there tomorrow?

MyGet allows you to create feeds over which you have total control, and as I mentioned already, with a paid MyGet account you can have private feeds to host the IP that you don't want to share with the rest of the world.

MyGet also ships with other awesome features, such as Webhook support.

Automating Module Deployment to Automation Account

I have developed a runbook that retrieves a list of modules from a repository (i.e. your MyGet feed) and imports each module into the Automation account where the runbook resides, if the module does not exist there or its version is lower than the latest version available from the module repository. Before importing, the runbook also tries to work out the module dependencies and imports the required modules in groups (i.e. the modules without dependencies are imported first). Here's the runbook source code:

Note: this runbook does not download and zip up PowerShell modules from the repository feed. Instead, it constructs the URI to the underlying NuGet package and imports the package directly into your Automation account.
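Conceptually, the core import step in the runbook looks along these lines (a simplified sketch, not the actual runbook code; the module name, version and variable names are placeholders):

# Simplified sketch of the core import step (not the actual runbook code)
$feedUri    = Get-AutomationVariable -Name 'ModuleFeedLocation'
$moduleName = 'SomeModule'   # placeholder
$version    = '1.0.0'        # placeholder

# On a NuGet v2 feed, each package is typically exposed at <feed>/package/<id>/<version>
$packageUri = "$feedUri/package/$moduleName/$version"

# Import the NuGet package directly - no need to download and re-zip the module
New-AzureRmAutomationModule -ResourceGroupName $resourceGroupName `
  -AutomationAccountName $automationAccountName `
  -Name $moduleName -ContentLink $packageUri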

In order to use the runbook, you will need to create an Automation variable first.

Name: ModuleFeedLocation

Value: <the source location URI to your repository feed>
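If you prefer to create this variable with PowerShell instead of the portal, it would look something like this (the resource group and Automation account names are placeholders; consider setting Encrypted to $true since the URI embeds your API key):

# Create the ModuleFeedLocation Automation variable (resource names are placeholders)
New-AzureRmAutomationVariable -ResourceGroupName 'MyRG' `
  -AutomationAccountName 'MyAutomationAccount' `
  -Name 'ModuleFeedLocation' `
  -Value 'https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2' `
  -Encrypted $false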


Note: if you are not sure what the source location URI for your feed is, check out this help document on the MyGet website: http://docs.myget.org/docs/how-to/publish-a-powershell-module-to-myget. However, I don't believe the documentation is 100% accurate. Based on my experience, whether you are using a private or a public feed, the source location URI should be:

https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2

The API key is available on the MyGet portal.


If you have registered the feed as a PowerShell repository, you can also check it using the Get-PSRepository cmdlet.


Other than the Automation variable, you will also need to make sure you have the AzureRunAsConnection connection asset and the associated certificate asset created. These assets are created automatically by default when you create your Azure Automation account.


If you don’t have this connection asset, you can manually create it using PowerShell – this process is documented here: https://docs.microsoft.com/en-au/azure/automation/automation-sec-configure-azure-runas-account
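For reference, the usual pattern a runbook uses to authenticate with the AzureRunAsConnection asset looks roughly like this (a sketch of the standard Run As login sequence, not necessarily the exact code in my runbook):

# Typical Run As account authentication inside a runbook (sketch)
$conn = Get-AutomationConnection -Name 'AzureRunAsConnection'

Add-AzureRmAccount -ServicePrincipal `
  -TenantId $conn.TenantId `
  -ApplicationId $conn.ApplicationId `
  -CertificateThumbprint $conn.CertificateThumbprint | Out-Null

Set-AzureRmContext -SubscriptionId $conn.SubscriptionId | Out-Null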

Once the runbook and all required assets are in place, you will also need to create a webhook for the runbook. It is OK to configure the webhook to target Azure workers (targeting a Hybrid Worker group also works, but is not necessary).
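The webhook can be created in the portal, or with something along these lines (the runbook and webhook names are placeholders; note the webhook URL is only displayed once, at creation time):

# Create a webhook for the runbook (names below are placeholders)
$webhook = New-AzureRmAutomationWebhook -ResourceGroupName 'MyRG' `
  -AutomationAccountName 'MyAutomationAccount' `
  -RunbookName 'Import-ModulesFromMyGet' `
  -Name 'MyGet-PackageAdded' `
  -IsEnabled $true `
  -ExpiryTime (Get-Date).AddYears(1) `
  -Force

# Copy this URL into the MyGet webhook configuration - it cannot be retrieved later
$webhook.WebhookURI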


Once the webhook is created, go to the MyGet portal, open your feed, go to the Webhook section and add an HTTP POST webhook.


Then enter a description and paste the runbook webhook URL. For the webhook trigger, only tick “Package Added”.


Once the webhook trigger is created, everything is good to go. The next time you add a PowerShell module or update an existing module on your MyGet feed, it will automatically trigger the Azure Automation runbook, which will work out which modules need to be imported or updated and attempt to import them one at a time.


Tips:

Once you have configured your MyGet feed as a PowerShell repository on a computer running PowerShell v5 or later, you can publish modules located on your local computer to the feed using the Publish-Module cmdlet. You can also configure MyGet to pull modules from another repository such as the PowerShell Gallery. I have blogged about this previously: http://blog.tyang.org/2016/09/20/pushing-powershell-modules-from-powershell-gallery-to-your-myget-feeds-directly/
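Publishing a locally developed module to the feed looks roughly like this (the module path, repository name and API key are placeholders, and this assumes the feed has been registered as shown earlier):

# Publish a local module to the registered MyGet feed (path, repository name and API key are placeholders)
Publish-Module -Path 'C:\Dev\MyModule' -Repository 'MyGetFeed' -NuGetApiKey '<api-key>'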

If you want to configure multiple Automation accounts to sync with a single MyGet feed, you can simply create the runbook and required assets in each automation account, and add a webhook trigger for each instance of the runbook within your MyGet feed.

Things to Watch Out For

There are a few things you need to watch out for when using this solution:

1. Be aware of the limitations in Azure Automation

Some of these limitations may impact your module imports. You can find the official documentation here: https://docs.microsoft.com/en-us/azure/azure-subscription-service-limits#automation-limits

2. Unlike NuGet repositories such as the PowerShell Gallery and MyGet, Azure Automation does not support storing different versions of the same module

This may cause some module imports to fail. For example, suppose you have a module called ModuleA (version 1.0) that is a dependency of ModuleB version 1.0, and your MyGet repository contains ModuleA 1.0, ModuleB 1.0 and ModuleB 2.0. The runbook will import ModuleB 2.0 into your Automation account first; then, when it tries to import ModuleA 1.0, the import may fail because it does not pass the validation test (importing ModuleA 1.0 on the runbook worker computer). So, before committing these kinds of packages to a feed that is used by Azure Automation, test them first on another feed, and make sure you can successfully install and import the module on your local computer.

3. Do not load too many modules into the feed initially

Importing modules into an Azure Automation account takes a lot of time. When running a runbook job on Azure workers, the runbook can run for a maximum of 3 hours due to the fair share policy. So if you have a lot of modules to load in the beginning, you need to make sure the runbook job can complete within 3 hours, or you may have to rerun the runbook to pick up the modules that didn't get imported in the previous job. Alternatively, you can configure the runbook to run on a Hybrid Worker group, because the fair share policy does not apply when the job is executed on Hybrid Workers.

Conclusion

If you use a dedicated MyGet feed to host all the required modules for Azure Automation, you can use the cPowerShellPackageManagement DSC resource module I mentioned earlier in this blog post to automate the module deployment to Hybrid Workers. At the same time, by using the method described in this blog post, you have also got the Automation account covered.

Therefore, if you have DSC configured for your Hybrid Workers (i.e. using Azure Automation DSC) and have this runbook and webhook configured, then simply adding a new package to your MyGet feed updates your entire Azure Automation infrastructure automatically.

My MVP buddy Alex Verkinderen has also done some interesting integration between MyGet and the PowerShell Gallery. He is going to publish his innovation on his blog (http://www.mscloud.be/) soon, so make sure you subscribe to his blog.

Lastly, thanks to Alex for testing the runbook for me. If anyone has any questions or suggestions, please feel free to contact me.

PowerShell Script to Import and Update Modules from PowerShell Repositories to Azure Automation

Written by Tao Yang

The PowerShell Gallery has a very cool feature that allows you to import modules directly into your Azure Automation account using the “Deploy to Azure Automation” button. However, if you want to automate the module deployment process, you most likely have to first download the module, zip it up and then upload it to a place the Azure Automation account can access via HTTP. This is a very troublesome process.

I have written a PowerShell script that allows you to search for PowerShell modules in ANY PowerShell repository that has been registered on your computer and deploy the module DIRECTLY to an Azure Automation account without having to download it first. You can use this script to import new modules or update existing modules in your Automation account.

This script is designed to run interactively. You will be prompted to enter details such as the module name and version, Azure credentials, and to select the Azure subscription and Azure Automation account, etc.

The script works out the URI to the actual NuGet package for the module and imports it directly into the Azure Automation account. Other than the PowerShell Gallery, I have also registered a private repository hosted on MyGet.org, and I am able to deploy modules directly from my private MyGet feed to my Azure Automation account.
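Conceptually, the key steps are along these lines (a simplified sketch rather than the actual script; the module, repository and account names are placeholders):

# Simplified sketch of the key steps (not the actual script)
$module = Find-Module -Name 'SomeModule' -Repository 'MyGetFeed'   # placeholders
$source = (Get-PSRepository -Name $module.Repository).SourceLocation

# Construct the NuGet package URI and import the package straight into the Automation account
$packageUri = "$source/package/$($module.Name)/$($module.Version)"
New-AzureRmAutomationModule -ResourceGroupName 'MyRG' -AutomationAccountName 'MyAutomationAccount' `
  -Name $module.Name -ContentLink $packageUri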

If you want to automate this process, you can easily make a non-interactive version of this script and parameterize all required inputs.

So, here’s the script, and feedback is welcome:

Command Launching Microsoft Monitoring Agent Control Panel Applet

Written by Tao Yang

I have been refreshing my lab servers to Windows Server 2016. I'm using the non-GUI version (Server Core) wherever possible.

When working on Server Core servers, I found it troublesome that I can't access the Microsoft Monitoring Agent applet in Control Panel.


Although I can use PowerShell and the MMA agent COM object AgentConfigManager.MgmtSvcCfg, sometimes it is easier to use the applet.

After some research, I found the applet can be launched from the command line:

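From memory, the applet is a .cpl file under the agent installation folder, so launching it looks along these lines (the path below assumes a default install location and should be verified on your own server); the COM object approach mentioned above is also shown:

# Launch the MMA Control Panel applet - the .cpl path below assumes a default install location
& control.exe "C:\Program Files\Microsoft Monitoring Agent\Agent\AgentControlPanel.cpl"

# Alternatively, query the agent configuration via the COM object from PowerShell
$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.GetManagementGroups()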

OpsMgr Alert Tuning using OpsLogix EZalert

Written by Tao Yang


OpsLogix has recently released a new product to the market called “EZalert”. It learns the operator's alert-handling behaviour and is then able to automatically update alert resolution states based on what it has learned. You can find more information about this product here: http://www.opslogix.com/ezalert/. I was given a trial license for evaluation and review. Today I installed it on a dedicated VM and connected it to my lab OpsMgr management group.

EZalert Walkthrough

Once installed, I could see a new dashboard view added to the Monitoring pane, and this is where we tune all the alerts.


From this view, I can see all the active alerts, and I can start tuning them either one at a time, or I can multi-select and set the desired state in bulk. Once I have gone through all the alerts on the list, I can choose to save the configuration under the Settings tab.


Once this is done, any new alerts that match previously trained alerts will be updated automatically when they are generated. For example, I created a test alert and trained EZalert to set the resolution state to Closed; as you can see below, it was created at 9:44:57 AM and modified by EZalert 2 seconds later.


Once the initial training process is completed and saved, the Training tab will become empty. Any new alerts generated will show up in the Training tab; you can see whether a suggested state has been assigned, and you can also modify it by assigning another state.


And all previously trained alerts can be found in the History tab.


You can also create exclusions. If you want EZalert to skip certain alerts for a particular monitoring object (e.g. the disk space alert generated for C:\ on Server A), you can do so by creating exclusions.


In my opinion, this is a very good approach to tuning alerts. When setting alert resolution states, you only need to do it once; EZalert learns your behaviour and repeats your action for you in the future. It will be a huge time saver for all your OpsMgr operators over time. It will also come in very handy for alert tuning in the following situations:

  • When you have just deployed a new OpsMgr management group
  • When you have introduced new management packs in your management group
  • When you have updated existing management packs to the newer versions

EZalert vs Alert Update Connector

Before EZalert, I had been using the OpsMgr Alert Update Connector (AUC) from Microsoft (https://blogs.technet.microsoft.com/kevinholman/2012/09/29/opsmgr-public-release-of-the-alert-update-connector/). I was really struggling when configuring AUC, so I developed my own solution to configure AUC in an automated fashion (http://blog.tyang.org/2014/04/19/programmatically-generating-opsmgr-2012-alert-update-connector-configuration-xml/) and I have also developed a management pack to monitor it (http://blog.tyang.org/2014/05/31/updated-opsmgr-2012-alert-update-connector-management-pack/). In my opinion, AUC is a solid solution. It's been around for many years and is used by many customers. But I do find it has some limitations:

  • Configuration process is really hard
  • Configuration is based on rules and monitors, not alerts. So it's easy to incorrectly configure rules and monitors that don't generate alerts (e.g. perf/event collection rules, aggregate/dependency monitors, etc.).
  • Modifying existing configuration causes service interrupt due to service restart
  • When running in a distributed environment (on multiple management servers), you need to make sure configuration files are consistent across these servers and only one instance is running at any given time.
  • No way to easily view the current configurations (without reading XML files)

I think EZalert has definitely addressed some of these shortcomings:

  • Alert training process is performed on the OpsMgr console
  • No need to restart services and reload configuration files after new alerts are added or when existing alerts are modified
  • Configurations are saved in a SQL database, not text-based files
  • The current configuration is easily viewable within the SCOM console

However, AUC has the following advantages over EZalert:

  • AUC supports assigning different values to different groups or individual objects. In EZalert, an exception can only be created for an individual monitoring object, and it doesn't seem like you can assign a different value for that object; it is simply an on/off exception.
  • Other than the alert resolution state, AUC can also be used to update other alert properties (e.g. custom fields, owner, ticket ID, etc.). EZalert does not appear to be able to update other alert fields.

Things to Consider

When using EZalert, in my opinion, there are a few things you need to consider:

1. It does not replace requirements for overrides

If you are training EZalert to automatically close an alert when it's generated, then you should ask yourself: do you really need this alert to be generated in the first place? Unless you want to see these alerts in the alert statistics reports, you should probably disable the alert via overrides. EZalert should not be used to replace overrides. If you don't need an alert, disable it! That saves resources on both the SCOM servers and the agents that have to process the alert, as well as the database space to store it.

2. Training Monitor generated alerts

As we all know, we shouldn't manually close monitor-generated alerts. So when you are training monitor alerts, make sure you don't train EZalert to update the resolution state to “Closed”. Consider using other states such as “Resolved” instead.

3. Create Scoped roles for normal operators in order to hide the EZalert dashboard view

You may not want normal operators to train alerts, so instead of using the built-in Operators role, it is better to create your own scoped role and hide the EZalert dashboard view from normal operators.

Conclusion

I believe EZalert has some strong use cases. Unless you have a very complicated alert flow automation process that leverages other alert fields such as custom fields, owner, etc. (e.g. for generating tickets) and you are currently using AUC for this particular reason, I think EZalert gives you a much more user-friendly experience for ongoing alert tuning.

I have personally implemented AUC in a few places, and I still get calls every now and then from those places asking for help with AUC configuration, even though it has been a few years since it was implemented. Also, I'm not exactly sure whether AUC is officially supported by Microsoft, because it was originally developed by an OpsMgr PFE in his spare time (I'm not entirely sure about the supportability of AUC; maybe someone from Microsoft can confirm). EZalert, on the other hand, is a commercial product, and the vendor OpsLogix provides full support for it.

Lastly, if you have any questions about EZalert, please feel free to contact OpsLogix directly.

New Blogger in the Family

Written by Tao Yang

I have been blogging for six and a half years and, to this moment, I'm still enjoying it. A few months ago my better half decided to start blogging as well. Although my partner also works in IT as a project manager, her real passions are photography and cooking. She has decided to start a blog focused on food and recipes. By doing this, not only does she get to create her favorite dishes, she also gets to take pictures too.

Then there was a lot of preparation to get her started. I helped her register her chosen domain name, got a WordPress site hosted with the same host as my blog and company website, and also bought a lot of cooking, photography and recording equipment. Now her site is up, and she has already posted 4 recipes. You can check it out at http://www.lemontaste.com.au

You can also follow her on social media:

Facebook page: https://www.facebook.com/lemontasteblog/

Twitter: @Lemontaste_blog

Instagram: @Lemontaste_blog

Please feel free to share the links with your friends and family, it will be much appreciated!

For those who know me and my partner well on a personal level, hopefully you will all agree that she is amazing when it comes to cooking. Even my 4-year-old daughter said to me, “Daddy is a cook, Mummy is a chef!” Together, we have already come up with over 20 recipes that she can blog about. However, given the time and effort required for each blog post, unlike my blog articles, she won't be able to blog as fast as me. But I promise that I'll keep reminding her and helping her to get them published one at a time. At the end of the day, I really enjoy these blog posts too, because I get to eat the leftovers from her blog posts – after the photos have been taken, it's all mine!

You can see some of her most recent dishes on her blog (all photos were taken by herself), along with a few more recipes.


Lastly, if you have any questions, feel free to contact her directly. She will be more than happy to answer them.

Speaking at ExpertsLive Australia in April

Written by Tao Yang


As many of you may already know, the legendary System Center Universe franchise has been merged with ExpertsLive, which is also a popular community event over in Europe. As a result, the upcoming SCU Australia has been renamed ExpertsLive Australia. Unlike last year, instead of just a one-day event, it is going to be a two-day event, with 3 tracks: Cloud, Data Center and Enterprise Client Management. ExpertsLive Australia 2017 is going to be held at Crown Promenade Melbourne on the 6th and 7th of April.

I will co-present 2 sessions with my friend and MVP veteran Pete Zerger (@pzerger). Our topics are:

  1. Discovering and monitoring your network topology using OMS
  2. Cloud Based Automation Overview

Many local and international speakers have already been confirmed, such as Alex Verkinderen, James Bannan, Thomas Maurer, Marcel Zehner, David Obrien, and more!

Make sure you check out the event website: http://www.expertslive.org.au and hopefully I’ll see you all there!

cPowerShellPackageManagement DSC Resource Updated to Version 1.0.1.0

Written by Tao Yang

A few days ago I found a bug in the cPowerShellPackageManagement DSC resource module that was caused by the previous update, v1.0.0.1.

In version 1.0.0.1, I added the –AllowClobber switch to the Install-Module cmdlet, which was explained in my previous post: http://blog.tyang.org/2016/12/16/dsc-resource-cpowershellpackagemanagement-module-updated-to-version-1-0-0-1/

However, I only just noticed that although the pre-installed version of the PowerShellGet module is the same (1.0.0.1) on Windows Server 2016 and in WMF 5.0 on Windows Server 2012 R2, the Install-Module cmdlet is slightly different. In Windows 10 and Windows Server 2016, the Install-Module cmdlet has the “AllowClobber” switch.


In Windows Server 2012 R2, the Install-Module cmdlet does not have the –AllowClobber switch.


Therefore, I had to update the DSC resource to detect whether the AllowClobber switch exists.
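The detection can be done with a simple parameter check along these lines (a sketch of the approach rather than the exact code in the resource; the module and repository variables are placeholders):

# Only pass -AllowClobber when the local version of PowerShellGet supports it
$installParams = @{ Name = $ModuleName; Repository = $RepositoryName; Force = $true }
if ((Get-Command -Name Install-Module).Parameters.ContainsKey('AllowClobber'))
{
    $installParams.Add('AllowClobber', $true)
}
Install-Module @installParams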

Additionally, I have made a few other stability improvements and added a dependency on the PowerShellGet module in the module manifest file.

This updated version can be found on both GitHub and PowerShell Gallery:

Github: https://github.com/tyconsulting/PowerShellPackageManagementDSCResource/releases/tag/1.0.1.0

PowerShell Gallery: https://www.powershellgallery.com/packages/cPowerShellPackageManagement/1.0.1.0

PowerShell Script to Create OMS Saved Searches that Map OpsMgr ACS Reports

Written by Tao Yang

Microsoft’s PFE Wei Hao Lim has published an awesome blog post that maps OpsMgr ACS reports to OMS search queries (https://blogs.msdn.microsoft.com/wei_out_there_with_system_center/2016/07/25/mapping-acs-reports-to-oms-search-queries/)

There are 36 queries on Wei’s list, so it will take a while to manually create them all as saved searches via the OMS Portal. Since I can see that I will reuse these saved searches in many OMS engagements, I have created a script to automatically create them using the OMS PowerShell Module AzureRM.OperationalInsights.
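The core of the script is one call per query to the saved search cmdlet in AzureRM.OperationalInsights, roughly along these lines (a sketch; the resource group, workspace, search name and query text shown are placeholders):

# Create one OMS saved search per ACS query (values below are placeholders)
New-AzureRmOperationalInsightsSavedSearch `
  -ResourceGroupName 'MyRG' `
  -WorkspaceName 'MyOMSWorkspace' `
  -SavedSearchId 'ACS - Failed Logon Events' `
  -DisplayName 'ACS - Failed Logon Events' `
  -Category 'ACS Reports' `
  -Query 'Type=SecurityEvent EventID=4625' `
  -Version 1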

So here’s the script:

You must run this script in PowerShell version 5 or later. Lastly, thanks to Wei for sharing these valuable queries with the community!

OMSDataInjection Updated to Version 1.2.0

Written by Tao Yang

The OMSDataInjection module was only updated to v1.1.1 less than 2 weeks ago. I had to update it again to cater for the changes in the OMS HTTP Data Collector API.

I only found out last night, after being made aware that people had started getting errors using this module, that the HTTP response code for a successful injection has changed from 202 to 200. The documentation for the API was updated a few days ago (as I can see from GitHub).


This is what’s been updated in this release:

  • Updated injection result error handling to reflect the change of the OMS HTTP Data Collector API response code for successful injection.
  • Changed the UTCTimeGenerated input parameter from mandatory to optional. When it is not specified, the injection time will be used for the TimeGenerated field in the OMS log entry.

If you are using the OMSDataInjection module, I strongly recommend you update to this release.
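For anyone calling the Data Collector API directly rather than through the module, the change boils down to accepting both status codes as success, roughly like this (a sketch; $uri, $headers and $body represent a normal Data Collector API request that is assumed to have been built already):

# Treat both the old (202) and new (200) response codes as a successful injection
$response = Invoke-WebRequest -Uri $uri -Method Post -Headers $headers `
  -ContentType 'application/json' -Body $body -UseBasicParsing

if ($response.StatusCode -in 200, 202) {
    Write-Verbose "Data accepted by the HTTP Data Collector API (HTTP $($response.StatusCode))."
}
else {
    throw "Data injection failed (HTTP $($response.StatusCode))."
}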

PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection

GitHub: https://github.com/tyconsulting/OMSDataInjection-PSModule/releases/tag/v1.2.0

DSC Resource cPowerShellPackageManagement Module Updated to Version 1.0.0.1

Written by Tao Yang

Back in September this year, I published a PowerShell DSC resource module called cPowerShellPackageManagement. This DSC resource allows you to manage PowerShell repositories and modules on any Windows machine running PowerShell version 5 or later. You can read more about this module in my previous post here: http://blog.tyang.org/2016/09/15/powershell-dsc-resource-for-managing-repositories-and-modules/

A couple of weeks ago my MVP buddy Alex Verkinderen had some issues using this DSC resource in Azure Automation DSC. After some investigation, I found there was a minor bug in the DSC resource. When you use this DSC resource to install modules, sometimes you may get an error.


Basically, it is complaining that a cmdlet from the module you are trying to install already exists. In order to fix it, I had to update the DSC resource and add the –AllowClobber switch to the Install-Module cmdlet.

I have published the updated version to both PowerShell Gallery (https://www.powershellgallery.com/packages/cPowerShellPackageManagement/1.0.0.1) and GitHub (https://github.com/tyconsulting/PowerShellPackageManagementDSCResource/releases/tag/1.0.0.1)

If you are using this DSC resource at the moment, make sure you check out the update.