Tag Archives: PowerShell

Pushing PowerShell Modules From PowerShell Gallery to Your MyGet Feeds Directly

Written by Tao Yang


Recently I have started using a private MyGet feed and my cPowerShellPackageManagement DSC Resource module to manage PowerShell modules on my lab servers.

When new modules are released to the PowerShell Gallery (e.g. the Azure modules), I’d normally use Install-Module to install them on test machines, publish the tested modules to my MyGet feed, and then my servers would pick up the new modules.

Although I can use the Publish-Module cmdlet to upload a module located locally on my PC to the MyGet feed, it can be really time consuming when the module is big (e.g. some of the Azure modules). It only took me a few minutes to figure out how to push modules directly from the PowerShell Gallery (or any NuGet feed) to my MyGet feed.
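For reference, publishing a locally installed module to a MyGet feed with Publish-Module looks roughly like this (the feed name, URLs and API key are placeholders):

# Register the MyGet feed as a PowerShell repository (one-off step)
Register-PSRepository -Name 'MyGet' `
    -SourceLocation 'https://www.myget.org/F/<feed-name>/api/v2' `
    -PublishLocation 'https://www.myget.org/F/<feed-name>/api/v2/package' `
    -InstallationPolicy Trusted

# Publish a module that is already installed locally - slow when the module is large
Publish-Module -Name 'AzureRM.Profile' -Repository 'MyGet' -NuGetApiKey '<your MyGet API key>'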

To configure it, go to “Package Sources” under the MyGet feed, and click “Add package source…”


Then choose NuGet feed, and fill out the name and source:

Name: PowerShellGallery

Source: https://www.powershellgallery.com/api/v2/


Once added, I can search PowerShell Gallery and add packages directly to MyGet.


Scripting Azure Automation Module Imports Directly from MyGet or PowerShell Gallery

Written by Tao Yang

There are a few ways to add PowerShell modules to Azure Automation accounts:

1. Via the Azure Portal, by uploading the module zip file from your local computer.


2. If the module is located in the PowerShell Gallery, you can push it to your Automation Account directly from the gallery.


3. Use the New-AzureRmAutomationModule cmdlet from the AzureRM.Automation module.

One limitation of the New-AzureRmAutomationModule cmdlet is that the module must be zipped and located somewhere online that Azure has access to. You will need to specify the location using the -ContentLink parameter. In the past, in order to script the module deployment, even when the module was located in the PowerShell Gallery, I had to save the module to a place my Automation Account could access (such as an Azure blob storage account, or a release in a public GitHub repo).

Tonight, I was writing a script and I wanted to see if I could deploy modules to my Automation Account directly from a package repository of my choice – other than the PowerShell Gallery, I also have a private MyGet feed that I use for storing my PowerShell modules.

It turned out to be really easy to do; it only took me a few minutes to figure out how. I’ll use a module I wrote in the past called “SendEmail” as an example. It is published in both the PowerShell Gallery and my private MyGet feed.

Importing from PowerShell Gallery

The URL for this module in the PowerShell Gallery is: https://www.powershellgallery.com/packages/SendEmail/1.3

The -ContentLink URI that we need to pass to the New-AzureRmAutomationModule cmdlet would be:

https://www.powershellgallery.com/api/v2/package/SendEmail/1.3

As you can see, all you need to do is add “api/v2/” to the URI. The PowerShell command would be something like this:
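(The resource group and Automation account names below are placeholders.)

# Import the SendEmail module into the Automation Account directly from the PowerShell Gallery
New-AzureRmAutomationModule -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAutomationAccount' -Name 'SendEmail' -ContentLink 'https://www.powershellgallery.com/api/v2/package/SendEmail/1.3'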

Importing from a private MyGet feed

For a private MyGet feed, you can access it by embedding the API key into the URL:


The URL for my module would be: “http://www.myget.org/F/<Your MyGet feed name>/auth/<MyGet API Key>/api/v2/package/<Module Name>/<Module Version>”

e.g. for my SendEmail module, the PowerShell command would be something like this:
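(Again, the resource group, account name, feed name and API key are placeholders.)

# Import the SendEmail module from a private MyGet feed, with the API key embedded in the URL
New-AzureRmAutomationModule -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAutomationAccount' -Name 'SendEmail' -ContentLink 'https://www.myget.org/F/<Your MyGet feed name>/auth/<MyGet API Key>/api/v2/package/SendEmail/1.3'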

Importing from a public MyGet feed

If the module is located in a public MyGet feed, then the API key is not required. The URI for the module is very similar to the PowerShell Gallery one; you just need to embed “api/v2/” into the original URI:

https://www.myget.org/F/<MyGet Public Feed Name>/api/v2/package/<Module Name>/<Module Version>

The PowerShell command would be something like this:
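(Feed and account names below are placeholders.)

# Import the SendEmail module from a public MyGet feed - no API key required
New-AzureRmAutomationModule -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAutomationAccount' -Name 'SendEmail' -ContentLink 'https://www.myget.org/F/<MyGet Public Feed Name>/api/v2/package/SendEmail/1.3'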

PowerShell DSC Resource for Managing Repositories and Modules

Written by Tao Yang

Introduction

PowerShell version 5 has introduced a new feature that allows you to install packages (such as PowerShell modules) from NuGet repositories. If you have used cmdlets such as Find-Module, Install-Module or Uninstall-Module, then you have already taken advantage of this awesome feature.

By default, a Microsoft-owned public repository, the PowerShell Gallery, is configured on all computers running PowerShell version 5, and when you use Find-Module or Install-Module, you are pulling the modules from the PowerShell Gallery.

Ever since I started using PowerShell v5, I’ve discovered some challenges managing modules for machines in my environment:

  • Lack of a fully automated way to push modules to a group of computers
  • Module version inconsistency between computers
  • Need of a private repository

Let me elaborate on each of the points listed above.

Lack of a fully automated way to push modules to a group of computers

Back in the old days (pre WMF v5), I used to package PowerShell modules into MSIs and use ConfigMgr to deploy the MSI to target computers. Although it’s not too hard to package a module into an MSI, this method is really time consuming, not to mention it also requires ConfigMgr. In PowerShell v5, I can write a script that utilises PowerShell remoting to push modules to remote machines, but this is still a manual process, and it may not be a viable solution for a large group of computers.

Module version inconsistency between computers

Over time, modules get updated and new modules get released from various sources. I often find module versions become inconsistent among computers, and there is no automated way to update computers when a new version is released.

Need of a private repository

The PowerShell Gallery is public: everything you publish to it will be available to the entire world. Organisations often write modules specifically for internal use, and may not want to share them with the rest of the world.

Before I dive into the main topic, I’d like to discuss what I have done for implementing private repositories.

Private Repositories

PowerShell PackageManagement uses NuGet repositories. I found the following solutions available:

MyGet is a SaaS (Software as a Service) based repository hosted in the cloud. Although you can create your own feeds, private feeds come with a price tag (free accounts only allow you to create public feeds that everyone can access).

ProGet is an on-premises solution. To install it, you will need a web server (and optionally a SQL server) within your network. It comes in free, basic and enterprise editions; the feature comparison is located here: http://inedo.com/proget/pricing/features-by-edition

Since both MyGet and ProGet offer NFR (Not For Resale) licenses to Microsoft MVPs, I have tested both in my lab environment. They both work pretty well. I did not bother to set up a free private NuGet repository (the third option).

These days, I find myself writing more and more PowerShell modules for different projects. During the development phase, I’d normally use a feed hosted on my ProGet server because it is located in my lab, so it’s faster to publish and download modules. Once a module is ready, I’d normally publish it to MyGet for general consumption because it’s a SaaS based application, and both my lab machines and Azure IaaS machines have no problem accessing it.

DSC Resource cPowerShellPackageManagement

In order to overcome the other two challenges I’m facing (automated module deployment and version inconsistency), I have created a DSC resource module called cPowerShellPackageManagement.

According to the DSC naming standard, the first letter ‘c’ indicates it is a community resource, and, as the rest of the name suggests, it is used to manage PowerShell packages.

This DSC resource module contains 2 resources:

  • cPowerShellRepository – used to register or unregister specific NuGet feeds on computers running PowerShell v5 and above.
  • cPowerShellModuleManagement – used to install / uninstall modules on computers running PowerShell v5 and above

cPowerShellRepository

Syntax:

To register a feed, you will need to specify some basic information such as PublishLocation and SourceLocation. You can also set Ensure = Absent to unregister the feed with the name specified in the Name parameter.

When not specified, the InstallationPolicy field defaults to “Untrusted”. If you’d like to mark the repository as trusted, set this value to “Trusted”.
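A minimal sketch of a cPowerShellRepository block inside a DSC configuration, based on the properties described above (treat this as an illustration; the exact schema may differ slightly):

cPowerShellRepository MyGetFeed
{
    Name               = 'MyGetFeed'
    SourceLocation     = 'https://www.myget.org/F/<feed-name>/api/v2'
    PublishLocation    = 'https://www.myget.org/F/<feed-name>/api/v2/package'
    InstallationPolicy = 'Trusted'
    Ensure             = 'Present'
}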

Note: since repository registration is per user (as opposed to a machine-based setting) and the DSC configuration is executed under the LocalSystem context, you will not be able to see the repository added by this resource if you run the Get-PSRepository cmdlet under your own user account. If you start PowerShell under LocalSystem by using PsExec (run psexec /i /s /d powershell.exe), you will be able to see the repository:


cPowerShellModuleManagement

Syntax:

  • PSModuleName – PowerShell module name. When this is set to ‘all’, all modules from the specified repository will be installed. So please do not use ‘all’ against PSGallery!!
  • RepositoryName – Name of the repository where module will be installed from. This can be a public repository such as PowerShell Gallery, or your privately owned repository (i.e. your ProGet or MyGet feeds). You can use the cPowerShellRepository resource to configure the repository.
  • PSModuleVersion – This is an optional field. When used, only the specified version will be installed (or uninstalled). If not specified, the latest version of the module from the repository will be used. This field will not impact other versions that are already installed on the computer (i.e. when installing the latest version, earlier versions will not be uninstalled).
  • MaintenanceStartHour, MaintenanceStartMinute and MaintenanceLengthMinute – Since the LCM will run the DSC configuration on a pre-configured interval, you may not want to install / uninstall modules during business hours. Therefore, you can set the maintenance start hour (0-23) and start minute (0-59) to specify the start time of the maintenance window. MaintenanceLengthMinute represents the length of the maintenance window in minutes. These fields are optional, when specified, module installation and uninstallation will only take place when the LCM runs the configuration within the maintenance window. Note: Please make sure the MaintenanceLengthMinute is greater than the value configured for the LCM ConfigurationModeFrequencyMins property.
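A minimal sketch of a cPowerShellModuleManagement block, again based only on the parameters described above:

cPowerShellModuleManagement SharePointSDK
{
    PSModuleName            = 'SharePointSDK'
    PSModuleVersion         = '2.1.0'   # optional - omit to always install the latest version
    RepositoryName          = 'MyGetFeed'
    MaintenanceStartHour    = 22        # optional maintenance window starting at 22:00
    MaintenanceStartMinute  = 0
    MaintenanceLengthMinute = 120       # window length of 120 minutes
}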


Sample Configuration

Here are some sample configurations to demonstrate the usage of these DSC resources.

1. Register to an On-Prem ProGet feed and install all modules from the feed

Using this configuration, I can manage the modules at the repository feed level. If I add or update a module in the feed, the DSC LCM on each configured computer will automatically install the newly added (or updated) module the next time the configuration is refreshed.
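A sketch of what this configuration might look like (the ProGet server URL is a placeholder, and the property names follow the descriptions given earlier):

Configuration InstallAllModulesFromProGet
{
    Import-DscResource -ModuleName cPowerShellPackageManagement

    Node 'localhost'
    {
        cPowerShellRepository ProGetFeed
        {
            Name               = 'ProGetFeed'
            SourceLocation     = 'http://proget01.yourdomain.local/nuget/PowerShell'
            PublishLocation    = 'http://proget01.yourdomain.local/nuget/PowerShell'
            InstallationPolicy = 'Trusted'
            Ensure             = 'Present'
        }

        cPowerShellModuleManagement AllModules
        {
            PSModuleName   = 'all'        # install every module published to this feed
            RepositoryName = 'ProGetFeed'
        }
    }
}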

2. Register to a feed hosted on MyGet, and install several specific modules

In this example, I’ve specified that a particular module can be installed at any time (the Gac module), while another module can only be installed (or updated) within a specific time window (the SharePointSDK module).
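A sketch of this second configuration (the MyGet feed URL and the maintenance window values are illustrative):

Configuration InstallModulesFromMyGet
{
    Import-DscResource -ModuleName cPowerShellPackageManagement

    Node 'localhost'
    {
        cPowerShellRepository MyGetFeed
        {
            Name               = 'MyGetFeed'
            SourceLocation     = 'https://www.myget.org/F/<feed-name>/api/v2'
            PublishLocation    = 'https://www.myget.org/F/<feed-name>/api/v2/package'
            InstallationPolicy = 'Trusted'
            Ensure             = 'Present'
        }

        cPowerShellModuleManagement Gac
        {
            # can be installed or updated at any time
            PSModuleName   = 'Gac'
            RepositoryName = 'MyGetFeed'
        }

        cPowerShellModuleManagement SharePointSDK
        {
            # only installed or updated within the 22:00 maintenance window
            PSModuleName            = 'SharePointSDK'
            RepositoryName          = 'MyGetFeed'
            MaintenanceStartHour    = 22
            MaintenanceStartMinute  = 0
            MaintenanceLengthMinute = 120
        }
    }
}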

Download and Install Locations

This DSC Resource has been published to PowerShellGallery: https://www.powershellgallery.com/packages/cPowerShellPackageManagement

The project is also located on Github: https://github.com/tyconsulting/PowerShellPackageManagementDSCResource

Special Thanks

I’d like to thank my MVP friends Jakob G Svendsen (@JakobGSvendsen), Pete Zerger (@pzerger), Daniele Grandini (@DanieleGrandini) and James Bannan (@JamesBannan) who provided feedback and helped me testing the modules.

PowerShell Module for OMS HTTP Data Collector API

Written by Tao Yang

Background

Earlier today, the OMS Product Group released the OMS HTTP Data Collector API to public preview. If you haven’t read the announcement, you can read this blog post written by the PM of this feature, Evan Hissey, first.

As a Cloud and Datacenter Management MVP, I’ve had private preview access to this feature for a few months now, and I actually developed a solution using this API in a customer engagement with my friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) just over a month ago. I was really impressed with the potential opportunities this feature may bring us, and I’ve been spamming Evan’s inbox asking for the release date so I can blog about it and also present it at user group meetups.

Since most of us wouldn’t like having to deal with HTTP headers, bodies, authorizations and other overhead we have to put into our code in order to use this API, I have developed a PowerShell module to help us easily utilize this API.

Introducing OMSDataInjection PowerShell Module

This module was developed about two months ago; I was waiting for the API to become public so I could release it. Now the wait is over, and I can finally release it.

This module contains only one public function: New-OMSDataInjection. This function is well documented in a proper help file; you can access it via Get-Help New-OMSDataInjection -Full. I have added 2 examples to the help file too:

————————– EXAMPLE 1 ————————–

PS C:\>$PrimaryKey = Read-Host -Prompt 'Enter the primary key'
$ObjProperties = @{
    Computer = $env:COMPUTERNAME
    Username = $env:USERNAME
    Message  = 'This is a test message injected by the OMSDataInjection module. Input data type: PSObject'
    LogTime  = [Datetime]::UtcNow
}
$OMSDataObject = New-Object -TypeName PSObject -Property $ObjProperties
$InjectData = New-OMSDataInjection -OMSWorkSpaceId '8eb61d08-133c-401a-a45b-0e611194779f' -PrimaryKey $PrimaryKey -LogType 'OMSTestData' -UTCTimeStampField 'LogTime' -OMSDataObject $OMSDataObject

Injecting data using a PS object by specifying the OMS workspace Id and primary key
————————– EXAMPLE 2 ————————–

PS C:\>$OMSConnection = Get-AutomationConnection 'OMSConnection'
$OMSDataJSON = @"
{
    "Username": "administrator",
    "Message": "This is a test message injected by the OMSDataInjection module. Input data type: JSON",
    "LogTime": "Tuesday, 28 June 2016 9:08:15 PM",
    "Computer": "SERVER01"
}
"@
$InjectData = New-OMSDataInjection -OMSConnection $OMSConnection -LogType 'OMSTestData' -UTCTimeStampField 'LogTime' -OMSDataJSON $OMSDataJSON

Injecting data using JSON formatted string by specifying the OMSWorkspace Azure Automation / SMA connection object (to be used in a runbook)

This PS module comes with the following features:

01. A Connection object for using this module in Azure Automation and SMA.

Once imported into your Azure Automation account (or SMA for the ‘old skool’ folks), you will be able to create connection objects that contain your OMS workspace Id, primary key and (optionally) secondary key:


And as shown in Example 2 listed above, in your runbook, you can retrieve this connection object and use it when calling the New-OMSDataInjection function.

02. Fall back to the secondary key if the primary key has failed

When the optional secondary key is specified and the web request using the primary key fails, the module will fall back to the secondary key and retry the web request. This ensures your scripts / automation runbooks will not be interrupted while you are following best practice and cycling your keys.

03. Supports two types of input: JSON and PSObject

As you can see from Evan’s post, this API expects a JSON object as the HTTP body, containing the data to be injected into OMS. When I started testing this API a few months ago, my good friend and fellow MVP Stanislav Zhelyazkov (@StanZhelyazkov) suggested that instead of writing plain JSON, it’s better to put everything into a PSObject and then convert it to JSON in PowerShell, so we don’t mess up the format and type of each field. I think it was a good idea, so I have coded the module to take either a JSON string or a PSObject that contains the data to be injected into OMS.

Sample Script and Runbook

I’ve created a sample script and a runbook to help you get started. They are also included in the Github repository for this module (link at the bottom of this article):

Sample Script: Test-OMSDataInjection.ps1

Sample Runbook: Test-OMSDataInjectionRunbook

Exploring Data in OMS

Once the data is injected into OMS, if you are using a new data type, it can take a while (a few hours) for all the fields to be available in OMS.

For example, here is the data injected by the sample script and the Azure Automation runbook (executed on Azure):


All the fields that you have defined are stored as custom fields in your OMS workspace:


Please keep in mind that since the Custom Fields feature is still in the preview phase, there’s a limit of 100 custom fields per workspace at this stage (https://azure.microsoft.com/en-us/documentation/articles/log-analytics-custom-fields/), so please be mindful of this limitation when you are building your custom solutions using the HTTP Data Collector API.

Where to Download This Module?

I have published this module in the PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection. If you are using PowerShell version 5 or above, you can install it directly from there: Install-Module -Name OMSDataInjection -Repository PSGallery

You can also download it from its GitHub repo: https://github.com/tyconsulting/OMSDataInjection-PSModule/releases

Summary

In the past, we’ve had the OMS Custom View Designer to help us visualise the data that we already have in OMS Log Analytics; what we were missing was a native way to inject data into OMS. Now, with the release of this API, the gap has been filled. As Evan mentioned in his blog post, by coupling this API with the OMS View Designer (and even throwing Power BI into the mix), you can develop some really fancy solutions.

On the 21st of September (3 weeks from now), I will be presenting at the Melbourne Microsoft Cloud and Datacenter Meetup (https://www.meetup.com/Melbourne-Microsoft-Cloud-and-Datacenter-Meetup/events/233154212/); my topic is Developing Your OWN Custom OMS Solutions. I will be doing live demos, creating solutions using the HTTP Data Collector API as well as the Custom View Designer. If you are from Melbourne, I encourage you to attend. I am also planning to record this session and publish it on YouTube later.

Lastly, if you have any suggestions for this PowerShell module, please feel free to contact me!

ConfigMgr OMS Connector

Written by Tao Yang

Earlier this week, Microsoft released a new feature in System Center Configuration Manager 1606 called the OMS Connector:


As we all know, OMS supports computer groups. We can either manually create computer groups in OMS using OMS search queries, or import AD and WSUS groups. With the ConfigMgr OMS Connector, we can now import ConfigMgr device collections into OMS as computer groups.

Instead of using the OMS workspace ID and keys to access OMS, the ConfigMgr OMS connector requires an Azure AD application and service principal. My friend and fellow Cloud and Data Center Management MVP Steve Beaumont blogged about his setup experience a few days ago. You can read Steve’s post here: http://www.poweronplatforms.com/configmgr-1606-oms-connector/. As you can see from Steve’s post, provisioning the Azure AD application for the connector can be pretty complex if you are doing it manually – it involves many steps and you have to use both the old Azure portal (https://manage.windowsazure.com) and the new Azure portal (https://portal.azure.com).

To simplify the process, I have created a PowerShell script to create the Azure AD application for the ConfigMgr OMS Connector. The script is located in my GitHub repository: https://github.com/tyconsulting/BlogPosts/tree/master/OMS

In order to run this script, you will need the following:

  • The latest version of the AzureRM.Profile and AzureRM.Resources PowerShell modules
  • An Azure subscription admin account from the Azure Active Directory that your Azure subscription is associated with (the UPN must match the AAD directory name)

When you launch the script, you will firstly be prompted to login to Azure:


Once you have logged in, you will be prompted to select the Azure Subscription and then specify a display name for the Azure AD application. If you don’t assign a name, the script will try to create the Azure AD application under the name “ConfigMgr-OMS-Connector”:


The script creates the AAD application and assigns it the Contributor role on your subscription:
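The core of the script boils down to a handful of AzureRM cmdlets. A simplified sketch is shown below; the names and the way the client secret is generated are illustrative rather than the exact code from the script:

# Create the Azure AD application that the OMS connector will authenticate as
$appName   = 'ConfigMgr-OMS-Connector'
$clientKey = '<a strong client secret>'
$app = New-AzureRmADApplication -DisplayName $appName `
    -HomePage "https://localhost/$appName" `
    -IdentifierUris "https://localhost/$appName" `
    -Password $clientKey

# Create the service principal for the application
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

# Grant the service principal the Contributor role on the subscription
# (a short wait may be needed for the service principal to propagate)
New-AzureRmRoleAssignment -RoleDefinitionName 'Contributor' -ServicePrincipalName $app.ApplicationId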


At the end of the script, you will see the 3 pieces of information you need to create the OMS connector:

  • Tenant
  • Client ID
  • Client Secret Key

You can simply copy and paste these to the OMS connector configuration.

Once you have configured the connector in ConfigMgr and enabled SCCM as a group source, you will soon start seeing the collection memberships being populated in OMS. You can search them in OMS using a search query such as “Type=ComputerGroup GroupSource=SCCM”:


Based on what I see, the connector runs every 6 hours and any membership additions or deletions will be updated when the connector runs.

For example, if I search for a particular collection over the last 6 hours, I can see this particular collection has 9 members:


During my testing, I deleted 2 computers from this collection a few days ago. If I specify a custom range targeting a 6-hour time window from a few days ago, I can see this collection had 11 members back then:


This could be useful sometimes when you need to track down if certain computers have been placed into a collection in the past.

This is all I have to share today. Until next time, enjoy OMS!

SharePointSDK PowerShell Module Updated to Version 2.1.0

Written by Tao Yang

OK, this blog has been very quiet recently. Due to some work-related requirements, I had to pass a few Microsoft exams, so I have spent most of my time over the last couple of months studying. Firstly, I passed the MCSE Private Cloud re-certification exam, then I passed the 2 Azure exams: 70-532 Developing Microsoft Azure Solutions and 70-533 Implementing Microsoft Azure Infrastructure Solutions. Other than studying and taking exams, I have also been working on a new version of the SharePointSDK PowerShell module in my spare time. I finished everything on my to-do list for this release last night, and I’ve just published version 2.1.0 on the PowerShell Gallery and GitHub:

This new release includes the following updates:

01. Fixed the “format-default : The collection has not been initialized.” error when retrieving various SharePoint objects.

For example, when retrieving a SharePoint list using the Get-SPList function in previous versions, you would get this error:


This error is fixed in version 2.1.0. Now you will get a default view defined in the module:


02. SharePoint client SDK DLLs are now automatically loaded with the module.

I have configured the module manifest to load the SharePoint Client SDK DLLs that are included in the module folder. As a result of this change, the Import-SPClientSDK function is no longer required and has been removed from the module completely.

In the past, the Import-SPClientSDK function would first try to load the required DLLs from the Global Assembly Cache (GAC) and would only fall back to the DLLs located in the module folder if they did not exist in the GAC. Since the Import-SPClientSDK function has been removed, this behaviour has changed in this release. Starting from this release, the module will not try to load the DLLs from the GAC, but will ALWAYS use the copies in the module folder.

03. New-SPListLookupField function now supports adding additional lookup columns.

When adding a lookup field to a SharePoint list, you can choose to include one or more additional columns. For example:


Previous versions of this module did not support adding additional columns when creating a lookup field. In this version, you are able to add them using the “-AdditionalSourceFields” parameter.

04. Various minor bug fixes

Other than the above-mentioned updates, this version also includes various minor bug fixes.

Special Thanks

I’d like to thank my friend and fellow CDM MVP Jakob Gottlieb Svendsen (@JakobGSvendsen) for his feedback. Most of the items updated in this release were a result of Jakob’s feedback.

HybridWorkerToolkit PowerShell Module Updated to Version 1.0.3

Written by Tao Yang

A few days ago, I published a PowerShell module to be used on Azure Automation Hybrid Workers called HybridWorkerToolkit. You can find my blog article HERE.

Yesterday, my good friend and fellow CDM MVP Daniele Grandini (@DanieleGrandini) gave me some feedback, so I’ve updated the module again and incorporated Daniele’s suggestions.

This is the list of updates in this release:

  • A new array parameter for New-HybridWorkerEventEntry called “-AdditionalParameters”. This parameter allows users to insert an array of additional parameters to be added in the event data:


  • A new Boolean parameter for New-HybridWorkerEventEntry called “-LogMinimum”. This is an optional parameter with the default value of $false. When this parameter is set to true, other than the user specified messages and additional parameters, only the Azure Automation Job Id will be logged as event data:


As we all know, we pay for the amount of data that gets injected into our OMS workspace, so this parameter allows you to minimise the size of your events (and thus save money on your OMS spend).

I have published this new release to both GitHub and PowerShell Gallery.

New PowerShell Module HybridWorkerToolkit

Written by Tao Yang

23/04/2016 Update: released version 1.0.3 to GitHub and the PowerShell Gallery. New additions documented in this blog post.

21/04/2016 Update: updated GitHub and the PowerShell Gallery and released version 1.0.2 with a minor bug fix and an updated help file.

Introduction

Over the last few days, I have been working on a PowerShell module for Azure Automation Hybrid Workers. I named this module HybridWorkerToolkit.

This module is designed to run within either a PowerShell runbook or a PowerShell workflow runbook on Azure Automation Hybrid Workers. It provides a few functions that can be called within the runbook. These activities can assist in gathering information about Hybrid Workers and the runbook runtime environment. It also provides a function for logging structured events to the Hybrid Worker’s Windows event logs.

My good friend and fellow MVP Pete Zerger posted a method he developed to use Windows event logs and OMS as a centralised logging solution for Azure Automation runbooks executed on Hybrid Workers. Pete was using the PowerShell cmdlet Write-EventLog to log runbook-related activities to the Windows event log, and these events are then picked up by OMS Log Analytics. This is a very innovative way of using Windows event logs and OMS. However, the event log entries written by Write-EventLog are not structured and lack basic information about your environment and the job runtime. A couple of weeks ago, another friend of mine, Mr. Kevin Holman from Microsoft, also published a PS script that he used to write to Windows event logs with additional parameters.

So I combined Pete’s idea with Kevin’s script, as well as some code I’ve written in the past for Hybrid Workers, and developed this module.

Why do we want to use Windows event logs combined with OMS for logging runbook activities on Hybrid Workers? As Pete explained in his post, it provides a centralised solution where you can query and retrieve these activity logs for all your runbooks from a single location. Additionally, in my experience (also confirmed by a few other friends), when you use Write-Verbose or Write-Output in your runbook and enable verbose logging, the runbook execution time can increase significantly, especially when loading a module with a lot of activities. I’ve seen a runbook that would normally take a minute or two to run with verbose logging turned off end up running for over half an hour after I enabled verbose logging. This is another reason I’ve developed this module: it gives you an alternative option for logging verbose, error, process and output messages.

Functions

This module provides the following 3 functions:

  • Get-HybridWorkerConfiguration
  • Get-HybridWorkerJobRuntimeInfo
  • New-HybridWorkerRunbookLogEntry

Note: Although the job runtime environments differ between PowerShell runbooks and PowerShell Workflow runbooks, I have spent a lot of time together with Pete making sure these activities can be used in exactly the same way in both PowerShell and PowerShell Workflow runbooks.

Get-HybridWorkerConfiguration

This function can be used to get the Hybrid Worker and Microsoft Monitoring Agent configuration. A hash table is returned containing the following configuration properties retrieved from the Hybrid Worker and the MMA agent:

  • Hybrid Worker Group name
  • Automation Account Id
  • Machine Id
  • Computer Name
  • MMA install root
  • PowerShell version
  • Hybrid Worker version
  • System-wide Proxy server address
  • MMA version
  • MMA Proxy URL
  • MMA Proxy user name
  • MMA connected OMS workspace Id

Get-HybridWorkerJobRuntimeInfo

This function retrieves the following information about the Azure Automation runbook and the job runtime. It is returned as a hashtable:

  • Runbook job ID
  • Sandbox Id
  • Process Id
  • Automation Asset End Point
  • PSModulePath environment variable
  • Current User name
  • Log Activity Trace
  • Current Working Directory
  • Runbook type
  • Runbook name
  • Azure Automation account name
  • Azure Resource Group name
  • Azure subscription Id
  • Time taken to start runbook in seconds

New-HybridWorkerRunbookLogEntry

This function can be used to write event log entries. By default, other than the event message itself, the following information is also logged as part of the event (placed under the <EventData> XML tag):

  • Azure Automation Account Name
  • Hybrid Worker Group Name
  • Azure Automation Account Resource Group Name
  • Azure Subscription Id
  • Azure Automation Job Id
  • Sandbox Id
  • Process Id
  • Current Working Directory ($PWD)
  • Runbook Type
  • Runbook Name
  • Time Taken To Start Running in Seconds

This function also has an optional Boolean parameter called ‘-LogHybridWorkerConfig’. When this parameter is set to $true, the event created by this function will also contain the following information about the Hybrid Worker and MMA:

  • Hybrid Worker Version
  • Microsoft Monitoring Agent Version
  • Microsoft Monitoring Agent Install Path
  • Microsoft Monitoring Agent Proxy URL
  • Hybrid Worker server System-wide Proxy server address
  • Microsoft OMS Workspace ID

Sample Runbooks

Sample PowerShell Runbook:
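A rough sketch of such a runbook is shown below. Apart from -LogHybridWorkerConfig, the parameter names (such as -Message) are assumptions, so check Get-Help New-HybridWorkerRunbookLogEntry for the exact syntax:

# Gather Hybrid Worker and job runtime information
$HybridWorkerConfig = Get-HybridWorkerConfiguration
$JobRuntimeInfo     = Get-HybridWorkerJobRuntimeInfo

# Write a structured event to the Hybrid Worker's Windows event log
# (-Message is an assumed parameter name; -LogHybridWorkerConfig is documented above)
New-HybridWorkerRunbookLogEntry -Message 'Runbook started.' -LogHybridWorkerConfig $true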

Sample PowerShell Workflow Runbook
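And a workflow equivalent (same assumptions as above), which simply wraps the identical calls in a workflow block:

workflow Test-HybridWorkerOutput-PSW
{
    # The calls are identical to the PowerShell runbook version
    $HybridWorkerConfig = Get-HybridWorkerConfiguration
    $JobRuntimeInfo     = Get-HybridWorkerJobRuntimeInfo
    New-HybridWorkerRunbookLogEntry -Message 'Runbook started.' -LogHybridWorkerConfig $true
}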

As you can see, the way to call these functions is exactly the same in PowerShell and PowerShell Workflow runbooks.

Hybrid Worker Configuration output:


Hybrid Worker Job Runtime Info output:


Event generated (with basic information / without setting -LogHybridWorkerConfig to $true):


Event generated (when setting -LogHybridWorkerConfig to $true):


Consuming collected events in OMS

Once you have collected these events in OMS, you can use search queries to find them, and you can also create OMS alerts to notify you using your preferred methods.

Searching Events in OMS

For example, I can use this query to get all events logged by a particular runbook:

Type=Event "RunbookName: Test-HybridWorkerOutput-PSW"


or use this query to get all events for a particular job:

Type=Event "JobId: 73A3827D-73F8-4ECC-9DE1-B9340FB90744"


OMS Alerts

For example, if I want to create an OMS alert for any Error events logged by New-HybridWorkerRunbookLogEntry, I can use a query like this one:

Type=Event Source=AzureAutomation?Job* EventLevelName=Error


Download / Deploy this module

I have published this module on Github as well as PowerShell Gallery:

GitHub Repository: https://github.com/tyconsulting/HybridWorkerToolkit

PowerShell Gallery: http://www.powershellgallery.com/packages/HybridWorkerToolkit/1.0.3

Credit

I’d like to thank Pete and Kevin for the ideas in the first place, and I’d also like to thank Pete, Jakob Svendsen, Daniele Grandini and Kieran Jacobsen for the testing and feedback!

A Major Update for the SharePointSDK PS Module

Written by Tao Yang

Introduction

This blog has been a bit quiet over the last few weeks. This is because I have been really really busy. I have spent a lot of time working on an updated version of the SharePointSDK PS module. Just in case you have not played with this module, here’s some background info:

Just over a year ago, I posted a PowerShell / SMA / Azure Automation module on this blog called SharePointSDK. A few months ago, I also published this module on GitHub and the PowerShell Gallery. This module was designed to help automate operations around SharePoint lists (i.e. CRUD operations for SharePoint list items). Coupling SharePoint (both the on-premises version and SharePoint Online) with Azure Automation (or even SMA) is becoming more and more common in the community when designing automation solutions. This module provides ways for your automation runbooks to interact with SharePoint list items.

However, the original 1.0 release was really basic, and there was still a lot I wanted to cover in this module. I’m pleased to announce that the new major release (version 2.0.1) is now available on both GitHub and the PowerShell Gallery.

What’s New?

I’ve included the following updates in version 2.0.1:

  • 26 additional functions!
  • Updated the SharePoint CSOM (Client Component SDK) DLLs to the latest version in the module.
  • Created a separate help file for the module. Get-Help is now fully working
  • Various bug fixes

The table below lists all the functions that are shipped in the current release (version 2.0.1):

Function | Description | Released in Version
Import-SPClientSDK | Load SharePoint Client SDK DLLs. | 1.0
New-SPCredential | Create a SharePoint credential that can be used for authenticating to a SharePoint (Online or On-Premises) site. | 1.0
Get-SPServerVersion | Get the SharePoint server version. | 1.0
Get-SPListFields | Get all fields from a list on a SharePoint site. | 1.0
Add-SPListItem | Add a list item to the SharePoint site. | 1.0
Get-SPListItem | Get all items from a list on a SharePoint site, or a specific item by specifying the list item ID. | 1.0
Remove-SPListItem | Delete a list item from the SharePoint site. | 1.0
Update-SPListItem | Update a list item on the SharePoint site. | 1.0
Get-SPListItemAttachments | Download all attachments from a SharePoint list item. | 1.0
Add-SPListItemAttachment | Upload a file as a SharePoint list item attachment. | 1.0
Remove-SPListItemAttachment | Remove a SharePoint list item attachment. | 1.0
New-SPList | Create a new list on the SharePoint site. | 2.1
Remove-SPList | Remove a list from the SharePoint site. | 2.1
Get-SPList | Get a list from the SharePoint site. | 2.1
New-SPListLookupField | Create a new lookup field for a SharePoint list. | 2.1
New-SPListCheckboxField | Create a new checkbox field for a SharePoint list. | 2.1
New-SPListSingleLineTextField | Create a new single-line text field for a SharePoint list. | 2.1
New-SPListMultiLineTextField | Create a new multi-line text field for a SharePoint list. | 2.1
New-SPListNumberField | Create a new number field for a SharePoint list. | 2.1
New-SPListChoiceField | Create a new choice field for a SharePoint list. | 2.1
New-SPListDateTimeField | Create a new date/time field for a SharePoint list. | 2.1
New-SPListHyperLinkField | Create a new hyperlink or picture field for a SharePoint list. | 2.1
New-SPListPersonField | Create a new person or group field for a SharePoint list. | 2.1
Remove-SPListField | Remove a field from a SharePoint list. | 2.1
Update-SPListField | Update a SharePoint list field. | 2.1
Set-SPListFieldVisibility | Set the visibility of a SharePoint list field. | 2.1
Get-SPGroup | Get a single group or all groups from the SharePoint site. | 2.1
New-SPGroup | Create a new SharePoint group. | 2.1
New-SPGroupMember | Add a user to a SharePoint group. | 2.1
Remove-SPGroupMember | Remove a user from a SharePoint group. | 2.1
Clear-SPSiteRecycleBin | Empty the SharePoint site recycle bin. | 2.1
Get-SPSiteTemplate | Get available site template(s) from the SharePoint site. | 2.1
New-SPSubSite | Create a new SharePoint sub site. | 2.1
Get-SPSubSite | Get all SharePoint sub sites from a root site. | 2.1
Remove-SPSubSite | Delete a SharePoint sub site. | 2.1
Add-SPListFieldToDefaultView | Add a SharePoint list field to the list's default view. | 2.1
Remove-SPListFieldFromDefaultView | Remove a SharePoint list field from the list's default view. | 2.1

As you can see, the previous version shipped 11 functions, and 26 additional functions have been added in the current release (2.0.1). With this release, other than SharePoint list items, we are also able to manage SharePoint lists, list fields, groups, group members, and even sub sites. I have included functions to create what I believe are the most common list fields (as highlighted below):


Future Plans

At this stage, there are still a few things I’d like to cover in this module, but I simply do not have time. Since I have reached another milestone, I have decided to release this version now and roll the other ideas into a future release.

In the second week of March, I will be presenting at SCU APAC (Kuala Lumpur, Malaysia) and SCU Australia (Melbourne). I am presenting 2 identical sessions at both locations:

  • Be a hero and save the day with OMS and Power BI (Co-present with CDM MVP Alex Verkinderen)
  • Automation for IT Ops with OMS and Azure Automation (Co-present with CDM MVP Pete Zerger)

As part of the demos I have prepared for the Azure Automation session with Pete, I will cover how I’m using this module as part of my automation solutions.

After SCU, I am planning to write another blog post for my Automating OpsMgr series which will cover one of our SCU demos (I know, it has been a long time since my last post in that series). I will also cover this module in more detail in that upcoming post.

Download the Module

So for now, if you’d like to give this module a try, you can find it on both GitHub and the PowerShell Gallery. All functions are fully documented in the help file. You can access the help documentation as well as code examples using Get-Help with the -Full switch.

Lastly, if you have any feedback, or suggestions for future releases, please feel free to drop me an email.

This is all I have to share for today. Until next time, happy automating!

Automating OpsLogix Oracle MP Configuration

Written by Tao Yang

Introduction

One of the flagship management packs from OpsLogix is the Oracle Database MP. This MP provides several GUI-driven wizards to help you create your own monitoring solutions for Oracle by leveraging the OpsMgr management pack templates (https://technet.microsoft.com/en-au/library/hh457614.aspx). At this stage, the OpsLogix Oracle MP provides the following templates:

01. Oracle Alert Rule template

This template allows you to create a rule that checks a value from your Oracle environment and generates an alert when the value is detected or missing, depending on the configuration you have specified.

02. Oracle Performance Collection Rule template

This template allows you to create a rule that will collect performance data from your Oracle environment in order to visualize data on the performance view and reports.

03. Oracle Two-State Monitor Template

This template allows you to create a monitor that will check the health of an element according to the configuration that you have specified in the wizard. It will generate alerts when the monitor becomes unhealthy.

Like any other OpsMgr management pack templates, the above mentioned templates can be found in the Authoring pane of the OpsMgr console, under “Management Pack Templates”:


Some Background on Management Pack Templates

The MP templates provide great ways for users to create complex monitoring scenarios without having to use MP authoring tools such as VSAE or Silect MP Author. The MP templates are designed to satisfy specific monitoring needs (e.g. Windows service monitoring, TCP port monitoring, etc.). From an OpsMgr admin and operator point of view, they are great, because each template provides a user-friendly, GUI-driven wizard for you to create your monitoring solutions.

From an MP developer point of view, these templates are not easy to create – not only because you need to define the templates in the MP, but most of the time you also need to design the UI pages to be used in the wizard, which is very time consuming (not to mention these UI pages are written in C#). I have done it several times, and believe me, they are not easy! So every time I see an MP that offers management pack templates, I really appreciate the effort put in by the developers.

Although the management pack templates provide user-friendly, GUI-driven wizards for users to create their monitoring solutions, in my opinion the biggest drawback is also the GUI wizard. It means you HAVE TO use the GUI wizard – which can become an issue when you have a lot of stuff to configure.

Let me give you an example based on my own experience. A few months ago, I was away attending a conference overseas and a customer needed to create hundreds of instances for the Windows Service monitoring template. Because they didn’t want to wait for my return, I was told someone spent a few days clicking through the wizard many, many times.

So what other options do we have? Fortunately, the management pack template instances can be created via OpsMgr SDK.

Automating MP Template Instance Creation

If you have been following my blog series “Automating OpsMgr”, you may have already read Part 17 of the series, Creating Windows Service Management Pack Template Instance, where I demonstrated a runbook leveraging the OpsMgrExtended PowerShell module that enabled people to create a management pack template instance (in this case, the Windows Service template) using one line of PowerShell. This was a great example of how to create template instances at scale.

OK, let’s go back to the OpsLogix Oracle MP… Just to put it out there, my experience with Oracle DB is very limited. Throughout the years I’ve spent in IT, I’ve only been dealing with Microsoft SQL servers. Based on my experience with SQL, I know that every DBA has a set of queries they regularly use to monitor their SQL environments, and I assume this is also the case for Oracle. So, one of the first concerns I had when I started playing with this MP was that creating user-defined monitoring scenarios could be very time consuming when using the management pack template wizards. Therefore, I spent a few hours today and produced 3 separate PowerShell functions that people can use to create instances of the 3 templates mentioned above. These functions are:

  1. New-OpsLogixOracleAlertTemplateInstance
  2. New-OpsLogixOraclePerfTemplateInstance
  3. New-OpsLogixOracle2StateMonitorTemplateInstance

Pre-requisites:

These functions require the OpsMgrExtended module on the computer where you are running the script. Please follow the instructions and set up this module first.

Download Link:

I have uploaded the code for the above-mentioned PowerShell functions to GitHub. You can download them from https://github.com/tyconsulting/OpsMgr-SDK-Scripts/tree/master/OpsLogix%20Oracle%20MP%20Scripts

Now, let’s test them. I will use the -Verbose switch when calling these functions so you can see the verbose messages.

01. Creating a test MP

Firstly, I’ll create a test MP using the New-OMManagementPack command from the OpsMgrExtended module:
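Something along these lines (the SDK server name and MP name are placeholders, and the parameter names are based on my recollection of the OpsMgrExtended module, so verify them with Get-Help New-OMManagementPack):

# Create an empty test management pack to hold the template instances
New-OMManagementPack -SDK 'OpsMgrMS01' -Name 'Test.OpsLogix.Oracle.Config' -DisplayName 'Test OpsLogix Oracle Configuration' -Version '1.0.0.0' -Verbose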


02. Create an instance for the alert rule template (using PowerShell Splatting)

Calling the New-OpsLogixOracleAlertTemplateInstance function:
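The splatting pattern looks roughly like this. The parameter names in the hashtable are illustrative placeholders only; the real parameter names and test values are in the test.ps1 script mentioned at the end of this post:

# Build a hashtable of parameters and splat it into the function
$AlertRuleParameters = @{
    SDK                = 'OpsMgrMS01'                  # placeholder management server name
    ManagementPackName = 'Test.OpsLogix.Oracle.Config' # placeholder MP created in step 01
    Verbose            = $true
    # ...remaining template-specific parameters go here (see test.ps1)
}
New-OpsLogixOracleAlertTemplateInstance @AlertRuleParameters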


03. Create an instance for the performance collection template

Calling the New-OpsLogixOraclePerfTemplateInstance function:


04. Create an instance for the Two-State Monitor template

Calling the New-OpsLogixOracle2StateMonitorTemplateInstance function:


Note: There is also a test.ps1 script in this Github repository. It contains the test parameters used as shown in the screenshots above.

Conclusion

As you may have noticed, these functions also have a parameter set to support the SMA / Azure Automation connection object (defined in the OpsMgrExtended module). If you are planning to make this part of your automation solution, you can simply change these from PowerShell functions to runbooks and use the -SDKConnection parameter to establish a connection to the management group. This should be very straightforward; you can refer to my previous posts in the Automating OpsMgr blog series for more details.

I hope these functions will help customers who are deploying Oracle monitoring solutions using the OpsLogix Oracle MP. For example, if you need to create a lot of these instances, you can create a CSV file with all the required parameters and values, and then write a very simple PowerShell script to read the CSV file and call the appropriate functions. I’ve done the hard work for you; the rest should be pretty easy.

Lastly, if anyone would like to evaluate the OpsLogix Oracle MP, OpsLogix can be contacted via email at sales@opslogix.com.