Category Archives: PowerShell

PowerShell Module for Managing Azure Table Storage Entities

Written by Tao Yang

Introduction

Firstly, apologies for not being able to blog for six weeks; I have been really busy lately. As part of a project that I’m working on, I have been dealing with Azure Table storage and its REST API over the last couple of weeks. I have written a few Azure Function apps in C# as well as some Azure Automation runbooks in PowerShell that involve inserting, querying and updating records (entities) in Azure tables. I struggled a little during the development of these function apps and runbooks because I couldn’t find many good code examples, and I personally believe this REST API is not well documented on Microsoft’s documentation site (https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/table-service-rest-api). Therefore, I have spent the last two days developing a PowerShell module for managing the lifecycle of Azure Table entities. This module can be used to perform CRUD (Create, Read, Update, Delete) operations on Azure Table entities.

AzureTableEntity PowerShell Module

This PowerShell module is named AzureTableEntity; it can be found on both GitHub and the PowerShell Gallery:

This module offers the following 4 functions:

  • Get-AzureTableEntity – search Azure Table entities by specifying a search string.
  • New-AzureTableEntity – insert one or more entities into Azure table storage.
  • Update-AzureTableEntity – update one or more entities in Azure table storage.
  • Remove-AzureTableEntity – remove one or more entities from Azure table storage.

Note: All functions have been properly documented in the help file. You can use the Get-Help cmdlet to access the help for each of them.

Get-AzureTableEntity

By default, when performing a query operation, the Azure Table REST API only returns up to 1,000 entities, or as many entities as the search can return within 5 seconds. This function has a ‘-GetAll’ parameter that can be used to return all search results from a large table. The default value of this parameter is $true.

The search results returned by the API are deserialised. As a result, complex data types such as datetime are returned as strings. If you want any datetime fields from the search results returned as native datetime values, you can set the “-ConvertDateTimeFields” parameter to $true. Please note this can potentially increase the script execution time when dealing with a large set of search results.

Hint: You can easily build your search string using the Azure Storage Explorer.
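
For illustration, a query might look like the sketch below. Only the -GetAll and -ConvertDateTimeFields parameter names come from this post; the remaining parameter names and values are my assumptions for the sake of the example.

# Hypothetical example – parameter names other than -GetAll and -ConvertDateTimeFields are assumptions
$SearchResult = Get-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -StorageAccountAccessKey $StorageKey -TableName 'ServerInventory' `
    -QueryString "(PartitionKey eq 'Servers') and (Enabled eq true)" `
    -ConvertDateTimeFields $true -GetAll $true
$SearchResult.Count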

New-AzureTableEntity

This function can be used to insert a single entity or bulk insert up to 100 entities (provided the total payload size is less than 4 MB).

Please make sure both “PartitionKey” and “RowKey” are included in the entity. The data type for these fields must be string.

For example, instead of setting RowKey = 1, you should set RowKey = “1”, because the values of both PartitionKey and RowKey must be strings.
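
As a hedged illustration, inserting a single entity might look like the following sketch. The -Entities parameter name and the account/table parameter names are assumptions; only the PartitionKey/RowKey requirements come from the post.

# Hypothetical example – build the entity as a PSObject and insert it
$NewEntity = New-Object -TypeName PSObject -Property @{
    PartitionKey = 'Servers'
    RowKey       = '1'              # must be a string, not an integer
    ComputerName = 'SERVER01'
    LastUpdated  = [datetime]::UtcNow
}
New-AzureTableEntity -StorageAccountName 'mystorageaccount' -StorageAccountAccessKey $StorageKey `
    -TableName 'ServerInventory' -Entities $NewEntity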

Update-AzureTableEntity

This function can be used to update a single entity or bulk update up to 100 entities (provided the total payload size is less than 4 MB).

Please note that when updating an entity, all fields (including the fields that do not need to be updated) must be specified. It is actually a merge operation. If you are modifying an existing entity returned from a search operation (Get-AzureTableEntity) and the entity contains datetime fields, please make sure you set the “-ConvertDateTimeFields” parameter to $true when performing the search in the first place. Please also be aware that the built-in Timestamp field must not be included in the entity fields.
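
A hedged sketch of a typical update, reusing an entity returned by Get-AzureTableEntity (parameter names other than -ConvertDateTimeFields are assumptions):

# Hypothetical example – retrieve with datetime conversion, modify, strip Timestamp, then update
$Entity = Get-AzureTableEntity -StorageAccountName 'mystorageaccount' -StorageAccountAccessKey $StorageKey `
    -TableName 'ServerInventory' -QueryString "(PartitionKey eq 'Servers') and (RowKey eq '1')" `
    -ConvertDateTimeFields $true
$Entity.ComputerName = 'SERVER02'
$Entity.PSObject.Properties.Remove('Timestamp')   # the built-in Timestamp field must not be included
Update-AzureTableEntity -StorageAccountName 'mystorageaccount' -StorageAccountAccessKey $StorageKey `
    -TableName 'ServerInventory' -Entities $Entity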

Remove-AzureTableEntity

This function can be used to remove a single entity or bulk remove up to 100 entities (provided the total payload size is less than 4 MB).
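
A hedged sketch (parameter names are assumptions); the entity to be removed only needs to identify the PartitionKey and RowKey:

# Hypothetical example – remove an entity by its PartitionKey and RowKey
$EntityToRemove = New-Object -TypeName PSObject -Property @{
    PartitionKey = 'Servers'
    RowKey       = '1'
}
Remove-AzureTableEntity -StorageAccountName 'mystorageaccount' -StorageAccountAccessKey $StorageKey `
    -TableName 'ServerInventory' -Entities $EntityToRemove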

Support for Azure Automation and SMA

To simplify leveraging this module in Azure Automation or SMA, I have included a connection object in the module:

Once you have created the connection objects, instead of specifying the storage account, table name and storage account access key, you can simply specify the connection object using the ‘-TableConnection’ parameter for all four functions.
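
In a runbook, that might look like the following sketch. The -TableConnection parameter comes from the post; the connection asset name and the query parameter name are assumptions.

# Hypothetical runbook snippet – retrieve the connection asset and pass it to the function
$TableConnection = Get-AutomationConnection -Name 'MyAzureTableConnection'
$Result = Get-AzureTableEntity -TableConnection $TableConnection `
    -QueryString "(PartitionKey eq 'Servers')" -GetAll $true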

Sample Code

I have published some sample code I wrote when developing this module to GitHub Gist:

Summary

I wrote this module to simplify my Azure Automation runbooks and make IT pros’ lives easier when working with Azure Table storage. If you have to deal with Azure Table storage, I hope you find this module useful. If you are a developer looking for code samples, you can still use this module and simply translate the code into the language of your choice.

I purposely didn’t include any functions for managing the Azure table storage itself because you can manage the Table storage using the Azure.Storage module.

Lastly, feedback is always welcome, so please drop me an email if you have any.

Securing Passwords in Azure Functions

Written by Tao Yang

09/10/2016 – Note: This post has been updated as per David O’Brien’s suggestion.

As I mentioned in my last post, I started playing with Azure Functions a few weeks ago and I’ve already built a few pretty cool solutions. One thing I’ve spent a lot of time researching is how to secure credentials in Azure Functions.

Obviously, Azure Key Vault would be an ideal candidate for storing credentials for Azure services. If I were using another automation product that I’m quite familiar with – Azure Automation – I’d certainly go down the Key Vault path, because the Azure Automation account already creates a Service Principal for logging into Azure and we can simply grant that Azure AD application access to the Key Vault. However, and please do point me in the right direction if I’m wrong, I don’t think there is an easy way to access the Key Vault from Azure Functions at this stage.

I came across two feature requests, on GitHub and UserVoice, suggesting a way to access Key Vault from Azure Functions, so I hope this capability will be added at a later stage. For now, I’ve come up with a simple way to encrypt the password in the Azure Functions code so it is not stored in clear text. I purposely want to keep the solution as simple as possible, because one of the big advantages of Azure Functions is being able to deliver something really quickly, so I believe the less code I have to write the better. I’ll use a PowerShell example to explain what I have done.

I needed to write a function to retrieve Azure VMs from a subscription – I’ll blog the complete solution next time. Sticking with the language that I know best, I’m using PowerShell. I have already explained how to use custom PowerShell modules in my last post. In order to retrieve the Azure VM information, we need two modules:

  • AzureRM.Profile
  • AzureRM.Compute

I used the method explained in the previous post and uploaded the two modules to the function folder. Obviously, I also need a credential to sign in to my Azure subscription before retrieving the Azure VM information.

I’m using a key (a byte array) to encrypt the password secure string. If you are not familiar with this practice, I found a very detailed two-part blog post on this topic; you can read it here:

Secure Password With PowerShell: Encrypting Credentials – Part 1

Secure Password With PowerShell: Encrypting Credentials – Part 2

So first, I need to create a key and store its content in a file:
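
The original snippet is embedded in the post; a minimal sketch of creating a 32-byte key and saving it to a file (the path below is just an example – store the key somewhere secure) would be:

# Generate a random 256-bit (32-byte) key and save it to a file
$KeyFile = 'C:\Temp\PassEncryptionKey.key'    # example path only
$Key = New-Object -TypeName Byte[] -ArgumentList 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($Key)
$Key | Out-File -FilePath $KeyFile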

I then uploaded the key file to the Azure Functions folder. I had already uploaded the PowerShell modules to the “bin” folder, so I created a sub-folder under “bin” called Keys:

I wrote a little PowerShell function (that runs on my PC, where a copy of the key file is stored) to encrypt the password.

PowerShell function Get-EncryptedPassword:
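
The function itself is shown as a screenshot in the original post; a minimal sketch of what it does, assuming hypothetical parameter names, is shown below.

# Hypothetical sketch – encrypts a SecureString password using the key file created earlier
Function Get-EncryptedPassword
{
    Param (
        [String]$KeyFilePath,          # path to the key file
        [SecureString]$Password        # the password to encrypt
    )
    $Key = Get-Content -Path $KeyFilePath
    ConvertFrom-SecureString -SecureString $Password -Key $Key
}
# Example usage: Get-EncryptedPassword -KeyFilePath C:\Temp\PassEncryptionKey.key -Password (Read-Host -AsSecureString) | clip.exe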

I call this function to encrypt the password and copy the encrypted string to the clipboard:

I then created two app settings in Azure Functions Application settings:

  • AzureCredUserName
  • AzureCredPassword

AzureCredUserName holds the user name of the service account, and AzureCredPassword holds the encrypted string that we prepared in the previous step.

I then paste the encrypted password string to my Azure Functions code (line 24):

The app settings are exposed to the Azure Function as environment variables, so we can reference them in the script as $env:AzureCredUserName and $env:AzureCredPassword (lines 23 and 24).

As shown above, to decrypt the password from the encrypted string to a SecureString, the PowerShell code reads the content of the key file and uses it as the key to convert the encrypted password to a SecureString (lines 26-27). After the password has been converted to a SecureString, we can then create a PSCredential object and use it to log in to Azure (lines 28-29).

Note: If you read my last post, I have explained how to use Kudu console to find the absolute path of a file, so in this case, the file path of the key file is specified on line 26.
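
The function code itself is only shown as a screenshot in the original post; a hedged sketch of the decryption logic described above (the key file path is a placeholder, and Add-AzureRmAccount is used for the sign-in) looks like this:

# Hypothetical sketch – read app settings, decrypt the password and sign in to Azure
$UserName          = $env:AzureCredUserName
$EncryptedPassword = $env:AzureCredPassword
$Key               = Get-Content -Path 'D:\home\site\wwwroot\<FunctionName>\bin\Keys\PassEncryptionKey.key'
$SecurePassword    = ConvertTo-SecureString -String $EncryptedPassword -Key $Key
$Cred              = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList ($UserName, $SecurePassword)
Add-AzureRmAccount -Credential $Cred | Out-Null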

Needless to say, the key file you’ve created must be stored securely. For example, I’m using KeePass to store my passwords, and I’m storing this file in KeePass. Do not leave it in an unsecured location (such as C:\temp as I demonstrated in this example).

Also, since the app settings apply to all functions in your Azure Functions account, you may consider using different encryption keys in different functions if you want to limit which function can access a particular encrypted password.

Lastly, as I stated earlier, I wanted to keep the solution as simple as possible. If you know better ways to secure passwords, please do contact me and I’d like to learn from you.

Pushing PowerShell Modules From PowerShell Gallery to Your MyGet Feeds Directly

Written by Tao Yang

PSGallery-MyGet

Recently I have started using a private MyGet feed and my cPowerShellPackageManagement DSC Resource module to manage PowerShell modules on my lab servers.

When new modules are released in the PowerShell Gallery (e.g. all the Azure modules), I’d normally use Install-Module to install them on test machines, then publish the tested modules to my MyGet feed, and then my servers would pick up the new modules.

Although I can use the Publish-Module cmdlet to upload a module located locally on my PC to the MyGet feed, it can be really time consuming when the module sizes are big (e.g. some of the Azure modules). It only took me a few minutes to figure out how to push modules directly from the PowerShell Gallery (or any NuGet feed) to my MyGet feed.

To configure it, under the MyGet feed, go to “Package Sources” and click “Add package source…”

Then choose NuGet feed, and fill out the name and source:

Name: PowerShellGallery

Source: https://www.powershellgallery.com/api/v2/

Once added, I can search PowerShell Gallery and add packages directly to MyGet.

Scripting Azure Automation Module Imports Directly from MyGet or PowerShell Gallery

Written by Tao Yang

There are a few ways to add PowerShell modules to Azure Automation accounts:

1. Via the Azure Portal, by uploading a module zip file from the local computer.

2. If the module is located in the PowerShell Gallery, you can push it to your Automation Account directly from the gallery.

3. Use the PowerShell cmdlet New-AzureRmAutomationModule from the AzureRM.Automation module.

One of the limitations of the New-AzureRmAutomationModule cmdlet is that the module must be zipped and located somewhere online that Azure has access to. You need to specify the location using the –ContentLink parameter. In the past, in order to script module deployments, even when the module was located in the PowerShell Gallery, I had to save the module to a place my Automation Account had access to (such as Azure blob storage, or a release in a public GitHub repo).

Tonight, I was writing a script and I wanted to see if I could deploy modules to my Automation Account directly from a package repository of my choice – other than the PowerShell Gallery, I also have a private MyGet feed that I use for storing my PowerShell modules.

It turned out to be really easy; it only took me a few minutes to figure out how. I’ll use a module I wrote in the past called “SendEmail” as an example. It is published in both the PowerShell Gallery and my private MyGet feed.

Importing from PowerShell Gallery

The URL for this module in the PowerShell Gallery is: https://www.powershellgallery.com/packages/SendEmail/1.3

The –ContentLink URI that we need to pass to the New-AzureRmAutomationModule cmdlet would be:

https://www.powershellgallery.com/api/v2/package/SendEmail/1.3.

As you can see, all you need to do is add “api/v2/” into the URI (and change “packages” to “package”). The PowerShell command would be something like this:
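
The command from the post is embedded as a screenshot; a sketch of the call (the resource group and automation account names are placeholders) might look like this:

# Import the SendEmail module into an Automation Account directly from the PowerShell Gallery
New-AzureRmAutomationModule -ResourceGroupName 'MyResourceGroup' `
    -AutomationAccountName 'MyAutomationAccount' -Name 'SendEmail' `
    -ContentLink 'https://www.powershellgallery.com/api/v2/package/SendEmail/1.3'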

Importing from a private MyGet feed

For a private MyGet feed, you can access it by embedding the API key into the URL:

The URL for my module would be: “http://www.myget.org/F/<Your MyGet feed name>/auth/<MyGet API Key>/api/v2/package/<Module Name>/<Module Version>”

For example, for my SendEmail module, the PowerShell command would be something like this:
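
Again as a sketch (feed name, API key, resource group and automation account names are placeholders):

# Import the SendEmail module from a private MyGet feed, embedding the API key in the content link
New-AzureRmAutomationModule -ResourceGroupName 'MyResourceGroup' `
    -AutomationAccountName 'MyAutomationAccount' -Name 'SendEmail' `
    -ContentLink 'https://www.myget.org/F/<Your MyGet feed name>/auth/<MyGet API Key>/api/v2/package/SendEmail/1.3'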

Importing from a public MyGet feed

If the module is located in a public MyGet feed, then the API key is not required. The URI for the module is very similar to the PowerShell Gallery one; you just need to embed “api/v2/” into the original URI:

‘https://www.myget.org/F/<MyGet Public Feed Name>/api/v2/package/<Module Name>/<Module Version>’

The PowerShell script would be something like this:

PowerShell DSC Resource for Managing Repositories and Modules

Written by Tao Yang

Introduction

PowerShell version 5 has introduced a new feature that allows you to install packages (such as PowerShell modules) from NuGet repositories. If you have used cmdlets such as Find-Module, Install-Module or Uninstall-Module, then you have already taken advantage of this awesome feature.

By default, a Microsoft-owned public repository, the PowerShell Gallery, is configured on all computers running PowerShell version 5, and when you use Find-Module or Install-Module, you are pulling modules from the PowerShell Gallery.

Ever since I started using PowerShell v5, I’ve discovered some challenges managing modules for machines in my environment:

  • Lack of a fully automated way to push modules to a group of computers
  • Module version inconsistency between computers
  • Need of a private repository

Let me elaborate on each of the points listed above.

Lack of a fully automated way to push modules to a group of computers

Back in the old days (pre WMF v5), I used to package PowerShell modules into MSIs and use ConfigMgr to deploy the MSIs to target computers. Although it’s not too hard to package a module into an MSI, this method is really time consuming, not to mention it also requires ConfigMgr. In PowerShell v5, I can write a script that utilises PowerShell remoting to push modules to remote machines, but this is still a manual process, and it may not be a viable solution for a large group of computers.

Module version inconsistency between computers

Over time, modules get updated and new modules get released from various sources. I often find module versions becoming inconsistent among computers, and there is no automated way to update computers when a new version is released.

Need of a private repository

The PowerShell Gallery is public; everything you publish to it is available to the entire world. Organisations often write modules specifically for internal use and may not want to share them with the rest of the world.

Before I dive into the main topic, I’d like to discuss what I have done for implementing private repositories.

Private Repositories

PowerShell PackageManagement uses NuGet repositories. I found the following solutions available:

MyGet is a SaaS (Software as a Service) based repository hosted on the cloud. Although you can create your own feeds, private feeds come with a price tag (free accounts allow you to create public feeds that everyone can access).

ProGet is an on-premises solution. To install it, you will need a web server (and optionally a SQL server) within your network. It comes in free, basic and enterprise editions; the feature comparison is located here: http://inedo.com/proget/pricing/features-by-edition

Since both MyGet and ProGet offer NFR (Not For Resale) licenses to Microsoft MVPs, I have tested both in my lab environment. They both work pretty well. I did not bother to set up the free private NuGet repository (the 3rd option).

These days, I found myself writing more and more PowerShell modules for different projects. During development phase, I’d normally use a feed that’s hosted on my ProGet server because it is located in my lab, so it’s faster to publish and download modules. Once the module is ready, I’d normally publish it to MyGet for general consumption because it’s a SaaS based application, both my lab machines and Azure IaaS machines will have no problem accessing it.

DSC Resource cPowerShellPackageManagement

In order to overcome the other two challenges that I’m facing (automated module deployment and version inconsistency), I have created a DSC resource called cPowerShellPackageManagement.

According to the DSC naming standard, the first letter ‘c’ indicates it is a community resource, and, as the rest of the name suggests, it is used to manage PowerShell packages.

This DSC resource module contains 2 resources:

  • cPowerShellRepository – used to register or unregister specific NuGet feeds on computers running PowerShell v5 and above.
  • cPowerShellModuleManagement – used to install / uninstall modules on computers running PowerShell v5 and above.

cPowerShellRepository

Syntax:

To register a feed, you will need to specify some basic information such as PublishLocation and SourceLocation. You can also set Ensure = Absent to unregister the feed with the name specified in the Name parameter.

When not specified, the InstallationPolicy field defaults to “Untrusted”. If you’d like to set the repository as a trusted repository, set this value to “Trusted”.
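
The syntax block is embedded in the original post; a hedged sketch of registering a feed with this resource (the feed name and URLs are placeholders, while the property names come from the description above) might look like this:

Configuration RegisterPrivateFeed
{
    Import-DscResource -ModuleName cPowerShellPackageManagement
    Node 'localhost'
    {
        cPowerShellRepository MyPrivateFeed
        {
            Name               = 'MyPrivateFeed'
            SourceLocation     = 'https://www.myget.org/F/<feed name>/api/v2'
            PublishLocation    = 'https://www.myget.org/F/<feed name>/api/v2/package'
            InstallationPolicy = 'Trusted'
            Ensure             = 'Present'
        }
    }
}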

Note: since repository registration is per user (as opposed to a machine-based setting) and DSC configurations are executed under the LocalSystem context, you will not be able to see the repository added by this resource if you run the Get-PSRepository cmdlet under your own user account. If you start PowerShell under LocalSystem by using PsExec (run psexec /i /s /d powershell.exe), you will be able to see the repository:

cPowerShellModuleManagement

Syntax:

  • PSModuleName – PowerShell module name. When this is set to ‘all’, all modules from the specified repository will be installed. So please do not use ‘all’ against PSGallery!!
  • RepositoryName – name of the repository the module will be installed from. This can be a public repository such as the PowerShell Gallery, or your privately owned repository (i.e. your ProGet or MyGet feeds). You can use the cPowerShellRepository resource to configure the repository.
  • PSModuleVersion – this is an optional field. When used, only the specified version will be installed (or uninstalled). If not specified, the latest version of the module from the repository will be used. This field does not impact other versions that are already installed on the computer (i.e. when installing the latest version, earlier versions will not be uninstalled).
  • MaintenanceStartHour, MaintenanceStartMinute and MaintenanceLengthMinute – since the LCM runs the DSC configuration at a pre-configured interval, you may not want to install / uninstall modules during business hours. Therefore, you can set the maintenance start hour (0-23) and start minute (0-59) to specify the start time of the maintenance window. MaintenanceLengthMinute represents the length of the maintenance window in minutes. These fields are optional; when specified, module installation and uninstallation will only take place when the LCM runs the configuration within the maintenance window. Note: please make sure MaintenanceLengthMinute is greater than the value configured for the LCM ConfigurationModeFrequencyMins property. A hedged example of a resource instance is shown below.
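
Based on the parameters above, a sketch of a resource instance (all values are placeholders) could look like this:

cPowerShellModuleManagement SharePointSDKModule
{
    PSModuleName            = 'SharePointSDK'
    RepositoryName          = 'MyPrivateFeed'
    PSModuleVersion         = '2.1.3'   # optional – omit to use the latest version from the repository
    MaintenanceStartHour    = 22
    MaintenanceStartMinute  = 0
    MaintenanceLengthMinute = 120
}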

Sample Configuration

Here are some sample configurations to demonstrate the usage of these DSC resources.

1. Register to an On-Prem ProGet feed and install all modules from the feed

Using this configuration, I can manage modules at the repository feed level. If I add or update a module in the feed, the DSC LCM on each configured computer will automatically install the newly added (or updated) module the next time the configuration is refreshed.
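
The configuration itself is embedded in the post; a hedged reconstruction (feed name and URLs are placeholders) might look like this:

Configuration LabModuleManagement
{
    Import-DscResource -ModuleName cPowerShellPackageManagement
    Node 'localhost'
    {
        # Register the on-prem ProGet feed
        cPowerShellRepository ProGetFeed
        {
            Name               = 'ProGetLabFeed'
            SourceLocation     = 'http://proget.lab.local/nuget/PowerShell/'
            PublishLocation    = 'http://proget.lab.local/nuget/PowerShell/'
            InstallationPolicy = 'Trusted'
            Ensure             = 'Present'
        }
        # Install every module published to that feed
        cPowerShellModuleManagement AllModules
        {
            PSModuleName   = 'all'
            RepositoryName = 'ProGetLabFeed'
            DependsOn      = '[cPowerShellRepository]ProGetFeed'
        }
    }
}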

2. Register to a feed hosted on MyGet, and install several specific modules

In this example, I’ve specified that a particular module (the Gac module) can be installed at any time, and that another module (the SharePointSDK module) can only be installed (or updated) within a specific time window.
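
Again, as a hedged reconstruction (feed details and maintenance window values are placeholders):

Configuration MyGetModuleManagement
{
    Import-DscResource -ModuleName cPowerShellPackageManagement
    Node 'localhost'
    {
        cPowerShellRepository MyGetFeed
        {
            Name               = 'MyGetFeed'
            SourceLocation     = 'https://www.myget.org/F/<feed name>/api/v2'
            PublishLocation    = 'https://www.myget.org/F/<feed name>/api/v2/package'
            InstallationPolicy = 'Trusted'
            Ensure             = 'Present'
        }
        # The Gac module can be installed or updated at any time
        cPowerShellModuleManagement GacModule
        {
            PSModuleName   = 'Gac'
            RepositoryName = 'MyGetFeed'
            DependsOn      = '[cPowerShellRepository]MyGetFeed'
        }
        # The SharePointSDK module may only be installed or updated within a 2-hour window starting at 22:00
        cPowerShellModuleManagement SharePointSDKModule
        {
            PSModuleName            = 'SharePointSDK'
            RepositoryName          = 'MyGetFeed'
            MaintenanceStartHour    = 22
            MaintenanceStartMinute  = 0
            MaintenanceLengthMinute = 120
            DependsOn               = '[cPowerShellRepository]MyGetFeed'
        }
    }
}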

Download and Install Locations

This DSC Resource has been published to PowerShellGallery: https://www.powershellgallery.com/packages/cPowerShellPackageManagement

The project is also located on Github: https://github.com/tyconsulting/PowerShellPackageManagementDSCResource

Special Thanks

I’d like to thank my MVP friends Jakob G Svendsen (@JakobGSvendsen), Pete Zerger (@pzerger), Daniele Grandini (@DanieleGrandini) and James Bannan (@JamesBannan) who provided feedback and helped me testing the modules.

PowerShell Module for OMS HTTP Data Collector API

Written by Tao Yang

Background

Earlier today, the OMS product group released the OMS HTTP Data Collector API to public preview. If you haven’t read the announcement, you can read the blog post written by the PM of this feature, Evan Hissey, first.

As a Cloud and Datacenter Management MVP, I’ve had private preview access to this feature for a few months now, and I actually developed a solution using this API in a customer engagement with my friend and fellow CDM MVP Alex Verkinderen (@AlexVerkinderen) just over a month ago. I was really impressed with the potential opportunities this feature may bring us, and I’ve been spamming Evan’s inbox asking him for the release date so I could blog about it and present it at user group meetups.

Since most of us wouldn’t like having to deal with HTTP headers, bodies, authorizations and other overhead we have to put into our code in order to use this API, I have developed a PowerShell module to help us easily utilize this API.

Introducing OMSDataInjection PowerShell Module

I developed this module about two months ago and was waiting for the API to become public before releasing it. Now the wait is over and I can finally release it.

This module contains only one public function: New-OMSDataInjection. This function is well documented in a proper help file; you can access it via Get-Help New-OMSDataInjection –Full. I have added two examples to the help file too:

————————– EXAMPLE 1 ————————–

PS C:\>$PrimaryKey = Read-Host -Prompt ‘Enter the primary key’
$ObjProperties = @{
Computer = $env:COMPUTERNAME
Username = $env:USERNAME
Message  = ‘This is a test message injected by the OMSDataInjection module. Input data type: PSObject’
LogTime  = [Datetime]::UtcNow
}
$OMSDataObject = New-Object -TypeName PSObject -Property $ObjProperties
$InjectData = New-OMSDataInjection -OMSWorkSpaceId ‘8eb61d08-133c-401a-a45b-0e611194779f’ -PrimaryKey $PrimaryKey -LogType ‘OMSTestData’ -UTCTimeStampField ‘LogTime’ -OMSDataObject $OMSDataObject

Injecting data using a PS object by specifying the OMS workspace Id and primary key
————————– EXAMPLE 2 ————————–

PS C:\>$OMSConnection = Get-AutomationConnection ‘OMSConnection’
$OMSDataJSON = @”
{
“Username”:  “administrator”,
“Message”:  “This is a test message injected by the OMSDataInjection module. Input data type: JSON”,
“LogTime”:  “Tuesday, 28 June 2016 9:08:15 PM”,
“Computer”:  “SERVER01”
}
“@
$InjectData = New-OMSDataInjection -OMSConnection $OMSConnection -LogType ‘OMSTestData’ -UTCTimeStampField ‘LogTime’ -OMSDataJSON $OMSDataJSON

Injecting data using JSON formatted string by specifying the OMSWorkspace Azure Automation / SMA connection object (to be used in a runbook)

This PS module comes with the following features:

01. A Connection object for using this module in Azure Automation and SMA.

Once imported into your Azure Automation account (or SMA for the ‘old skool’ folks), you will be able to create connection objects that contain your OMS workspace Id, primary key and secondary key (optional):

And as shown in Example 2 listed above, in your runbook, you can retrieve this connection object and use it when calling the New-OMSDataInjection function.

02. Fall back to the secondary key if the primary key has failed

When the optional secondary key is specified, if the web request using the primary key fails, the module will fall back to the secondary key and retry the web request with it. This is to ensure your scripts and automation runbooks will not be interrupted when you are following the best practice of cycling your keys.

03. Supports two types of input: JSON and PSObject

As you can see from Evan’s post, this API expects a JSON object as the HTTP body containing the data to be injected into OMS. When I started testing this API a few months ago, my good friend and fellow MVP Stanislav Zhelyazkov (@StanZhelyazkov) suggested that instead of writing plain JSON, it’s better to put everything into a PSObject and then convert it to JSON in PowerShell, so we don’t mess up the format and type of each field. I think it was a good idea, so I have coded the module to take either a JSON string or a PSObject containing the data to be injected into OMS.

Sample Script and Runbook

I’ve created a sample script and a runbook to help you get started. They are also included in the Github repository for this module (link at the bottom of this article):

Sample Script: Test-OMSDataInjection.ps1

Sample Runbook: Test-OMSDataInjectionRunbook

Exploring Data in OMS

Once the data is injected into OMS, if you are using a new data type, it can take a while (a few hours) for all the fields to become available in OMS.

For example, the data injected by the sample script and the Azure Automation runbook (executed on Azure):

All the fields that you have defined are stored as custom fields in your OMS workspace:

Please keep in mind that since the Custom Fields feature is still in the preview phase, there’s a limit of 100 custom fields per workspace at this stage (https://azure.microsoft.com/en-us/documentation/articles/log-analytics-custom-fields/), so please be mindful of this limitation when you are building your custom solutions using the HTTP Data Collector API.

Where to Download This Module?

I have published this module in the PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection. If you are using PowerShell version 5 or above, you can install it directly from there: Install-Module –Name OMSDataInjection –Repository PSGallery

You can also download it from its GitHub repo: https://github.com/tyconsulting/OMSDataInjection-PSModule/releases

Summary

In the past, we’ve had the OMS Custom View Designer that helps us visualise the data we already have in OMS Log Analytics; what we were missing was a native way to inject data into OMS. Now with the release of this API, the gap has been filled. As Evan mentioned in his blog post, by coupling this API with the OMS View Designer (and even throwing Power BI into the mix), you can develop some really fancy solutions.

On the 21st of September (3 weeks from now), I will be presenting at the Melbourne Microsoft Cloud and Datacenter Meetup (https://www.meetup.com/Melbourne-Microsoft-Cloud-and-Datacenter-Meetup/events/233154212/); my topic is Developing Your OWN Custom OMS Solutions. I will be doing live demos of creating solutions using the HTTP Data Collector API as well as the Custom View Designer. If you are from Melbourne, I encourage you to attend. I am also planning to record this session and publish it on YouTube later.

Lastly, if you have any suggestions for this PowerShell module, please feel free to contact me!

ConfigMgr OMS Connector

Written by Tao Yang

Earlier this week, Microsoft released a new feature in System Center Configuration Manager 1606 called the OMS Connector:

As we all know, OMS supports computer groups. We can either manually create computer groups in OMS using OMS search queries, or import AD and WSUS groups. With the ConfigMgr OMS Connector, we can now import ConfigMgr device collections into OMS as computer groups.

Instead of using the OMS workspace ID and keys to access OMS, the ConfigMgr OMS connector requires an Azure AD application and service principal. My friend and fellow Cloud and Data Center Management MVP Steve Beaumont blogged his setup experience a few days ago. You can read Steve’s post here: http://www.poweronplatforms.com/configmgr-1606-oms-connector/. As you can see from Steve’s post, provisioning the Azure AD application for the connector can be pretty complex if you are doing it manually – it contains too many steps and you have to use both the old Azure portal (https://manage.windowsazure.com) and the new Azure portal (https://portal.azure.com).

To simplify the process, I have created a PowerShell script to create the Azure AD application for the ConfigMgr OMS Connector. The script is located in my GitHub repository: https://github.com/tyconsulting/BlogPosts/tree/master/OMS

In order to run this script, you will need the following:

  • The latest versions of the AzureRM.Profile and AzureRM.Resources PowerShell modules
  • An Azure subscription admin account from the Azure Active Directory that your Azure Subscription is associated to (the UPN must match the AAD directory name)

When you launch the script, you will firstly be prompted to login to Azure:

Once you have logged in, you will be prompted to select the Azure Subscription and then specify a display name for the Azure AD application. If you don’t assign a name, the script will try to create the Azure AD application under the name “ConfigMgr-OMS-Connector”:

This script creates the AAD application and assigns it the Contributor role on your subscription:

At the end of the script, you will see the 3 pieces of information you need to create the OMS connector:

  • Tenant
  • Client ID
  • Client Secret Key

You can simply copy and paste these to the OMS connector configuration.

Once you have configured the connector in ConfigMgr and enabled SCCM as a group source, you will soon start seeing the collection memberships being populated in OMS. You can search them in OMS using a search query such as “Type=ComputerGroup GroupSource=SCCM”:

Based on what I see, the connector runs every 6 hours and any membership additions or deletions will be updated when the connector runs.

For example, if I search for a particular collection over the last 6 hours, I can see this particular collection has 9 members:

During my testing, I deleted 2 computers from this collection a few days ago. If I specify a custom range targeting a 6-hour time window from a few days ago, I can see this collection had 11 members back then:

This could be useful when you need to track down whether certain computers were placed in a collection in the past.

This is all I have to share today. Until next time, enjoy OMS!

Calculating SQL Database DTU for Azure SQL DB Using PowerShell

Written by Tao Yang

Over the last few weeks, I have been working on a project related to Azure SQL Database. One of the requirements was to be able to programmatically calculate the SQL Database DTU (Database Throughput Unit).

Since the DTU concept is Microsoft’s proprietary IP, the actual formula for the DTU calculation has not been released to the public. Luckily, Microsoft’s Justin Henriksen has developed an online Azure SQL DB DTU Calculator; you can also visit Justin’s blog here. I was able to use the web service Justin developed for the online DTU Calculator, and I wrote two PowerShell functions to perform the calculation by invoking it. The first function, Get-AzureSQLDBDTU, can be used to calculate the DTU for individual databases; the second, Get-AzureSQLDBElasticPoolDTU, can be used to calculate the DTU for Azure SQL elastic pools.

Obviously, since we are invoking a web service, the computer you are running the script from requires an Internet connection. Here’s a sample script to invoke the Get-AzureSQLDBDTU function:

Note: you will need to change the variables in the ‘variables’ region; $LogicalDriveLetter is the drive letter of the drive hosting the SQL DB data files.
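
The full sample script is embedded in the post; the sketch below only illustrates the general shape of the call, and every parameter name of Get-AzureSQLDBDTU shown here is an assumption, as is the layout of the variables region:

#region variables – adjust these for your environment (variable names are assumptions)
$NumberOfCores      = 4
$LogicalDriveLetter = 'D'   # drive hosting the SQL DB data files
#endregion

# Invoke the function (hypothetical parameter names) and inspect the result
$DTUResult = Get-AzureSQLDBDTU -NumberOfCores $NumberOfCores -LogicalDriveLetter $LogicalDriveLetter
$DTUResult.Recommendations        # recommended service tier and coverage %
$DTUResult.SelectedServiceTiers   # raw reading for each perf sample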

The recommended Azure SQL DB service tier and coverage % can be retrieved in the ‘Recommendations’ property of the result:

The raw reading for each perf sample can be retrieved from the ‘SelectedServiceTiers’ property of the result:

Lastly, thanks to Justin for developing the DTU calculator and the web service, and for pointing me in the right direction.

HybridWorkerToolkit PowerShell Module Updated to Version 1.0.3

Written by Tao Yang

A few days ago, I published a PowerShell module to be used on Azure Automation Hybrid Workers called HybridWorkerToolkit. You can find my blog article HERE.

Yesterday, my good friend and fellow CDM MVP Daniele Grandini (@DanieleGrandini) gave me some feedback, so I’ve updated the module again and incorporated Daniele’s suggestions.

This is the list of updates in this release:

  • A new array parameter for New-HybridWorkerEventEntry called “-AdditionalParameters”. This parameter allows users to insert an array of additional parameters to be added to the event data:

  • A new Boolean parameter for New-HybridWorkerEventEntry called “-LogMinimum”. This is an optional parameter with a default value of $false. When this parameter is set to $true, other than the user-specified messages and additional parameters, only the Azure Automation job Id will be logged as event data:

As we all know, we pay for the amount of data that gets injected into our OMS workspace, so this parameter allows you to minimise the size of your events (and thus save money on your OMS spend).

I have published this new release to both GitHub and PowerShell Gallery.

New PowerShell Module HybridWorkerToolkit

Written by Tao Yang

23/04/2016 Update: released version 1.0.3 to GitHub and PowerShell gallery. New additions documented in this blog post.

21/04/2016 Update: updated GitHub and PowerShell gallery and released version 1.0.2 with a minor bug fix and an updated help file.

Introduction

Over the last few days, I have been working on a PowerShell module for Azure Automation Hybrid Workers. I named this module HybridWorkerToolkit.

This module is designed to run within either a PowerShell runbook or a PowerShell Workflow runbook on Azure Automation Hybrid Workers. It provides a few functions that can be called within the runbook. These activities can assist with gathering information about the Hybrid Worker and the runbook runtime environment. It also provides a function to log structured events to the Hybrid Worker’s Windows event logs.

My good friend and fellow MVP Pete Zerger posted a method he developed to use Windows event logs and OMS as a centralised logging solution for Azure Automation runbooks executed on Hybrid Workers. Pete was using the PowerShell cmdlet Write-EventLog to log runbook-related activities to the Windows event log, and these events are then picked up by OMS Log Analytics. This is a very innovative way of using Windows event logs and OMS. However, the event log entries written by Write-EventLog are not structured and are lacking basic information about your environment and the job runtime. A couple of weeks ago, another friend of mine, Kevin Holman from Microsoft, also published a PS script that he used to write to Windows event logs with additional parameters.

So I combined Pete’s idea with Kevin’s script, as well as some code I’ve written in the past for Hybrid Workers, and developed this module.

Why do we want to use Windows event logs combined with OMS for logging runbook activities on Hybrid Workers? As Pete explained in his post, it provides a centralised solution where you can query and retrieve these activity logs for all your runbooks from a single location. Additionally, based on my experience (and also confirmed with a few other friends), when you use Write-Verbose or Write-Output in your runbook and enable verbose logging, the runbook execution time can increase significantly, especially when loading a module with a lot of activities. I’ve seen a runbook that would normally take a minute or two to run with verbose logging turned off end up running for over half an hour after I enabled verbose logging. This is another reason I developed this module: it gives you an alternative option for logging verbose, error, process and output messages.

Functions

This module provides the following 3 functions:

  • Get-HybridWorkerConfiguration
  • Get-HybridWorkerJobRuntimeInfo
  • New-HybridWorkerRunbookLogEntry

Note: Although the job runtimes differ between PowerShell runbooks and PowerShell Workflow runbooks, I have spent a lot of time together with Pete making sure these activities can be used in exactly the same way in both types of runbooks.

Get-HybridWorkerConfiguration

This function can be used to get the Hybrid Worker and Microsoft Monitoring Agent configuration. A hash table is returned containing the following configuration properties retrieved from the Hybrid Worker and the MMA agent:

  • Hybrid Worker Group name
  • Automation Account Id
  • Machine Id
  • Computer Name
  • MMA install root
  • PowerShell version
  • Hybrid Worker version
  • System-wide Proxy server address
  • MMA version
  • MMA Proxy URL
  • MMA Proxy user name
  • MMA connected OMS workspace Id

Get-HybridWorkerJobRuntimeInfo

This function retrieves the following information about the Azure Automation runbook and the job runtime. They are returned in a hash table:

  • Runbook job ID
  • Sandbox Id
  • Process Id
  • Automation Asset End Point
  • PSModulePath environment variable
  • Current User name
  • Log Activity Trace
  • Current Working Directory
  • Runbook type
  • Runbook name
  • Azure Automation account name
  • Azure Resource Group name
  • Azure subscription Id
  • Time taken to start runbook in seconds

New-HybridWorkerRunbookLogEntry

This function can be used to log event log entries. By default, other than the event message itself, the following information is also logged as part of the event (placed under the <EventData> XML tag):

  • Azure Automation Account Name
  • Hybrid Worker Group Name
  • Azure Automation Account Resource Group Name
  • Azure Subscription Id
  • Azure Automation Job Id
  • Sandbox Id
  • Process Id
  • Current Working Directory ($PWD)
  • Runbook Type
  • Runbook Name
  • Time Taken To Start Running in Seconds

This function also has an optional Boolean parameter called ‘-LogHybridWorkerConfig’. When this parameter is set to $true, the event created by this function will also contain the following information about the Hybrid Worker and MMA:

  • Hybrid Worker Version
  • Microsoft Monitoring Agent Version
  • Microsoft Monitoring Agent Install Path
  • Microsoft Monitoring Agent Proxy URL
  • Hybrid Worker server System-wide Proxy server address
  • Microsoft OMS Workspace ID

Sample Runbooks

Sample PowerShell Runbook:

Sample PowerShell Workflow Runbook:

As you can see, the way to call these functions in PowerShell and PowerShell Workflow runbooks is exactly the same.
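
The sample runbooks themselves are embedded in the post; as a rough sketch, a PowerShell runbook using the three functions might look like the snippet below (-LogHybridWorkerConfig comes from the module documentation above, while the -Message parameter name is an assumption).

# Hypothetical sketch of a PowerShell runbook using the HybridWorkerToolkit functions
$HybridWorkerConfig = Get-HybridWorkerConfiguration
$JobRuntimeInfo     = Get-HybridWorkerJobRuntimeInfo
Write-Output $HybridWorkerConfig
Write-Output $JobRuntimeInfo
New-HybridWorkerRunbookLogEntry -Message 'Runbook started on Hybrid Worker' -LogHybridWorkerConfig $true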

Hybrid Worker Configuration output:

Hybrid Worker Job Runtime Info output:

Event generated (with basic information / without setting –LogHybridWorkerConfig to $true):

Event generated (when setting –LogHybridWorkerConfig to $true):

Consuming collected events in OMS

Once you have collected these events in OMS, you can use search queries to find them, and you can also create OMS alerts to notify you using your preferred methods.

Searching Events in OMS

For example, I can use this query to get all events logged by a particular runbook:

Type=Event “RunbookName: Test-HybridWorkerOutput-PSW”

or use this query to get all events for a particular job:

Type=Event “JobId: 73A3827D-73F8-4ECC-9DE1-B9340FB90744”

OMS Alerts

For example, if I want to create an OMS alert for any Error events logged by New-HybridWorkerRunbookLogEntry, I can use a query like this one:

Type=Event Source=AzureAutomation?Job* EventLevelName=Error

Download / Deploy this module

I have published this module on Github as well as PowerShell Gallery:

GitHub Repository: https://github.com/tyconsulting/HybridWorkerToolkit

PowerShell Gallery:  http://www.powershellgallery.com/packages/HybridWorkerToolkit/1.0.3

Credit

I’d like to thank Pete and Kevin for the ideas in the first place, and I’d also like to thank Pete, Jakob Svendsen, Daniele Grandini and Kieran Jacobsen for the testing and feedback!