Category Archives: PowerShell

Programmatically Performing OMS Log Search Against a Large Result Set

Written by Tao Yang

When performing OMS log search programmatically, you will encounter an API limitation that prevents you from getting all the logs in the result set. Currently, if the search does not include an aggregation command, the API call will return a maximum of 5,000 records. This limitation applies to both the OMS PowerShell module (AzureRM.OperationalInsights) and searching directly via the Log Search API.

From the response returned by either the Get-AzureRmOperationalInsightsSearchResults cmdlet or the Log Search API, you can get the total number of logs contained in the result set from the response metadata (as shown below), but you will only be able to receive up to 5,000 records. Natively, there is no way to receive anything beyond the first 5,000 records from a single request.

[screenshot]

Last month, I was working on a solution where I needed to retrieve all results from search queries, so I reached out to the OMS product group and other CDM MVPs. My buddy and fellow co-author of the Inside OMS book, Stanislav Zhelyazkov, provided a workaround: use the "skip" command in subsequent calls until you have retrieved everything. For example, if you want to retrieve all agent heartbeat events using the query "Type=Heartbeat", you could perform multiple queries until you have retrieved all the log entries, as shown below:

  1. 1st Query: “Type=Heartbeat | Top 5000”
  2. 2nd Query: “Type=Heartbeat | Skip 5000 | Top 5000”
  3. 3rd Query: “Type=Heartbeat | Skip 10000 | Top 5000”
  4. … repeat until the search API call returns no results

I have written a sample script using the OMS PowerShell module to demonstrate how to use the “skip” command in subsequent queries. The sample script is listed below:
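As a minimal sketch of the paging approach (not the full sample script), assuming a resource group and workspace name, and assuming each item in the response's Value property is one log entry:

# Page through the result set using the "skip" command until no more records come back
$ResourceGroupName = 'MyOMSResourceGroup'   # placeholder
$WorkspaceName     = 'MyOMSWorkspace'       # placeholder
$BaseQuery         = 'Type=Heartbeat'
$PageSize          = 5000
$AllResults        = @()
$Skip              = 0
do {
    if ($Skip -eq 0) {
        $Query = "$BaseQuery | Top $PageSize"
    } else {
        $Query = "$BaseQuery | Skip $Skip | Top $PageSize"
    }
    $Response = Get-AzureRmOperationalInsightsSearchResults -ResourceGroupName $ResourceGroupName `
        -WorkspaceName $WorkspaceName -Query $Query
    $Page = @($Response.Value)    # assumed: each item is a JSON string representing one log entry
    $AllResults += $Page
    $Skip += $PageSize
} while ($Page.Count -gt 0)
Write-Output "Total records retrieved: $($AllResults.Count)"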

Here’s the script output based on my lab environment:

[screenshot]

Using Azure Key Vault as the Password Repository For You and Your Team

Written by Tao Yang

Over the past decade, I have used several password management applications such as Password Safe, KeePass and LastPass. Of these products, only LastPass is cloud based. I had been hesitant to use LastPass over the last few years and stayed with KeePass because of the LastPass data breach back in 2015. A few months ago, my friend Alex Verkinderen finally convinced me to start using LastPass again. But this time, in order to be more secure and be able to use Multi-Factor Authentication (MFA), I purchased a premium account and also a YubiKey Neo for MFA. I understand not everyone is willing to spend money on password repository solutions (in my case, USD $12 per year for the LastPass Premium account and USD $50 plus shipping for a YubiKey Neo from Amazon). Also, based on my personal experience, there are still many organisations that don't have a centralised password repository. Many engineers and consultants I have met still store passwords in clear text.

On the other hand, Azure Key Vault has drawn a lot of attention since it was released and has become really popular. I have certainly used it a lot over the last few months and managed to integrate it with many solutions that I have built.

AzureKeyVaultPasswordRepo PowerShell Module

I spent a few hours last night and today developing a PowerShell CLI menu-based app based on a few existing scripts I wrote in the past. This app allows you to create and manage an Azure Key Vault and use it as your personal (or your team's) password repository. In order to simplify the process of deploying and using this app, I wrapped it in a PowerShell module. I named this module AzureKeyVaultPasswordRepo and it is now available on both the PowerShell Gallery and GitHub:

PowerShell Gallery: https://www.powershellgallery.com/packages/AzureKeyVaultPasswordRepo/

GitHub: https://github.com/tyconsulting/AzureKeyVaultPasswordRepo-PSModule/releases/tag/1.0.0

If you are running PowerShell version 5 or later, you can install this module using a one-liner:

Install-Module AzureKeyVaultPasswordRepo

Once it is installed, you can launch the app either using the full name Invoke-AzureKeyVaultPasswordRepository, or using one of the two shorter aliases (ipr and Start-PasswordRepo).

Initial Setup

This module requires the AzureRm.Profile, AzureRm.Resources and AzureRm.KeyVault modules, which you can also find in the PowerShell Gallery.

When it is launched, the app will detect whether you are currently signed in to Azure and, if so, ask whether you want to keep using the same account.

[screenshot]

You have the option to keep using the current account or sign in to Azure using another account.

The app will then prompt you to either use the current Azure subscription that's set in the context, or select another subscription from the list.

When running it for the first time, you will need to create a new Key Vault from the menu. You can choose an existing resource group, or create a new resource group in the Azure region of your choice.

[screenshot]

Once the key vault is created, you will need to assign full access to an Azure AD account. This is done by searching Azure AD using a search string and selecting a user account from the search result list.

[screenshot]
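Under the hood, assigning access to a user found in Azure AD boils down to something like the following sketch (the search string, vault name and granted permissions are illustrative assumptions, not necessarily exactly what the module runs):

# Rough equivalent of searching Azure AD and granting access to the vault (placeholder names)
$User = Get-AzureRmADUser -SearchString 'tao' | Select-Object -First 1
Set-AzureRmKeyVaultAccessPolicy -VaultName 'MyPasswordVault' `
    -ObjectId $User.Id `
    -PermissionsToSecrets get,list,set,delete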

Once the permission is assigned, everything is ready to go. You will be presented with the main menu:

[screenshot]

Note: it is by design that this app does not use any existing key vaults that you may already have in your subscription. You have to create a new one. Any existing key vaults that were not created by this app will not appear on the list for you to choose.

Creating a Profile to Store Settings

To make accessing this key vault as fast as possible in the future, the first thing I'd suggest you do is select option 4 and save the Azure subscription Id and Key Vault name in your profile. This profile is stored in the Windows Registry under HKEY_CURRENT_USER\SOFTWARE\TYConsulting\AzureKeyVaultPasswordRepo\Profiles\<your Azure account name>.

[screenshot]

[screenshot]
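For reference, reading such a profile back from the registry would look roughly like this (the value names SubscriptionId and KeyVaultName are my assumptions, not necessarily the names the module uses):

# Hypothetical illustration of reading the saved profile from the registry
$AccountName = 'user@example.com'    # placeholder account name
$ProfileKey  = "HKCU:\SOFTWARE\TYConsulting\AzureKeyVaultPasswordRepo\Profiles\$AccountName"
if (Test-Path -Path $ProfileKey) {
    $SavedProfile = Get-ItemProperty -Path $ProfileKey
    $SavedProfile.SubscriptionId     # assumed value name
    $SavedProfile.KeyVaultName       # assumed value name
}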

Once the profile is saved, the next time you launch this app it will automatically use the Azure subscription and the Key Vault stored in the profile.

[screenshot]

Creating Credentials

From the main menu, you have the option to:

1. Create a new credential (user name and password). You also have the option to generate a random password by not entering one. If you choose to let the app generate a random password, it will be copied to the computer's clipboard once the credential is created, so you can use Ctrl-V to paste it wherever you need to (a rough sketch of what happens behind the scenes follows the screenshot below).

[screenshot]
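Creating a credential with a generated password amounts to something like this sketch (the password generation method, vault name and secret name are illustrative assumptions):

# Illustrative only - generate a random password, store it as a Key Vault secret, copy it to the clipboard
Add-Type -AssemblyName System.Web
$PlainPassword  = [System.Web.Security.Membership]::GeneratePassword(16, 4)   # 16 characters, 4 non-alphanumeric
$SecurePassword = ConvertTo-SecureString -String $PlainPassword -AsPlainText -Force
Set-AzureKeyVaultSecret -VaultName 'MyPasswordVault' -Name 'MyServiceAccount' -SecretValue $SecurePassword
$PlainPassword | Set-Clipboard   # so it can be pasted with Ctrl-V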

List, Retrieve, Update and Delete Credentials

You can use option 2 to list, retrieve, update and delete existing credentials. When option 2 is selected, the app will list all credentials stored in the key vault, and from there you can choose the credential that you are interested in. Once a credential is selected, you have the option to:

  1. Copy user name to clipboard
  2. Copy password to clipboard
  3. Update credential (username / password)
  4. Delete Credential

[screenshot]

Search Credential

Instead of selecting a credential to manage from the list, you can also search credentials (based on credential name) using option 3.

Save / Delete Profile

As shown previously, for faster access you can use option 4 to store the Azure subscription Id and the Key Vault name in the registry. If you decide to delete the profile (e.g. when you want to use another subscription or key vault), you can use option 5 to delete the existing profile from the registry.

Managing Key Vault Access

You can use option 6 to grant full access to the key vault to other Azure AD accounts, or use option 7 to remove access.

Conclusion

Personally, I'm pretty happy with what I have produced in such a short period of time (only a few hours, based on some existing scripts I wrote a few weeks ago). I think this fills a gap for people and organisations that do not have a commercial password management solution.

Azure Key Vault is a very inexpensive solution, and by using an Azure offering, you automatically inherit the MFA options that you have configured for Azure / Azure AD. For example, I'm not using Azure AD Premium in my lab, but for my Microsoft (@outlook.com) account I have enabled MFA using the Microsoft Authenticator app. Therefore, in order to access the Key Vault using this module, I need to use MFA during the sign-in process.

I've only spent a few hours on this PowerShell module, so there is still room for improvement. Consider this an MVP (Minimum Viable Product). I think the following additions would be beneficial in future releases (if I decide to develop it further):

  • A GUI interface
  • Support additional types of sensitive information, not just username and passwords
  • Support Service Principal (Azure AD Applications)
  • Support different levels of access (currently everyone has full access)

Lastly, please give it a try; I'd love to hear back from the community. If you are interested in learning how to interact with Key Vault using PowerShell, feel free to read the source code of this module. If you have any questions or suggestions, please feel free to contact me!

Managing Azure Automation Module Assets Using MyGet

Written by Tao Yang

[screenshot]

Background

Managing the life cycle of PowerShell module assets in your Azure Automation accounts can be challenging. If you are currently using Azure Automation, you may have already noticed the following behaviours when managing module assets:

1. It is difficult to automate the module asset deployment process.

If you want to automate the module deployment to your Automation Account (i.e. using the PowerShell cmdlet New-AzureRmAutomationModule), you must ensure the module that you are trying to import is zipped up and located in a public location where Azure Automation can read it via HTTP (e.g. Azure Blob storage), as illustrated below. In my opinion, this is overcomplicated.
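For example, the native approach looks something like this sketch (storage URL and names are placeholders), assuming the module has already been zipped up and uploaded to blob storage:

# The module zip must already sit in a location Azure Automation can read over HTTP
$ContentLink = 'https://mystorageaccount.blob.core.windows.net/modules/MyModule.zip'   # placeholder URL
New-AzureRmAutomationModule -ResourceGroupName 'MyRG' `
    -AutomationAccountName 'MyAutomationAccount' `
    -Name 'MyModule' `
    -ContentLink $ContentLink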

2. Modules are not deployed to the Hybrid Workers automatically

If you are using Hybrid Workers, you must also manage the modules separately. Unlike Azure runbook workers, Azure Automation does not automatically deploy modules to Hybrid Workers. This means that when you import a module into your Azure Automation account, you must also manually deploy it to your Hybrid Worker computers.

3. Difficult to maintain module version consistency.

Since managing modules in your Azure Automation accounts and on your hybrid workers are two separate processes, it is hard to keep the versions of your module assets consistent between your automation account and hybrid worker computers.

Over the past few months, I have invested a lot of my time in MyGet, looking for ways to close these gaps. A few months ago, I released a PowerShell DSC resource module called cPowerShellPackageManagement (http://blog.tyang.org/2016/09/15/powershell-dsc-resource-for-managing-repositories-and-modules/). By using this DSC resource module, we can easily develop DSC configurations for computers (such as Hybrid Workers) to automatically install modules from a PowerShell module repository (i.e. a MyGet feed). This approach closes the gap around managing Hybrid Worker computers (item #2 on the list above). Today, I am going to discuss how we can tackle items #1 and #3. Before I start talking about my solutions, let me quickly introduce MyGet.

What is MyGet?

MyGet (www.myget.org) is a SaaS-based package repository hosted in the cloud. It supports all the popular package providers such as NuGet, npm etc. It can host both private and public repositories (called feeds) for you or your organisation.

If you come from a developer or DevOps background, you may have already heard about MyGet, or have used similar on-premises package repositories (such as ProGet). If you are an IT Pro, since you are reading this blog post right now, you must be familiar with PowerShell, and therefore must have heard of or used the PowerShell Gallery (https://powershellgallery.com). You can use MyGet the same way as the PowerShell Gallery in PowerShell version 5 and later, except that you have absolute control over the content in your feeds. Also, if you are using a paid MyGet account, you can have private feeds and you can control access by issuing API keys. You can also create multiple feeds that contain different packages (PowerShell modules in this case). For example, if you develop PowerShell modules, you can have a Dev feed to use during development, as well as Test and Production feeds for testing and production use.

Why Do I Need MyGet?

You may be a little bit hesitant to use the PowerShell Gallery because it is 100% public. As a regular user like everyone else, you can do very little. For example, you can publish modules to the PowerShell Gallery, but you can't guarantee your modules will stay there forever. Microsoft may decide to un-list your modules if they find problems with them (e.g. failure to comply with the rules set by PSScriptAnalyzer). You also don't have the ability to delete your modules from the PowerShell Gallery. You can un-list your modules, but they are still hosted there. To me, the PowerShell Gallery is more like a community platform that allows everyone to share their work, but you should not rely on it in production environments because you have no control over the content. How can you make sure the content you need is going to be there tomorrow?

MyGet allows you to create feeds over which you have total control, and as I mentioned already, with a paid MyGet account you can have private feeds to host the IP that you don't want to share with the rest of the world.

MyGet also ships with other awesome features, such as Webhook support.

Automating Module Deployment to Automation Account

I have developed a runbook that retrieves a list of modules from a repository (i.e. your MyGet feed) and imports each module into the Automation account where the runbook resides, if the module does not exist or its version is lower than the latest version available from the module repository. Before importing, the runbook also tries to work out the module dependencies and imports the required modules in groups (i.e. the modules without dependencies are imported first). Here's the runbook source code:

Note: this runbook does not download and zip up PowerShell modules from the repository feed. Instead, it constructs the URI to the underlying NuGet package and imports the package directly into your automation account.
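Conceptually, the import of each module looks something like the sketch below (module name, version and account names are placeholders; the /package/<id>/<version> path is the standard NuGet v2 download path):

# Conceptual sketch only - point Azure Automation straight at the NuGet package on the feed
$FeedUri       = Get-AutomationVariable -Name 'ModuleFeedLocation'   # source location URI of the feed
$ModuleName    = 'SomeModule'                                        # placeholder
$ModuleVersion = '1.0.0'                                             # placeholder
$PackageUri    = "$FeedUri/package/$ModuleName/$ModuleVersion"
New-AzureRmAutomationModule -ResourceGroupName 'MyRG' -AutomationAccountName 'MyAutomationAccount' `
    -Name $ModuleName -ContentLink $PackageUri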

In order to use the runbook, you will need to create an automation variable first.

Name: ModuleFeedLocation

Value: <the source location URI to your repository feed>

[screenshot]

Note: if you are not sure what the source location URI for your feed is, check out this help document from the MyGet website: http://docs.myget.org/docs/how-to/publish-a-powershell-module-to-myget. However, I don't believe the documentation is 100% accurate. Based on my experience, regardless of whether you are using a private or public feed, the source location URI should be:

https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2

The API key is available on the MyGet portal:

[screenshot]

If you have connected to the feed as a PowerShell repository, you can also check using the Get-PSRepository cmdlet:

[screenshot]

Other than the automation variable, you will also need to make sure you have the AzureRunAsConnection connection asset and the associated certificate asset created. These assets are created automatically by default when you create your Azure Automation account:

[screenshot]

If you don’t have this connection asset, you can manually create it using PowerShell – this process is documented here: https://docs.microsoft.com/en-au/azure/automation/automation-sec-configure-azure-runas-account

Once the runbook and all required assets are in place, you will also need to create a webhook for the runbook. It is OK to configure the webhook to target Azure workers (targeting a hybrid worker group also works, but is not necessary).

[screenshot]

Once the webhook is created, go to the MyGet portal, open your feed, then go to the Webhook section and add an HTTP POST webhook.

[screenshot]

Then enter a description and paste the runbook webhook URL. For the webhook trigger, only tick "Package Added":

[screenshot]

[screenshot]

Once the webhook trigger is created, everything is good to go. The next time you add a PowerShell module or update an existing module on your MyGet feed, it will automatically trigger the Azure Automation runbook, which will find the modules that need to be imported or updated and attempt to import them one at a time.

[screenshot]

Tips:

Once you have configured your MyGet feed as a PowerShell repository on a computer running PowerShell v5 or later, you can publish modules located on your local computer to the feed using the Publish-Module cmdlet. You can also configure MyGet to pull modules from another repository such as the PowerShell Gallery. I have blogged about this previously: http://blog.tyang.org/2016/09/20/pushing-powershell-modules-from-powershell-gallery-to-your-myget-feeds-directly/
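For example, registering the feed and publishing a local module could look like this (feed name, URIs and API key are placeholders; check the MyGet portal for the exact publish URL of your feed):

# Register the MyGet feed as a PowerShell repository, then publish a local module to it
Register-PSRepository -Name 'MyGetFeed' `
    -SourceLocation 'https://www.myget.org/F/<feed-name>/auth/<api-key>/api/v2' `
    -PublishLocation 'https://www.myget.org/F/<feed-name>/api/v2/package' `
    -InstallationPolicy Trusted
Publish-Module -Name 'MyLocalModule' -Repository 'MyGetFeed' -NuGetApiKey '<api-key>'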

If you want to configure multiple Automation accounts to sync with a single MyGet feed, you can simply create the runbook and required assets in each automation account, and add a webhook trigger for each instance of the runbook within your MyGet feed.

Things to Watch Out For

There are a few things that you need to watch out for when using this solution:

1. Be aware of the limitations in Azure Automation

Some of these limitations may impact your module imports. You can find the official documentation here: https://docs.microsoft.com/en-us/azure/azure-subscription-service-limits#automation-limits

2. Unlike NuGet repositories such as the PowerShell Gallery and MyGet, Azure Automation does not support storing different versions of the same module

This may cause some of the module imports to fail. For example, suppose ModuleA (version 1.0) is a dependency of ModuleB version 1.0, and your MyGet repository contains ModuleA 1.0, ModuleB 1.0 and ModuleB 2.0. The runbook will import ModuleB 2.0 into your automation account first; then, when it tries to import ModuleA 1.0, the import may fail because it does not pass the validation test (importing ModuleA 1.0 on the runbook worker computer). So, before committing these kinds of packages to a feed that is used by Azure Automation, test them on another feed first and make sure you can successfully install and import the module on your local computer.

3. Do not load too many modules to the feed initially

Module import into an Azure Automation account takes a lot of time. When running a runbook job on Azure workers, the runbook can run for a maximum of 3 hours due to the fair share policy. So if you have a lot of modules to load in the beginning, you need to make sure the runbook job can be completed within 3 hours, or you may have to rerun the runbook to pick up the modules that didn't get imported in the previous job. Alternatively, you can configure the runbook to run on a Hybrid Worker group, because the fair share policy does not apply when the job is executed on hybrid workers.

Conclusion

If you use a dedicated MyGet feed to host all the modules required by Azure Automation, you can use the cPowerShellPackageManagement DSC resource module I mentioned earlier in this blog post to automate module deployment to Hybrid Workers. At the same time, by using the method described in this blog post, you have also got the Automation account covered.

Therefore, if you have DSC configured for your Hybrid Workers (i.e. using Azure Automation DSC) and have this runbook and webhook configured, adding a new package to your MyGet feed updates your entire Azure Automation infrastructure automatically.

My MVP buddy Alex Verkinderen has also done some interesting integration between MyGet and the PowerShell Gallery. He is going to publish his innovation on his blog (http://www.mscloud.be/) soon, so make sure you subscribe to his blog.

Lastly, thanks Alex for testing the runbook for me, and if anyone has any questions or suggestions, please feel free to contact me.

PowerShell Script to Import and Update Modules from PowerShell Repositories to Azure Automation

Written by Tao Yang

The PowerShell Gallery has a very cool feature that allows you to import modules directly into your Azure Automation account using the "Deploy to Azure Automation" button. However, if you want to automate the module deployment process, you most likely have to first download the module, zip it up and then upload it to a place where the Azure Automation account can access it via HTTP. This is a very troublesome process.

I have written a PowerShell script that allows you to search for PowerShell modules in ANY PowerShell repository that has been registered on your computer and deploy the module DIRECTLY to the Azure Automation account without having to download it first. You can use this script to import new modules or update existing modules in your Automation account.

This script is designed to run interactively. You will be prompted to enter details such as the module name and version, and to provide an Azure credential and select the Azure subscription and Azure Automation account, etc.

The script works out the URI to the actual NuGet package for the module and imports it directly into the Azure Automation account. As you can see from the screenshot above, other than the PowerShell Gallery, I have also registered a private repository hosted on MyGet.org, and I am able to deploy modules directly from my private MyGet feed to my Azure Automation account.

If you want to automate this process, you can easily make a non-interactive version of this script and parameterize all required inputs.

So, here’s the script, and feedback is welcome:
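The core idea, stripped of the interactive prompts, is roughly the following sketch (module, repository and account names are placeholders):

# Sketch only - resolve a module from any registered repository and import the NuGet package directly
$Repository = Get-PSRepository -Name 'PSGallery'                       # any registered repository
$Module     = Find-Module -Name 'SomeModule' -Repository $Repository.Name
$PackageUri = "$($Repository.SourceLocation)/package/$($Module.Name)/$($Module.Version)"
New-AzureRmAutomationModule -ResourceGroupName 'MyRG' -AutomationAccountName 'MyAutomationAccount' `
    -Name $Module.Name -ContentLink $PackageUri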

cPowerShellPackageManagement DSC Resource Updated to Version 1.0.1.0

Written by Tao Yang

A few days ago I found a bug in the cPowerShellPackageManagement DSC resource module that was caused by the previous update, v1.0.0.1.

In version 1.0.0.1, I added the -AllowClobber switch to the Install-Module cmdlet, which was explained in my previous post: http://blog.tyang.org/2016/12/16/dsc-resource-cpowershellpackagemanagement-module-updated-to-version-1-0-0-1/

However, I only just noticed that although the pre-installed version of the PowerShellGet module is 1.0.0.1 on both Windows Server 2016 and in WMF 5.0 for Windows Server 2012 R2, the Install-Module cmdlet is slightly different between them. In Windows 10 and Windows Server 2016, the Install-Module cmdlet has the -AllowClobber switch:

[screenshot]

In Windows Server 2012, the Install-Module cmdlet does not have the -AllowClobber switch:

image

Therefore, I had to update the DSC resource to detect whether the AllowClobber switch exists.
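A simple way to perform that check is to inspect the cmdlet's parameter dictionary, roughly like this (the splatted parameters are illustrative, not the exact resource code):

# Only pass -AllowClobber when the local Install-Module cmdlet supports it
$InstallModuleParams = @{
    Name       = 'SomeModule'      # placeholder
    Repository = 'PSGallery'       # placeholder
    Force      = $true
}
if ((Get-Command -Name Install-Module).Parameters.ContainsKey('AllowClobber')) {
    $InstallModuleParams.Add('AllowClobber', $true)
}
Install-Module @InstallModuleParams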

Additionally, I have made a few other stability improvements and added a dependency on the PowerShellGet module in the module manifest file.

This updated version can be found on both GitHub and PowerShell Gallery:

GitHub: https://github.com/tyconsulting/PowerShellPackageManagementDSCResource/releases/tag/1.0.1.0

PowerShell Gallery: https://www.powershellgallery.com/packages/cPowerShellPackageManagement/1.0.1.0

PowerShell Script to Create OMS Saved Searches that Maps OpsMgr ACS Reports

Written by Tao Yang

Microsoft’s PFE Wei Hao Lim has published an awesome blog post that maps OpsMgr ACS reports to OMS search queries (https://blogs.msdn.microsoft.com/wei_out_there_with_system_center/2016/07/25/mapping-acs-reports-to-oms-search-queries/)

There are 36 queries on Wei's list, so it would take a while to manually create them all as saved searches via the OMS Portal. Since I can see myself reusing these saved searches in many OMS engagements, I have created a script to create them automatically using the OMS PowerShell module AzureRM.OperationalInsights.

So here’s the script:
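As a rough sketch of what the script does, creating each saved search boils down to something like this (the workspace details and the example query are placeholders, not one of Wei's actual queries):

# Sketch only - create a saved search for each query definition in a list
$ResourceGroupName = 'MyOMSResourceGroup'   # placeholder
$WorkspaceName     = 'MyOMSWorkspace'       # placeholder
$SavedSearches = @(
    @{ Category = 'ACS Reports'; DisplayName = 'Failed Logons'; Query = 'Type=SecurityEvent EventID=4625' }
)
foreach ($Search in $SavedSearches) {
    New-AzureRmOperationalInsightsSavedSearch -ResourceGroupName $ResourceGroupName `
        -WorkspaceName $WorkspaceName `
        -SavedSearchId ([guid]::NewGuid().ToString()) `
        -Category $Search.Category `
        -DisplayName $Search.DisplayName `
        -Query $Search.Query `
        -Version 1
}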

You must run this script in PowerShell version 5 or later. Lastly, thanks Wei for sharing these valuable queries with the community!

OMSDataInjection Updated to Version 1.2.0

Written by Tao Yang

The OMSDataInjection module was only updated to v1.1.1 less than 2 weeks ago. I had to update it again to cater for the changes in the OMS HTTP Data Collector API.

I only found out last night, after being made aware that people had started getting errors using this module, that the HTTP response code for a successful injection has changed from 202 to 200. The documentation for the API was updated a few days ago (as I can see from GitHub):

[screenshot]

This is what’s been updated in this release:

  • Updated injection result error handling to reflect the change of the OMS HTTP Data Collector API response code for successful injection.
  • Changed the UTCTimeGenerated input parameter from mandatory to optional. When it is not specified, the injection time will be used for the TimeGenerated field in the OMS log entry.

If you are using the OMSDataInjection module, I strongly recommend you update to this release.

PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection

GitHub: https://github.com/tyconsulting/OMSDataInjection-PSModule/releases/tag/v1.2.0

DSC Resource cPowerShellPackageManagement Module Updated to Version 1.0.0.1

Written by Tao Yang

Back in September this year, I published a PowerShell DSC resource called cPowerShellPackageManagement. This DSC resource allows you to manage PowerShell repositories and modules on any Windows machine running PowerShell version 5 or later. You can read more about this module in my previous post here: http://blog.tyang.org/2016/09/15/powershell-dsc-resource-for-managing-repositories-and-modules/

A couple of weeks ago, my MVP buddy Alex Verkinderen had some issues using this DSC resource in Azure Automation DSC. After some investigation, I found there was a minor bug in the DSC resource. When you use this DSC resource to install modules, sometimes you may get an error like this:

[screenshot]

Basically, it is complaining that a cmdlet from the module you are trying to install already exists. In order to fix it, I had to update the DSC resource and add the -AllowClobber switch to the Install-Module cmdlet.

I have published the updated version to both PowerShell Gallery (https://www.powershellgallery.com/packages/cPowerShellPackageManagement/1.0.0.1) and GitHub (https://github.com/tyconsulting/PowerShellPackageManagementDSCResource/releases/tag/1.0.0.1)

If you are using this DSC resource at the moment, make sure you check out the update.

OMSDataInjection PowerShell Module Updated

Written by Tao Yang

I've updated the OMSDataInjection PowerShell module to version 1.1.1, adding support for bulk inserts into OMS.

Now you can pass in an array of PSObjects or a plain JSON payload with multiple log entries. The module will check the payload size and make sure it is below the supported limit of 30 MB before inserting it into OMS.
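The size check itself is conceptually something like this sketch (the sample entries and the exact threshold handling are illustrative, not the module's actual code):

# Illustrative size check - keep the JSON payload under the 30 MB API limit
$LogEntries = @(
    [PSCustomObject]@{ Computer = 'Server01'; Status = 'OK' },
    [PSCustomObject]@{ Computer = 'Server02'; Status = 'Error' }
)
$Json          = ConvertTo-Json -InputObject $LogEntries -Depth 3
$PayloadSizeMB = [System.Text.Encoding]::UTF8.GetByteCount($Json) / 1MB
if ($PayloadSizeMB -ge 30) {
    throw "Payload is $([math]::Round($PayloadSizeMB, 2)) MB, which exceeds the 30 MB limit."
}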

You can get the new version from both PowerShell Gallery and GitHub:

PowerShell Gallery: https://www.powershellgallery.com/packages/OMSDataInjection/1.1.1

GitHub: https://github.com/tyconsulting/OMSDataInjection-PSModule/releases/tag/1.1.1

PowerShell Module for Managing Azure Table Storage Entities

Written by Tao Yang

Introduction

Firstly, apologies for not being able to blog for 6 weeks; I have been really busy lately. As part of a project that I'm working on, I have been dealing with Azure Table storage and its REST API over the last couple of weeks. I have written a few Azure Function apps in C# as well as some Azure Automation runbooks in PowerShell that involve inserting, querying and updating records (entities) in Azure tables. I struggled a little bit during the development of these function apps and runbooks because I couldn't find many good code examples, and I personally believe this REST API is not well documented on Microsoft's documentation site (https://docs.microsoft.com/en-us/rest/api/storageservices/fileservices/table-service-rest-api). Therefore, I have spent the last two days developing a PowerShell module for managing the lifecycle of Azure Table entities. This module can be used to perform CRUD (Create, Read, Update, Delete) operations on Azure Table entities.

AzureTableEntity PowerShell Module

This PowerShell module is named AzureTableEntity, and it can be found on both GitHub and the PowerShell Gallery:

This module offers the following 4 functions:

  • Get-AzureTableEntity – Search Azure Table entities by specifying a search string.
  • New-AzureTableEntity – Insert one or more entities into Azure table storage.
  • Update-AzureTableEntity – Update one or more entities in Azure table storage.
  • Remove-AzureTableEntity – Remove one or more entities from Azure table storage.

Note: all functions have been properly documented in the help file. You can use the Get-Help cmdlet to access the help.

Get-AzureTableEntity

By default, when performing a query operation, the Azure Table REST API only returns up to 1,000 entities, or the entities returned from the search within 5 seconds. This function has a '-GetAll' parameter that can be used to return all search results from a large table. The default value of this parameter is $true.

The search result returned by the search API is deserialised. As a result, complex data types such as datetime are returned as strings. If you want any datetime fields from the search result returned as proper datetime values, you can set the "-ConvertDateTimeFields" parameter to $true. Please note that this can increase the script execution time when dealing with a large result set.

Hint: You can easily build your search string using the Azure Storage Explorer.
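As a purely hypothetical usage example (the parameter names below are my assumptions; check Get-Help Get-AzureTableEntity for the real ones), a query might look like this:

# Hypothetical usage - parameter names are assumptions; the OData-style filter is standard Azure Table query syntax
$AccessKey = '<storage-account-access-key>'   # placeholder
$Entities = Get-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -TableName 'MyTable' `
    -StorageAccountAccessKey $AccessKey `
    -QueryString "PartitionKey eq 'Dept01'" `
    -ConvertDateTimeFields $true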

New-AzureTableEntity

This function can be used to insert a single entity, or to bulk insert up to 100 entities (as long as the total payload size is less than 4 MB).

Please make sure both “PartitionKey” and “RowKey” are included in the entity. The data type for these fields must be string.

For example, instead of setting RowKey = 1, you should set RowKey = "1", because the values of both PartitionKey and RowKey must be strings.
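A minimal insert might look like the sketch below (the cmdlet parameter names are my assumptions; the point is that PartitionKey and RowKey are passed as strings):

# Hypothetical usage - note PartitionKey and RowKey are strings
$AccessKey = '<storage-account-access-key>'   # placeholder
$Entity = [PSCustomObject]@{
    PartitionKey = 'Dept01'
    RowKey       = '1'        # a string, not the number 1
    UserName     = 'tao'
    Enabled      = $true
}
New-AzureTableEntity -StorageAccountName 'mystorageaccount' `
    -TableName 'MyTable' `
    -StorageAccountAccessKey $AccessKey `
    -Entities $Entity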

Update-AzureTableEntity

This function can be used to update a single entity, or to bulk update up to 100 entities (as long as the total payload size is less than 4 MB).

Please note that when updating an entity, all fields (including the fields that do not need to be updated) must be specified, as it is actually a merge operation. If you are modifying an existing entity returned from a search operation (Get-AzureTableEntity) and the entity contains datetime fields, please make sure you set the "-ConvertDateTimeFields" parameter to $true when performing the search in the first place. Please also be aware that the built-in Timestamp field must not be included in the entity fields.

Remove-AzureTableEntity

This function can be used to remove a single entity, or to bulk remove up to 100 entities (as long as the total payload size is less than 4 MB).

Support for Azure Automation and SMA

To simplify leveraging this module in Azure Automation or SMA, I have included a connection object in the module:

[screenshot]

Once you have created the connection object, instead of specifying the storage account, table name and storage account access key, you can simply pass the connection object using the '-TableConnection' parameter for all four functions.
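In a runbook, that might look like the following sketch (the connection asset name and the query parameter name are assumptions):

# Hypothetical runbook usage - retrieve the connection asset and pass it to the function
$TableConnection = Get-AutomationConnection -Name 'MyAzureTableConnection'   # placeholder asset name
$Entities = Get-AzureTableEntity -TableConnection $TableConnection `
    -QueryString "PartitionKey eq 'Dept01'"   # query parameter name is an assumption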

Sample Code

I have published some sample code I wrote when developing this module to GitHub Gist:

Summary

I wrote this module to simplify my Azure Automation runbooks and make IT Pros' lives easier when working with Azure Table storage. If you have to deal with Azure Table storage, I hope you find this module useful. If you are a developer looking for code samples, you can still use this module and simply translate the code into the language of your choice.

I purposely didn't include any functions for managing the Azure Table storage itself, because you can manage Table storage using the Azure.Storage module.

Lastly, feedback is always welcome, so please drop me an email if you have any.