Tag Archives: SCOM Reporting

Creating a Data Source for the Operational DB on OpsMgr 2012 R2 Reporting Server

Written by Tao Yang

Creating a data source for the OpsMgr Operational DB is a very common practice. It is required by many 3rd-party OpsMgr reports. Kevin Holman blogged the instructions here.

In my case, I’m creating a data source called OperationalDataBaseMain for my favourite report MP, the SCC Health Check Management Pack. Apart from the data source name being different between Kevin’s blog and what the SCC Health Check MP requires, Kevin configured the data source with the option “Credentials are not required”, which essentially means using the SSRS service account (the DW Reader account). The SCC Health Check MP documentation, on the other hand, suggests using the option “Windows integrated security”. Back when I configured this data source in OpsMgr 2007, both options worked.

Today I was trying to configure this data source on our newly built OpsMgr 2012 R2 test management group, where the Operational DB is hosted on one SQL server and the Data Warehouse and Reporting Server are hosted on SQL server B. Both SQL servers are running SQL 2012 SP1. I tried both authentication methods; neither worked straight away.

“Windows integrated security” option

When I chose this option, the test connection was successful, but when I tried to run a report targeting the operational DB, I got this error (and my user ID is a member of the OpsMgr admins group):

image

“Credentials are not required” option

When I chose this option, the Test Connection was unsuccessful and I got this error:

image

Login failed. The login is from an untrusted domain and cannot be used with Windows authentication.

Both SQL servers and the DW reader service account are located in the same domain.

Furthermore, an event was logged on the Operational DB SQL server’s security log:

image

This log entry indicates the DW reader account does not have the right to log on from the network (logon type 3).

Then I found out the “Access this computer from the network” right is restricted by a GPO to the following groups, and the DW reader account is not a member of any of them:

image

So I added the DW reader service account to the local “Users” group on the SQL server hosting the operational DB and tried to establish the connection again in the data source. This time, I got another error:

image

Login failed for user <DW Reader service account>.

It’s a different message, so I checked SQL security: the DW Reader account was not listed as a login in SQL. On the SQL server for the OpsMgr 2007 environment, this account is added as a login, given the public role, and not mapped to the operational DB. So I replicated this configuration on the SQL server hosting the OpsMgr 2012 R2 operational DB and tested the connection again in SSRS. Now I got yet another error:

image

Cannot open database “OperationsManager” requested by the login. The login failed. Login failed for user <DW Reader service account>.

Also, I got a similar message in SQL log:

image

Login failed for user ‘<DW Reader account>’. Reason: Failed to open the explicitly specified database ‘OperationsManager’. [CLIENT: <SSRS Server’s IP address>]

So I modified the user mapping for the DW reader account, giving it the “public” role for the OperationsManager DB:

image

After this modification, I was able to connect to the data source:

image

However, after I tried to run a report targeting the operational DB, I got another error:

image

I then modified the permissions for the DW Reader account within the Operational DB again, this time granting the “db_datareader” role:

image

Finally, after this, the report successfully ran:

image

Summary

To summarise, in my environment the DW Reader account needs to be given the following rights on the remote SQL server hosting the operational DB before the data source can be set up in the SCOM reporting server’s SSRS instance:

  • The “Access this computer from the network” (logon from the network) right
  • The “public” role on the SQL Database Engine
  • The “public” and “db_datareader” roles on the OpsMgr operational DB
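As a rough sketch, the SQL-side grants can be scripted as below. The login name is a placeholder for your DW Reader service account, and the “Access this computer from the network” right still has to be granted separately via GPO or local security policy. This reflects what worked in my environment; please test in yours first.

```sql
-- Run on the SQL instance hosting the OperationsManager DB.
-- DOMAIN\OMDWReader is a placeholder for the DW Reader service account.
CREATE LOGIN [DOMAIN\OMDWReader] FROM WINDOWS;  -- server-level login ("public" is implicit)

USE OperationsManager;
CREATE USER [DOMAIN\OMDWReader] FOR LOGIN [DOMAIN\OMDWReader];  -- map the login to the DB
ALTER ROLE db_datareader ADD MEMBER [DOMAIN\OMDWReader];        -- read access for reports
```

The ALTER ROLE … ADD MEMBER syntax requires SQL 2012 or later, which matches the SQL 2012 SP1 servers in this management group.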

Note:

Although I have checked 2 identical OpsMgr 2012 R2 test management groups we have here, and in both of them the DW reader account has no SQL rights on the operational DB SQL server by default, I cannot verify or confirm whether this scenario is by design. In my home lab, because the operational and DW databases are on the same SQL instance, I did not have to modify permissions for the DW reader account at all.

I did not consult with any Microsoft people. This is purely just my personal experience. If you have any doubts, please do not implement this in your environment.

“Data Aggregation” field became empty in OpsMgr 2007 linked Performance Report

Written by Tao Yang

Today I needed to change some parameters in a linked performance report in one of the OpsMgr 2007 R2 management groups at work. When I opened the report properties, the Data Aggregation field had somehow become blank and greyed out:

image

As a result, when I tried to run the report, I got this error:

image

I found a blog article from Dieter Wijckmans, “SCOM 2007: Scheduled reports failing”, which indicates the cause is duplicate management group IDs specified in the report parameters. Dieter’s fix doesn’t really apply to me, as my report is not a scheduled report; however, my approach is much easier.

Below is what I did:

1. Log on to the RMS box and run the PowerShell script below to get the management group ID:

$RMS = $env:computername
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.EnterpriseManagement.OperationsManager.Common") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.EnterpriseManagement.OperationsManager") | Out-Null

$MGConnSetting = New-Object Microsoft.EnterpriseManagement.ManagementGroupConnectionSettings($RMS)
$MG = New-Object Microsoft.EnterpriseManagement.ManagementGroup($MGConnSetting)
$MG.Id

image

Note:

If the RMS is clustered, replace “$env:computername” with the RMS cluster name on the first line.

2. Export the management pack (assuming the linked report is stored in an unsealed MP) and open the unsealed MP in a text editor

3. Go to the problematic linked report’s parameters:

<Reporting>

  <LinkedReports>

    <LinkedReport ID=…> (where ID is the ID of the problematic report)

      <Parameters>

Find the ManagementGroupId parameter and delete the incorrect value
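For illustration, the parameter block looks something like the sketch below (element names follow the MP reporting schema; the GUID is a placeholder, not a value from this environment). Remove the value that doesn’t match the ID returned by the script in step 1:

```xml
<Parameters>
  <Parameter>
    <Name>ManagementGroupId</Name>
    <!-- placeholder GUID: keep only the value matching $MG.Id from step 1 -->
    <Value>00000000-0000-0000-0000-000000000000</Value>
  </Parameter>
</Parameters>
```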

image

4. Save the XML and import the unsealed MP back to the management group.

After importing the MP back, the “Data Aggregation” field is populated:

image

Note:

I can also change the report parameter in SSRS web site:

image

image

However, modifying the report directly in SSRS is only a temporary fix: the original MP is not fixed, and it will overwrite the report definition in SSRS. I tried updating SSRS directly, and the report was changed back shortly after.

OpsMgr 2012 App Advisor Web Console Failed To Run Reports After DW Database Move

Written by Tao Yang

Previously I’ve blogged my experience migrating from OpsMgr 2012 RTM to SP1, which involved moving all databases to a new database server running SQL 2012. Since then, I’ve noticed the instructions for moving the Operational DB published on TechNet are actually missing a step, which I’ve blogged here.

Over the last few days, since it was the long weekend down here in Victoria, I spent some time setting up the Talking Heads web application to explore APM in OpsMgr 2012. I was following the instructions in Chapter 9 of the Mastering System Center 2012 Operations Manager book. The instructions in this chapter are exactly the same (almost word for word) as the 3-part series on Kevin Greene’s blog (Part 1, Part 2, Part 3) – looks like we know who the author of this chapter is. Thanks Kevin!

Anyway, I’ve learnt a lot from this chapter, as I’d never dealt with Avicode in 2007 or APM in 2012. While I was exploring the App Diagnostics and App Advisor consoles, I noticed my App Advisor console wouldn’t run any reports! No matter which report I tried to run, I always got this error:

image

and as suggested, I got this warning event (EventID 1309) from Windows Application log:

image

In the exception stack, it shows I cannot connect to the SQL server:

Exception message: An error has occurred during report processing. —> Microsoft.ReportingServices.ReportProcessing.ProcessingAbortedException: An error has occurred during report processing. —> Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: Cannot create a connection to data source ‘AppMonitoringSource’. —> System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 – Could not open a connection to SQL Server)
at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)

After spending time trying to fix this issue, I created a DNS alias record with my old OpsMgr SQL server name pointing to the new SQL server, and it fixed the problem! This told me there was still one place where I hadn’t updated the SQL server name. I went through the TechNet instructions for moving the Operational DB, DW DB and Reporting Server again, but I couldn’t find which step I had missed.

Today, I had a second look at the SSRS instance for OpsMgr reporting, and I noticed there is an additional data source called AppMonitoringSource under Application Monitoring –> .Net Monitoring. The connection string for this data source was still pointing to the old SQL server:

image

According to the TechNet guide “How to Move the Data Warehouse Database”, as per step 8, only the Data Warehouse Main data source needs to be updated (and, of course, if we’ve manually created another data source for the operational DB, which is very common since lots of community report MPs require it, we’d update that too). The AppMonitoringSource was not mentioned in the instructions, which I found interesting because step 10 actually covers changing the DB server name in the dbo.MT_Microsoft$SystemCenter$DataWarehouse$AppMonitoring table, which, as the name suggests, is used for APM.

As a result, after I changed the connection string in this data source, deleted the DNS alias and flushed the DNS cache on my web console server, the problem is now fixed.

So I’m not sure whether the AppMonitoringSource was overlooked by the author of the TechNet guide. For anyone who has moved the DW DB in OpsMgr 2012, it’s probably worth checking this data source (by going to http://<reporting server>/reports). I only found out now because I’ve just started using APM in my environment.
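One way to check every data source in one go, rather than clicking through the Report Manager folders, is to query the SSRS ReportService2010 web service. The sketch below is an assumption on my part, not from the TechNet guide: the server URL is a placeholder, and it assumes the ReportService2010 endpoint available in SSRS 2008 R2 and later.

```powershell
# Sketch: list every SSRS data source with its connection string, so stale
# server names (like AppMonitoringSource still pointing at the old SQL
# server) are easy to spot. Replace the URL with your reporting server.
$uri = "http://yourReportingServer/ReportServer/ReportService2010.asmx"
$rs  = New-WebServiceProxy -Uri $uri -UseDefaultCredential

$rs.ListChildren("/", $true) |
    Where-Object { $_.TypeName -eq "DataSource" } |
    ForEach-Object {
        # GetDataSourceContents returns the data source definition,
        # including the (unencrypted) connection string
        $ds = $rs.GetDataSourceContents($_.Path)
        "{0} -> {1}" -f $_.Path, $ds.ConnectString
    }
```

Any data source whose connection string still references the old SQL server name needs its connection string updated after the move.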

“Cannot initialize report” Error When Running SCOM Reports

Written by Tao Yang

Yesterday, I noticed I get this error when I tried to run some reports in the System Center Central Health Check Reports MP in my home SCOM 2007 R2 environment:

“Value of ‘01-Jan-01 12:00:00 AM’ is not valid for ‘Value’. ‘Value’ should be between ‘MinDate’ and ‘MaxDate’.”

image

I then realised I got exactly the same error on any report that contains date-time pickers.

image

Long story short, after spending some time troubleshooting this issue, including updating my SCOM 2007 R2 environment from CU5 to CU6 as this blog post has suggested:

SCOM 2007 R2 – Reporting : Value of ’1/01/0001 12:00:00 AM’ is not valid for ‘Value’. ‘Value’ should be between ‘MinDate’ and ‘MaxDate’

The CU6 update didn’t fix my issue. The issue ended up being the regional settings on my SCOM all-in-one box (RMS, databases and reporting server).

I live in Australia, so normally I would set the language format to English (Australia). However, when I create a new management pack using the SCOM 2007 R2 Authoring Console on a computer set to the English (Australia) language format, by default two language packs are created in the management pack: ENU (English United States) and ENA (English Australia), with the default language pack set to ENA. I don’t always remember to change the default and delete the ENA language pack for every management pack I work on. So, to work around this, on the two machines I run the Authoring Console from (one being the all-in-one SCOM server), I set the language format to English (United States) and modified the short date format from the default M/d/yyyy to dd-MMMM-yy, so I don’t confuse the month and the day when reading a date. This modified date format turned out to be the cause of the error in SCOM reports.
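For illustration, this is roughly the MP XML the Authoring Console generates in that situation (a sketch only; display strings are omitted). Deleting the ENA pack, or flipping the IsDefault attributes, avoids ENA becoming the default:

```xml
<LanguagePacks>
  <LanguagePack ID="ENA" IsDefault="true">
    <DisplayStrings>
      <!-- display strings generated for English (Australia) -->
    </DisplayStrings>
  </LanguagePack>
  <LanguagePack ID="ENU" IsDefault="false">
    <DisplayStrings>
      <!-- display strings generated for English (United States) -->
    </DisplayStrings>
  </LanguagePack>
</LanguagePacks>
```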

Default English (United States) formats:

image

Modified formats:

image

After I changed the format back to the default on my SCOM server, the reports started running! Since all the SCOM server roles are on the same box, I can’t confirm which component (i.e. RMS, SSRS, etc.) relies on the default language formats.

Using SCOM To Count Logs and Produce Reports

Written by Tao Yang

Recently, I’ve been asked twice to produce daily reports that involve counting some kind of logs:

Scenario 1:

The support team needs to count the number of Application event log entries with a specific event ID. A daily report is required to list the number for each computer.

Scenario 2:

An application produces a log file each day. The support team needs to count the number of times a specific phrase appeared in the previous day’s log file. A daily report is required to list the count for each computer.

The solutions I produced for both scenarios are very similar, so I thought I’d blog this one.

Solution from High level View:

  1. Create a rule in the SCOM management pack that runs once a day.
  2. Within the rule, use a script to count the log entries.
  3. Map the count to performance data and save it in the SCOM operational and data warehouse DBs.
  4. Design a report for the raw performance data in SQL SRS Report Builder.
  5. Save the report into the management pack.
  6. Schedule the report to run and be emailed out once a day, AFTER the rule has run for the day.

In this blog post, I’m not going to go through the steps of creating the custom data source module and the performance collection rule. They are pretty straightforward and the sample management pack can be downloaded HERE.

I will however go through the steps to create the custom report for the data collected by this rule. I’m creating the report rather than using the built-in performance reports from the “Microsoft Generic Report Library” because none of the built-in performance reports support a table format. I don’t want any fancy charts with the report. All I want is a simple list of the raw perf counter values.

Now, let’s briefly go through the data source module and the performance collection rule.

Data Source Module: contains 2 members: System.Scheduler and Microsoft.Windows.PowerShellPropertyBagTriggerOnlyProbe:

image

The Microsoft.Windows.PowerShellPropertyBagTriggerOnlyProbe member contains a PowerShell script that counts event log entries and passes the count into a PropertyBag:

#===========================================================================================
# AUTHOR:  Tao Yang
# DATE:    30/01/2012
# Version: 1.0
# COMMENT: Count for a particular event in event log and pass the count to property bag
#===========================================================================================
Param ([int]$TimeFrameInHours, [string]$LogName, [int]$EventID, [string]$EventSource)

$StartTime = (Get-Date).AddHours(-$TimeFrameInHours)
$iEventCount = 0
Try {
    # Retrieve matching events logged since the start of the time frame
    $Events = Get-EventLog -LogName $LogName -After $StartTime -Source $EventSource | Where-Object {$_.EventID -eq $EventID}
    Foreach ($Event in $Events)
    {
        If ($Event -ne $null) { $iEventCount++ }
    }
} Catch {
    $iEventCount = 0
}

$ComputerName = (Get-WmiObject Win32_ComputerSystem).Caption
$oAPI = New-Object -ComObject "MOM.ScriptAPI"
$oAPI.LogScriptEvent("Event-Count.PS1",9999,0,"Start EventID $EventID Perf Collection Rule. Collecting $EventID events since $StartTime...")

# Pass the results back to OpsMgr in a property bag
$oBag = $oAPI.CreatePropertyBag()
$oBag.AddValue('ComputerName', $ComputerName)
$oBag.AddValue('EventCount', $iEventCount)
$oBag.AddValue('TimeFrameInHours', $TimeFrameInHours)
$oBag.AddValue('LogName', $LogName)
$oBag.AddValue('EventID', $EventID)
$oBag.AddValue('EventSource', $EventSource)
$oBag

Performance Collection Rule: This rule contains:

Data Source: the data source module created previously

Condition Detection: maps the event log count in the PropertyBag to a performance counter

Write Actions: write the performance data to the Operational and DW databases.

image
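In MP XML, a rule of this shape looks roughly like the sketch below. This is an illustration rather than the exact content of the sample MP: the rule and module IDs, the parameter values, the property-bag-to-counter mappings, and the MP reference aliases (Windows, Perf, SC, SCDW) are all assumptions, while the condition detection and write action module type names are the standard OpsMgr library ones.

```xml
<Rule ID="Custom.EventCount.PerfCollection.Rule" Enabled="true"
      Target="Windows!Microsoft.Windows.Computer">
  <Category>PerformanceCollection</Category>
  <DataSources>
    <!-- the custom data source module (scheduler + PowerShell trigger-only probe) -->
    <DataSource ID="DS" TypeID="Custom.EventCount.DataSource">
      <TimeFrameInHours>24</TimeFrameInHours>
      <LogName>Application</LogName>
      <EventID>1234</EventID>
      <EventSource>MyApp</EventSource>
    </DataSource>
  </DataSources>
  <!-- turn the property bag values into performance data -->
  <ConditionDetection ID="MapToPerf" TypeID="Perf!System.Performance.DataGenericMapper">
    <ObjectName>Custom Log Count</ObjectName>
    <CounterName>Event Count</CounterName>
    <InstanceName>$Data/Property[@Name='ComputerName']$</InstanceName>
    <Value>$Data/Property[@Name='EventCount']$</Value>
  </ConditionDetection>
  <WriteActions>
    <!-- persist the counter to the operational DB and the data warehouse -->
    <WriteAction ID="WriteToDB" TypeID="SC!Microsoft.SystemCenter.CollectPerformanceData" />
    <WriteAction ID="WriteToDW" TypeID="SCDW!Microsoft.SystemCenter.DataWarehouse.PublishPerformanceData" />
  </WriteActions>
</Rule>
```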

Report:

Pre-requisites:

  • Install the Performance Report Model in the SCOM reporting SSRS instance. Here’s a detailed instruction (even though it was written for SCOM 2007 SP1, it also applies to SCOM 2007 R2): http://www.systemcentercentral.com/BlogDetails/tabid/143/IndexID/20269/Default.aspx
  • Please note that the above article uses the Event model as its example. The report I’m going to create uses the Performance model, so please make sure Performance.smdl is uploaded into the SCOM Reporting SSRS instance and configured to use the “Data Warehouse Main” data source.
  • Import the half-finished management pack (with the data source module and the perf collection rule) into a SCOM management group (preferably your development environment).
  • Create an override, or simply change the schedule of the rule to run ASAP, so the perf data is collected. This is very useful when testing the report later on.

Steps of creating the report:

01. Browse to the SCOM Reporting SSRS reports URL http://<servername>/reports

02. Launch Report Builder and click “Run” if security warning pops up

image

03. In Report Builder, choose the following options in “Getting Started” pane to create a new report:

image

04. Enter the report title:

image

05. Drag “Performance Data Raw” into the report

image

06. Under Performance Data Raw / Object, drag the “Name” field to the report

image

07. Rename the title of each row in the report table:

image

08. Right click the number under “Event Count”, select “Format…”, and change “Decimal places” to 0

image

09. Click the Filter button to create filters:

image

10. Under Performance Data Raw \ Performance Rule Instance \ Performance Rule, drag the “Rule System Name” Field to the right and choose the rule I created in the management pack from the list. (Note: the rule name appears on the list because the management pack is already imported into SCOM and this rule has already collected some performance data.)

image

11. Click on Performance Data Raw and drag “Date Time” field to the right

image

12. Click on “equals” next to “Date Time” and change it to “After”:

image

13. Choose “(n) days ago”

image

14. Change “(n)” to “2”

image

15. Click OK to exit the Filter Data window

16. Now, it’s time to test run the report. To do so, use the Run Report button on the top. Here’s the result from my test environment (Note: the date time is in UTC, NOT local time):

image

17. If you want to make the report prettier (i.e. change the font colour to pink), adjust the column width, or add a company logo, you can click the “Design Report” button and modify the report.

18. Once you are happy with the report, save it to a RDL (report definition) file:

image

19. Open up the half finished management pack (unsealed) in Authoring Console, go to Reporting workspace and create a new report:

image

20. Give the report an ID:

image

21. In the “General” tab, give the report a name and target it to “Microsoft.Windows.Computer” class

image

22. Go to “Definition” tab, click “Load content from file” and select the RDL file you’ve just created.

image

23. Once the RDL file is loaded, remove the first line, which is the XML header <?xml version="1.0" encoding="utf-8"?>

image

24. Once the first line is removed, go to “Options” tab

25. Make sure “Visible” is set to “true” and “Accessibility” is set to “public”

image

26. click apply and OK to exit the window

27. Now that the report is successfully created and tested, if you have changed the schedule of the perf collection rule (either edited the rule directly or created an override), it’s time to change the schedule back.

28. Now, if you want to keep the management pack unsealed, just import the updated management pack with the report into the SCOM management group from the Authoring Console. If you want to seal it, do so, delete the previous unsealed version from the management group first, then import the sealed version into the management group.

I always increase the version number so I can look up Event ID 1201 in the SCOM agent’s Operations Manager log and make sure the updated version of the MP has been received:

image

29. After a couple of minutes, if everything goes well, you should be able to see the report in both the Operations Console Reporting workspace and the SCOM Reporting SSRS site:

image

image

Note: In SSRS, you should also see a .mp file in the same folder. I’ve experienced issues where the report does not get updated along with the updated MP, caused by an incorrect .mp file in the SSRS directory. Please refer to my previous post for details.

30. Schedule the report in SCOM reporting (so it can be emailed out according to a schedule) if you want to. Make sure the report schedule is AFTER the rule’s schedule (i.e. if the rule runs daily at 0:00am, the report schedule should be something like daily at 0:30am); otherwise newly collected data is not included in the report.

That concludes the steps to create the report. A few other things I’d also like to mention:

  1. In my case, for the second scenario I mentioned at the beginning (reading log files), the whole process and idea are the same. The only difference is the script in the Data Source module.
  2. I could have moved the condition detection module (System.Performance.DataGenericMapper) from the rule into the data source module. I didn’t, because this way I can reuse the same data source module for other purposes later. For example, if the support team later asks me to generate alerts once the count reaches a threshold, I can simply create a separate rule (or a custom monitor type and a monitor) using the same data source. If the input parameters of the data source are the same as the existing performance collection rule’s, the data source should only run once for multiple workflows because of the Cookdown feature.
  3. If the SCOM agent computer is in maintenance mode when the perf collection rule is scheduled to run, no perf data will be collected and the computer will be missing from the report.
  4. In my example, I’m using a PowerShell script, so PowerShell needs to be installed and its execution policy appropriately configured on the SCOM agent computers. If this doesn’t meet your requirements, just modify the module to use a VBScript instead. I’ve blogged previously on how to create trigger-only probe action modules for VBScript.

Again, the sample MP and the Report Definition RDL file can be downloaded HERE.

Reports not updated in SCOM SQL Reporting Service When the Management Pack was Updated

Written by Tao Yang

I ran into an issue today. I had updated a report in a management pack. After I updated the version number, sealed it and imported the updated management pack into SCOM, the report I had modified did not get updated in SQL Reporting Service (SRS).

Generally, once a new MP is imported into a management group, the reports within the MP should be deployed to SRS within a few minutes. This was the case when I updated the very same MP in the Development environment, but in Production, I waited a few hours and nothing happened.

After a few hours, I finally fixed the issue.

For any reports that have been deployed as part of an MP, there should be a .mp file in the SRS folder, like this one:

image

In the production environment, the .mp file name in my management pack’s folder in SRS was different from the one in the development environment. I checked other management packs in both Prod and Dev, and they all have matching .mp file names.

To fix the issue: I deleted the .mp file from SRS, restarted the SRS service. Within one minute, the updated report got deployed to SCOM SQL Reporting Service and the .mp file got recreated as well.