Category Archives: Authoring

Authoring SCOM VSAE

Service Discovery and Monitoring with Operations Manager

Published by:

One of the most frequent requests we get from customers is to create monitors for application services. Often enough you will find management packs for well-known applications, but if you can't, you will need to create those yourself. For that, you have basically two options: use the provided authoring template in the SCOM console, which has been extensively described on the internet, or create monitors in the same authoring area of SCOM, but using an existing target, like Windows Operating System or Windows Computer.

The first option is good because SCOM will not only monitor the services, it will also create a discovery for them and make them available as independent objects in a State view, for example. The con of this approach is that if you have a lot of services, a lot of work will be required to create all the monitors. It also uses a lot more resources to discover the services, since for each monitored service a discovery will be added. This template is also good if you want CPU and memory monitoring for the services, which are available through the template as well.

With the second option, which is much leaner in terms of resources, the con is that the services do not become objects themselves. The monitors for each one of them will be visible in the Health Explorer only. Alerts will work normally, though.

What should you do then?

Well, there is a third option, which will require some XML editing and authoring skills. I've been using this for different customers and it has received good feedback. To build this solution, I'm using Visual Studio 2015 with the Management Pack Authoring extensions.

It all starts with a Class definition:

<ClassType ID="Company.Application.Class.Computer" Accessibility="Public" Abstract="false" Base="Windows!Microsoft.Windows.ComputerRole" Hosted="true" Singleton="false" />

This one defines a computer class that will host the services. And now the services themselves:

<ClassType ID="Company.Application.Class.Service" Accessibility="Public" Abstract="false" Base="Windows!Microsoft.Windows.LocalApplication" Hosted="true" Singleton="false">
  <Property ID="ServiceName" Type="string" Key="false" CaseSensitive="false" MaxLength="256" MinLength="0" />
  <Property ID="ServiceDisplayName" Type="string" Key="true" CaseSensitive="false" MaxLength="256" MinLength="0" />
  <Property ID="ServiceProcessName" Type="string" Key="false" CaseSensitive="false" MaxLength="256" MinLength="0" />
  <Property ID="StartMode" Type="string" Key="false" CaseSensitive="false" MaxLength="256" MinLength="0" />
  <Property ID="LogOnAs" Type="string" Key="false" CaseSensitive="false" MaxLength="256" MinLength="0" />
</ClassType>

Next, I will need two discoveries: one to discover the computers and another one to discover the services. This could be condensed into a single script discovery, but WMI is less expensive than scripts in terms of CPU cycles.

First the computer discovery:

image

Make sure you pick the right service prefix in the WMI query, so it properly identifies the computers that belong to that class.

This discovery will scan all computers that are part of the Windows Server Operating System class every 15 minutes. Once a machine with the services mentioned above is found, a new instance of the Company.Application.Class.Computer class will be created.
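Since the screenshot doesn't reproduce well, here is a rough sketch of what such a WMI-based discovery could look like. This is an assumption built on the stock WMI snapshot data mapper; the IDs, namespace and service prefix are placeholders you would adapt:

```xml
<Discovery ID="Company.Application.Discovery.Computer" Enabled="true"
           Target="Windows!Microsoft.Windows.Server.OperatingSystem"
           ConfirmDelivery="false" Remotable="true" Priority="Normal">
  <Category>Discovery</Category>
  <DiscoveryTypes>
    <DiscoveryClass TypeID="Company.Application.Class.Computer" />
  </DiscoveryTypes>
  <DataSource ID="DS" TypeID="Windows!Microsoft.Windows.WmiProviderWithClassSnapshotDataMapper">
    <NameSpace>root\cimv2</NameSpace>
    <!-- Pick a prefix that uniquely identifies your application's services -->
    <Query>SELECT Name FROM Win32_Service WHERE Name LIKE 'MyAppPrefix%'</Query>
    <Frequency>900</Frequency>
    <ClassId>$MPElement[Name="Company.Application.Class.Computer"]$</ClassId>
    <InstanceSettings>
      <Settings>
        <Setting>
          <Name>$MPElement[Name="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Name>
          <Value>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Value>
        </Setting>
      </Settings>
    </InstanceSettings>
  </DataSource>
</Discovery>
```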

And the service discovery itself:

image

This discovery will scan all the previously discovered computers that belong to the Company.Application.Class.Computer class and look for the services according to the WMI query. Once any of the services is found, a new member of the Company.Application.Class.Service class is discovered and the properties are mapped:

image
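As the screenshot above is hard to read, here is a sketch of the kind of service discovery and property mapping described, built from the standard Win32_Service properties (Name, DisplayName, PathName, StartMode, StartName). The IDs and the query are placeholders:

```xml
<Discovery ID="Company.Application.Discovery.Service" Enabled="true"
           Target="Company.Application.Class.Computer"
           ConfirmDelivery="false" Remotable="true" Priority="Normal">
  <Category>Discovery</Category>
  <DiscoveryTypes>
    <DiscoveryClass TypeID="Company.Application.Class.Service" />
  </DiscoveryTypes>
  <DataSource ID="DS" TypeID="Windows!Microsoft.Windows.WmiProviderWithClassSnapshotDataMapper">
    <NameSpace>root\cimv2</NameSpace>
    <Query>SELECT * FROM Win32_Service WHERE Name LIKE 'MyAppPrefix%'</Query>
    <Frequency>900</Frequency>
    <ClassId>$MPElement[Name="Company.Application.Class.Service"]$</ClassId>
    <InstanceSettings>
      <Settings>
        <!-- Host key: the Windows Computer the service lives on -->
        <Setting>
          <Name>$MPElement[Name="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Name>
          <Value>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Value>
        </Setting>
        <Setting>
          <Name>$MPElement[Name="Company.Application.Class.Service"]/ServiceName$</Name>
          <Value>$Data/Property[@Name='Name']$</Value>
        </Setting>
        <Setting>
          <Name>$MPElement[Name="Company.Application.Class.Service"]/ServiceDisplayName$</Name>
          <Value>$Data/Property[@Name='DisplayName']$</Value>
        </Setting>
        <Setting>
          <Name>$MPElement[Name="Company.Application.Class.Service"]/ServiceProcessName$</Name>
          <Value>$Data/Property[@Name='PathName']$</Value>
        </Setting>
        <Setting>
          <Name>$MPElement[Name="Company.Application.Class.Service"]/StartMode$</Name>
          <Value>$Data/Property[@Name='StartMode']$</Value>
        </Setting>
        <Setting>
          <Name>$MPElement[Name="Company.Application.Class.Service"]/LogOnAs$</Name>
          <Value>$Data/Property[@Name='StartName']$</Value>
        </Setting>
      </Settings>
    </InstanceSettings>
  </DataSource>
</Discovery>
```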

Having service objects as entities by themselves makes monitoring easy, since you only need to create a single monitor that targets all the objects:

image
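A minimal sketch of such a monitor, using the stock Windows service-state monitor type and feeding it the ServiceName property from the class above (the monitor ID is a placeholder):

```xml
<UnitMonitor ID="Company.Application.Monitor.ServiceRunning" Accessibility="Public"
             Enabled="true" Target="Company.Application.Class.Service"
             ParentMonitorID="Health!System.Health.AvailabilityState"
             Remotable="true" Priority="Normal" ConfirmDelivery="false"
             TypeID="Windows!Microsoft.Windows.CheckNTServiceStateMonitorType">
  <Category>AvailabilityHealth</Category>
  <OperationalStates>
    <OperationalState ID="Running" MonitorTypeStateID="Running" HealthState="Success" />
    <OperationalState ID="NotRunning" MonitorTypeStateID="NotRunning" HealthState="Error" />
  </OperationalStates>
  <Configuration>
    <ComputerName>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$</ComputerName>
    <!-- One monitor, instantiated once per discovered service object -->
    <ServiceName>$Target/Property[Type="Company.Application.Class.Service"]/ServiceName$</ServiceName>
  </Configuration>
</UnitMonitor>
```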

And that is pretty much it. The remaining pieces of the MP are references, presentation, and display strings. Make sure to customize the IDs and messages according to your needs.

The final MP can be found here.

Hope this helps!

Authoring SCOM

Yet another update to the Extended Agent Info Management Pack

Published by:

I have recently updated my Extended Agent Info MP to include information about Operations Management Suite (OMS). I have now added a task to configure the agent to use OMS.

The new task shows in the tasks pane when you click on any agent or agents in the Extended Agents View:

image

Once you click it, you will need to override the name (which should say GUID, I know) of the workspace and the key to that workspace:

image

Once configured, click Override and then Run. Once completed, the agent (as long as it supports OMS, i.e. version 7.2 and higher) will be configured as below:

image

I have also fixed an issue where, once a management group was removed, the Monitoring Service wouldn't start. I found this great piece of code here, from Matty T, and have incorporated it. Thanks Matty!

The new version can be found here!

Authoring SCSM

SCSM–A tale of two customizations

Published by:

This is a story with a happy ending. Not all Service Manager Data Warehouse customization stories end the same way. Embark with me on this incredible journey!

Once upon a time, there was a System Center Service Manager installation that required some customizations. A reference to a list of customers and the scope of the customizations was required. And so it happened. Management Pack 1 was created and implemented. Victory!

Months later, filled with courage, a new modification was requested. This time, with new properties and interface customizations. And so it happened. Management Pack 2 came to be.

The previous endeavors, however, were not enough and more customizations were required. To accomplish them, not only did new fields need to be created, but the interface had to be modified. As you may or may not know, Service Manager will only support a single management pack with customizations for a given form. In light of that, and considering the prior customizations were not being fully used, it was agreed that part of the previous modifications, MP2, was to be removed and redesigned. And that's when the problems began.

As soon as MP2 was removed and the Data Warehouse jobs started to run, errors showed up:

clip_image002

clip_image004

clip_image006

And the funny part is that the errors referred to columns defined in MP1 (not MP2, the one that was removed) as invalid. That puzzled all the wizards of the realm. Again, since the service requests were not being fully used, it was decided that MP1 and all related MPs were to be removed. To start fresh.

However, that didn’t help. The errors continued. And no data was being loaded into the data warehouse, not even regarding other types of workitems.

In the past, the villagers had asked for the help of the mighty gods of Service Manager and many times had heard: “Thou shalt drop the data warehouse and attach a new one”.

Fearing the worst, the frightened SCSM admin decided to re-import both management packs. And reboot the DW servers, of course. That had a strange effect. Now the load jobs would run, but still no transform job. And even weirder, the properties generating the errors were now the ones defined in MP2:

image

Despair and fear took over the administrator. The council had already ruled that it would be best to sacrifice 3 years of data to save the daily operations and reporting. In a last and desperate attempt, the SCSM admin removed all the MPs once more. Rebooted the Data Warehouse server. Poured a glass of the finest ale and waited.

And then the miracle came. No more errors in the data warehouse jobs. All fully synchronized. All the data backlog (8 days) was synchronized successfully. A great relief took over the NOC and all the analysts danced, feasted and drank (soft drinks) all night to celebrate. And they lived happily ever after.

It might not be something that you can do in your installation, but it fixed our problem with the data.

True story.

Hope this helps!

PS: an old enemy was lurking in the dark: the Cubes! But that is another story.

Authoring SCSM

SCSM–Using PowerShell to Create DW Outriggers and Dimensions

Published by:

image

Update (Dec 3rd, 2015): Found a noob/doing-this-late-at-night bug in the script. Fixed it. Now the option to create a dimension or not works. Nothing like eating your own dog food…

Certain technologies have such complexity when we first look at them that, as Carl Sagan said before, they are indistinguishable from magic. However, as history has proven, if you spend time studying and understanding them, these phenomena won't be so mystifying anymore. For me, although I have been working with SCSM for a while, sending custom information to the Data Warehouse was always something I would avoid as much as I could. However, often enough, there is no way around it.

So, I decided to face the challenge and not just produce one-offs. I wanted something I could use multiple times.

With that said, I’m not covering all possibilities here. Service Manager has endless opportunities for expansion, which won’t always be easy to implement. Here’s the scenario I wanted to have ready, since it happens often enough in SCSM Implementations:

– The customer needs a new field in a form, typically Incidents or Service Requests. Something like a list of customers, internal departments, etc.

– You use the Authoring Console to create the new fields and lists

– Once the new (sealed) MP is imported, you can see text fields in the warehouse DB, but all the lists (enumerations) contain GUIDs.

To fix this issue, Service Manager requires you to create what is called an outrigger, very well described here. It is, however, a dry subject and it takes a while to really sink in.

Let’s look at an example:

Here’s my new class property, on top of incidents. I actually have a couple more, but I’m focusing on the Clients one:

image

If you look at the database (data warehouse):

image

Notice Clients is there, but:

image

Not too helpful. Enter the outrigger:

image

It is actually simpler than it looks, but can be confusing. The key part is the line:

<Attribute ID="Clients" PropertyPath="$Context/Property[Type='ref1!ClassExtension_ced5b84f_54a9_4ff5_b681_0d071e879d94']/Clients$" />

To compose that line, you need to make sure you have a reference (called ref1 here) to the MP that you used to extend the class.

image

You need to know the PublicKeyToken as well to write this manually.

You also need to know the name of the extension of the (incident) class from your original MP:

image
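Putting those pieces together, the reference and the outrigger might look roughly like this. The MP ID, version and token below are placeholders for your own extension MP; the Attribute line reuses the example from above:

```xml
<!-- Reference to the sealed MP that contains the class extension
     (ID, Version and PublicKeyToken are placeholders) -->
<Reference Alias="ref1">
  <ID>Your.Extension.ManagementPack</ID>
  <Version>1.0.0.0</Version>
  <PublicKeyToken>your_public_key_token</PublicKeyToken>
</Reference>

<!-- The outrigger itself, in the Warehouse section of the DW MP -->
<Warehouse>
  <Outriggers>
    <Outrigger ID="IncidentClients" Accessibility="Public">
      <Attribute ID="Clients" PropertyPath="$Context/Property[Type='ref1!ClassExtension_ced5b84f_54a9_4ff5_b681_0d071e879d94']/Clients$" />
    </Outrigger>
  </Outriggers>
</Warehouse>
```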

Fortunately, you can get all this information using PowerShell and that’s exactly what I did in my script.

The script also allows you to generate a dimension, which is a sort of new composite class in the DW. There is a flag at the beginning of the script.
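For reference, an inferred dimension declaration can look roughly like this. This is only a sketch under my own assumptions: the ID is a placeholder, and the Target reuses the class extension reference from the outrigger example above:

```xml
<!-- Sketch of an inferred dimension over the extended class -->
<Dimension ID="IncidentExtensionDim" Accessibility="Public" InferredDimension="true"
           Target="ref1!ClassExtension_ced5b84f_54a9_4ff5_b681_0d071e879d94"
           HierarchySupport="Exact" Reconcile="true" />
```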

Once generated, the script will also try to seal the MP according to the configuration in the header of the script:

image

Once you run it, it will prompt you for the class you need to create the outrigger from:

image

It looks a bit messy, but the PowerShell grid view allows you to quickly search, so note the Identifier on the right:

image

Once you select and click OK, you'll be prompted for the field or fields you want to add as outriggers:

image

Again, a bit messy to look at, but effective.

And there you have it.

Now you have a new table in the database (DWDatamart):

image

That you can relate by the EnumTypeId field:

image

And you can use for parameters in Reporting Services. Neat, huh?

You can find the script here!

 

Points for improvement: there are a lot more things that can be done, including fact tables for relationships, more options, error control and cube data (next version), but this is version 1.0 and I think it will help with a common/simple task.

A few tips: once you import your new class, make sure you synchronize the DW once before importing the outrigger extensions. You can use this script from Travis Wright to speed up the process. Also make sure you run it in a Service Manager PowerShell window, as administrator. If FastSeal fails for some reason, you HAVE to fix it in order to seal the MP. You can't use unsealed MPs in the Data Warehouse.

 

Hope this helps!

Authoring SCOM

Issue with SCOM Run As Account

Published by:

I recently had an issue with my custom file share monitor, but I believe it can happen to any Run As account/profile. My MP has a Run As profile used to run the PowerShell commands:

image

When installing this at a customer, we re-purposed an existing Run As account by changing the account credentials. The account was then assigned to my Run As profile.

image

However, the monitor wouldn't work. Bummer! I had tested it extensively in my lab. And it is a simple monitor. So I added more debugging to the script:

image

It will then show the logged-on user while running the command.

image

To my (big) surprise, the account running the monitoring was the account set before the re-purposing. And yes, it had been almost four days, so this was not a case of waiting for the MPs to be updated on the agent.

So, quick solution: create a new Run As Account and assign it to the MP’s run as profile.

Fixed!

Moral of the story: you can't always trust what the Run As account credentials configuration says. There must be an issue that needs to be looked at. Maybe clearing the Health Service store would make the agent download the correct information.

Hope this helps!

 

Take the time and get an Azure subscription or an MSDN subscription, as well as a night at the movies if you are in Canada!

Authoring SCOM

SCOM Distributed Application Object Location

Published by:

Often enough I find myself asked where certain types of objects can be found in SCOM when creating a Distributed Application. It seems straightforward, but the location of some of them can take you a few minutes to find. So here goes a summary of objects I find useful:

Windows Computer
Object -> Configuration Item -> Logical Entity -> Device -> Computer -> Windows Computer

Web Application Monitors
Object -> Configuration Item -> Logical Entity -> Perspective -> Web Application Perspective

Web Availability Monitors
Object -> Configuration Item -> Logical Entity -> Perspective -> Web Application Availability Monitoring Test Base

SQL Jobs
Object -> Configuration Item -> Logical Entity -> Application Component -> Windows Application Component -> SQL Component -> SQL Agent Job

Windows Services
Object -> Configuration Item -> Logical Entity -> Local Application -> Windows Local Application -> Windows Local Service -> Windows Service

Distributed Applications (User Created)
Object -> Configuration Item -> Logical Entity -> Service -> User Created Distributed Application

TCP Ports
Object -> Configuration Item -> Logical Entity -> Perspective -> TCP port check Perspective

Databases (SQL)
Object -> Configuration Item -> Logical Entity -> Application Component -> Database -> SQL Database

Clusters
Object -> Configuration Item -> Logical Entity -> Group -> Windows Cluster

Hope this helps!

 


Authoring Powershell SCOM

Quick and Dirty: a handy SQL Query PS Rule

Published by:

Very often I get requests to monitor a remote SQL Server as a synthetic transaction, and I normally end up creating something custom. SCOM does have an OLE DB template, which can come in handy, but it can create a bit of overhead from a class and objects perspective. The approach here is to create a disabled rule, targeted at a common, existing class, and enable it wherever necessary on the computers selected to be the watcher nodes. It doesn't look as neat as an entry in the console, but, hey, do you really need to know how sausages are made to enjoy the hot dog?

Let's get down to it. One thing that is often forgotten is authentication: when running the query, integrated authentication is usually a good choice, and a Run As profile to assign an account for the query will surely be needed.

1. For that, you will need a Secure Reference and its respective display string.

<TypeDefinitions>

<SecureReferences>
      <SecureReference ID="ABC.Application.RunAsProfileSQLQueries" Accessibility="Public" Context="Windows!Microsoft.Windows.Computer" />
</SecureReferences>

</TypeDefinitions>

<DisplayStrings>

<DisplayString ElementID="ABC.Application.RunAsProfileSQLQueries">
  <Name>ABC Application RunAsProfile for SQL Queries</Name>
</DisplayString>
</DisplayStrings>

2. Since this is a custom rule, let's start by creating a scripting probe:

  <TypeDefinitions>

<ModuleTypes>

      <ProbeActionModuleType ID="ABC.Application.Probe.GenericSQLQueryPS" Accessibility="Public" RunAs="ABC.Application.RunAsProfileSQLQueries" Batching="false" PassThrough="false">
        <Configuration>
          <xsd:element minOccurs="1" name="SQLInstance" type="xsd:string" />
          <xsd:element minOccurs="1" name="Database" type="xsd:string" />
          <xsd:element minOccurs="1" name="strQuery" type="xsd:string" />
        </Configuration>
        <ModuleImplementation Isolation="Any">
          <Composite>
            <MemberModules>
              <ProbeAction ID="Probe" TypeID="Windows!Microsoft.Windows.PowerShellPropertyBagTriggerOnlyProbe">
                <ScriptName>PSSQLProbe.ps1</ScriptName>
                <ScriptBody><![CDATA[param([string]$SQLInstance,[string]$strQuery,[string]$Database)

$oAPI = New-Object -ComObject "MOM.ScriptAPI"
$oBag = $oAPI.CreatePropertyBag()

$strServer = "$SQLInstance"
$SQLQuery = "$strQuery"
$oAPI.LogScriptEvent("PSSQLProbe.ps1", 555, 0, "Preparing query against $SQLInstance on Database $Database with query: $SQLQuery")
$ADOCon = New-Object -ComObject "ADODB.Connection"
$oResults = New-Object -ComObject "ADODB.Recordset"
$adOpenStatic = 3
$adLockOptimistic = 3
$ADOCon.Provider = "sqloledb"
$ADOCon.ConnectionTimeout = 60
$conString = "Server=$strServer;Database=$Database;Integrated Security=SSPI"
try {
    $ADOCon.Open($conString)
}
catch {
    $oAPI.LogScriptEvent("PSSQLProbe.ps1", 555, 1, "Error connecting. Constring: $conString Error: $error")
}
if ($ADOCon.State -ne 0)
{
    # Measure-Command discards pipeline output, so only measure the query here
    # and emit the property bag after the measurement completes.
    $time = Measure-Command {
        try {
            $oResults.Open($SQLQuery, $ADOCon, $adOpenStatic, $adLockOptimistic)
            $oAPI.LogScriptEvent("PSSQLProbe.ps1", 555, 0, "Successfully executed query against $SQLInstance on Database $Database")
        }
        catch {
            $oAPI.LogScriptEvent("PSSQLProbe.ps1", 555, 1, "Error executing query against $SQLInstance on Database $Database with query $SQLQuery")
        }
    }
    # Recordset state 0 means the query failed to open
    if ($oResults.State -ne 0)
    {
        if (!$oResults.EOF)
        {
            $oBag.AddValue('RecordCount', $oResults.RecordCount)
        }
        else
        {
            $oBag.AddValue('RecordCount', 0)
        }
        $oBag.AddValue('TransactionTimeMS', $time.TotalMilliseconds)
        $oBag
        $oResults.Close()
    }
    $ADOCon.Close()
}]]></ScriptBody>
                <Parameters>
                  <Parameter>
                    <Name>SQLInstance</Name>
                    <Value>$Config/SQLInstance$</Value>
                  </Parameter>
                  <Parameter>
                    <Name>strQuery</Name>
                    <Value>$Config/strQuery$</Value>
                  </Parameter>
                  <Parameter>
                    <Name>Database</Name>
                    <Value>$Config/Database$</Value>
                  </Parameter>
                </Parameters>
                <TimeoutSeconds>60</TimeoutSeconds>
              </ProbeAction>
            </MemberModules>
            <Composition>
              <Node ID="Probe" />
            </Composition>
          </Composite>
        </ModuleImplementation>
        <OutputType>System!System.PropertyBagData</OutputType>
        <TriggerOnly>true</TriggerOnly>
      </ProbeActionModuleType>

  </ModuleTypes>
</TypeDefinitions>

<DisplayStrings>

<DisplayString ElementID="ABC.Application.Probe.GenericSQLQueryPS">
  <Name>ABC Application Probe Generic SQL Query PS</Name>
</DisplayString>

</DisplayStrings>

The probe has the three basic parameters: SQL instance (or server), database, and SQL query.

3. With the probe ready, you need a data source module:

<TypeDefinitions>

<ModuleTypes>

<DataSourceModuleType ID="ABC.Application.DataSource.GenericSQLQueryPS" Accessibility="Public" RunAs="ABC.Application.RunAsProfileSQLQueries" Batching="false">
  <Configuration>
    <xsd:element minOccurs="1" name="IntervalSeconds" type="xsd:integer" />
    <xsd:element minOccurs="0" name="SyncTime" type="xsd:string" />
    <xsd:element minOccurs="1" name="SQLInstance" type="xsd:string" />
    <xsd:element minOccurs="1" name="Database" type="xsd:string" />
    <xsd:element minOccurs="1" name="strQuery" type="xsd:string" />
  </Configuration>
  <OverrideableParameters>
    <OverrideableParameter ID="SQLInstance" Selector="$Config/SQLInstance$" ParameterType="string" />
    <OverrideableParameter ID="Database" Selector="$Config/Database$" ParameterType="string" />
    <OverrideableParameter ID="strQuery" Selector="$Config/strQuery$" ParameterType="string" />
    <OverrideableParameter ID="IntervalSeconds" Selector="$Config/IntervalSeconds$" ParameterType="int" />
    <OverrideableParameter ID="SyncTime" Selector="$Config/SyncTime$" ParameterType="string" />
  </OverrideableParameters>
  <ModuleImplementation Isolation="Any">
    <Composite>
      <MemberModules>
        <DataSource ID="scheduler" TypeID="System!System.SimpleScheduler">
          <IntervalSeconds>$Config/IntervalSeconds$</IntervalSeconds>
          <SyncTime />
        </DataSource>
        <ProbeAction ID="SQLProbe" TypeID="ABC.Application.Probe.GenericSQLQueryPS">
          <SQLInstance>$Config/SQLInstance$</SQLInstance>
          <Database>$Config/Database$</Database>
          <strQuery>$Config/strQuery$</strQuery>
        </ProbeAction>
      </MemberModules>
      <Composition>
        <Node ID="SQLProbe">
          <Node ID="scheduler" />
        </Node>
      </Composition>
    </Composite>
  </ModuleImplementation>
  <OutputType>System!System.PropertyBagData</OutputType>
</DataSourceModuleType>

  </ModuleTypes>
</TypeDefinitions>

<DisplayStrings>

<DisplayString ElementID="ABC.Application.DataSource.GenericSQLQueryPS">
  <Name>ABC Application DataSource Generic SQL Query PS</Name>
</DisplayString>

</DisplayStrings>

4. Now, the last part: the rule itself, which leverages the data source directly and adds a condition detection and an alert action:

<Monitoring>

<Rules>

<Rule ID="ABC.Application.Rule.SQLQuery.Test" Enabled="false" Target="Windows!Microsoft.Windows.Server.OperatingSystem" ConfirmDelivery="true" Remotable="true" Priority="Normal" DiscardLevel="100">
  <Category>Custom</Category>
  <DataSources>
    <DataSource ID="DS" RunAs="ABC.Application.RunAsProfileSQLQueries" TypeID="ABC.Application.DataSource.GenericSQLQueryPS">
      <IntervalSeconds>300</IntervalSeconds>
      <SQLInstance>your_SQL_Server_Instance</SQLInstance>
      <Database>Database</Database>
      <strQuery>SELECT ETC FROM ETC</strQuery>
    </DataSource>
  </DataSources>
  <ConditionDetection ID="Filter" TypeID="System!System.ExpressionFilter">
    <Expression>
      <And>
        <Expression>
          <SimpleExpression>
            <ValueExpression>
              <XPathQuery Type="String">Property[@Name='RecordCount']</XPathQuery>
            </ValueExpression>
            <Operator>Greater</Operator>
            <ValueExpression>
              <Value Type="String">0</Value>
            </ValueExpression>
          </SimpleExpression>
        </Expression>
        <Expression>
          <SimpleExpression>
            <ValueExpression>
              <XPathQuery Type="String">Property[@Name='TransactionTimeMS']</XPathQuery>
            </ValueExpression>
            <Operator>Less</Operator>
            <ValueExpression>
              <Value Type="String">5000</Value>
            </ValueExpression>
          </SimpleExpression>
        </Expression>
      </And>
    </Expression>
  </ConditionDetection>
  <WriteActions>
    <WriteAction ID="Alert" TypeID="Health!System.Health.GenerateAlert">
      <Priority>1</Priority>
      <Severity>2</Severity>
      <AlertMessageId>$MPElement[Name="AlertMessageID50555aef48434eeea982400717e04b15"]$</AlertMessageId>
      <Suppression>
        <SuppressionValue>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/NetbiosComputerName$</SuppressionValue>
      </Suppression>
    </WriteAction>
  </WriteActions>
</Rule>

</Rules>

</Monitoring>

<Presentation>
  <StringResources>
    <StringResource ID="AlertMessageID50555aef48434eeea982400717e04b15" />

  </StringResources>

</Presentation>
<DisplayStrings>

<DisplayString ElementID="AlertMessageID50555aef48434eeea982400717e04b15">
          <Name>APP SQL Query Error – SQL server unavailable</Name>
          <Description>No records were returned when querying the server, or the query took too long</Description>
        </DisplayString>

</DisplayStrings>

 

And there you go. All you have to do is put all that into Visual Studio Authoring Extensions, mix, stir and poof! You've got your MP.

Make sure you read my previous VSAE related posts.

If you just want to get the MP, get it here! Don't forget to create more rules for different queries (or create overrides) and to override the rule to enable it on the computers you want to be the watcher nodes. I also strongly recommend sealing the management pack, just so you can update it freely and still keep your overrides.
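For example, enabling the disabled rule for your watcher nodes is a plain property override, and the query can be swapped per node with a configuration override. A sketch, where the IDs, context, and values are placeholders:

```xml
<Overrides>
  <!-- Enable the disabled rule (scope it to a group or specific
       instances via Context/ContextInstance) -->
  <RulePropertyOverride ID="ABC.Application.Override.EnableSQLQueryRule"
                        Context="Windows!Microsoft.Windows.Server.OperatingSystem"
                        Enforced="false"
                        Rule="ABC.Application.Rule.SQLQuery.Test"
                        Property="Enabled">
    <Value>true</Value>
  </RulePropertyOverride>
  <!-- Point the watcher node at the right instance (or override strQuery) -->
  <RuleConfigurationOverride ID="ABC.Application.Override.SQLQueryInstance"
                             Context="Windows!Microsoft.Windows.Server.OperatingSystem"
                             Enforced="false"
                             Rule="ABC.Application.Rule.SQLQuery.Test"
                             Parameter="SQLInstance">
    <Value>MyServer\MyInstance</Value>
  </RuleConfigurationOverride>
</Overrides>
```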

Hope this helps!

Authoring SCOM

SCOM: Recording a Web Transaction Session with SCOM 2012 R2 64bits Console (Re-post)

Published by:

image

This is just a re-post. I seldom do that, but I believe it is worth it. I find it so cumbersome and clumsy of Microsoft that you have to do all that, so it is worth re-posting.

It had been a while since my last web transaction recording and my old tricks didn't work, so I had to look it up and found this:

http://thoughtsonopsmgr.blogspot.ca/2014/04/om12-r2-web-recorder-ie11-how-to-get.html

Worked like a charm.

Thanks Marnix for one more.

 

Hope this helps!

Authoring SCOM

Monitoring Page Read/Write in SQL Instances

Published by:

If you are familiar with the Microsoft SQL management pack for SCOM 2012 (it can be found here), you will know that it brings a lot of performance counters out of the box. However, two counters very often used by SQL administrators to check the performance of each individual instance are not present by default.

Adding a performance counter collection rule is not a big deal in terms of SCOM authoring. It can even be done in the SCOM Operations console. In this case, however, you can't simply target the computer object and set the counter statically. Or you can, but you will need one rule per instance. Here's why: when SQL has multiple instances running on the same box, a new performance counter object is created for each instance. For example, with a single (non-named) instance of SQL, you see a service called MSSQLSERVER on your SQL computer. That will generate a counter object with that name. When you have multiple instances, different services will be created with different names, like:

image

Interestingly enough, the service name is not exactly the same as the performance counter object name. It works like this for multiple instances:

image

But when you have a single instance, the counter object is named just SQLServer (not MSSQLServer).

image

Luckily, when SCOM discovers the SQL DB Engine instance, it creates a property called Performance Counter Object Name:

image

That really solves the problem. Without it, I would need to manually refer to the object name when creating a performance rule. Now I can simply create a generic rule, targeting all SQL DB Engines and using a dynamically composed object name. For that, you will need SCOM's current management pack authoring tool: Visual Studio with the SCOM authoring extensions installed (or your preferred XML editor!). Here are my management pack fragments:

Library Reference:

<Reference Alias="MSL">
  <ID>Microsoft.SQLServer.Library</ID>
  <Version>6.5.1.0</Version>
  <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
</Reference>

 

Now, the rule itself (for Page Reads/Sec):

<Rule ID="FEHSE.SQLServer.Rule.Performance.PageReadsPerSec" Enabled="false" Target="MSL!Microsoft.SQLServer.DBEngine" ConfirmDelivery="false" Remotable="true" Priority="Normal" DiscardLevel="100">
        <Category>PerformanceCollection</Category>
        <DataSources>
          <DataSource ID="DS" TypeID="Performance!System.Performance.OptimizedDataProvider">
            <ComputerName>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$</ComputerName>
            <CounterName>Page reads/sec</CounterName>
            <ObjectName>$Target/Property[Type="MSL!Microsoft.SQLServer.DBEngine"]/PerformanceCounterObject$:Buffer Manager</ObjectName>
            <InstanceName />
            <AllInstances>false</AllInstances>
            <Frequency>300</Frequency>
            <Tolerance>0</Tolerance>
            <ToleranceType>Absolute</ToleranceType>
            <MaximumSampleSeparation>1</MaximumSampleSeparation>
          </DataSource>
        </DataSources>
        <WriteActions>
          <WriteAction ID="WriteToDB" TypeID="SC!Microsoft.SystemCenter.CollectPerformanceData" />
          <WriteAction ID="WriteToDW" TypeID="SCDW!Microsoft.SystemCenter.DataWarehouse.PublishPerformanceData" />
        </WriteActions>
      </Rule>

And the Page Writes/Sec rule:

<Rule ID="FEHSE.SQLServer.Rule.Performance.PageWritesPerSec" Enabled="false" Target="MSL!Microsoft.SQLServer.DBEngine" ConfirmDelivery="false" Remotable="true" Priority="Normal" DiscardLevel="100">
        <Category>PerformanceCollection</Category>
        <DataSources>
          <DataSource ID="DS" TypeID="Performance!System.Performance.OptimizedDataProvider">
            <ComputerName>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$</ComputerName>
            <CounterName>Page writes/sec</CounterName>
            <ObjectName>$Target/Property[Type="MSL!Microsoft.SQLServer.DBEngine"]/PerformanceCounterObject$:Buffer Manager</ObjectName>
            <InstanceName />
            <AllInstances>false</AllInstances>
            <Frequency>300</Frequency>
            <Tolerance>0</Tolerance>
            <ToleranceType>Absolute</ToleranceType>
            <MaximumSampleSeparation>1</MaximumSampleSeparation>
          </DataSource>
        </DataSources>
        <WriteActions>
          <WriteAction ID="WriteToDB" TypeID="SC!Microsoft.SystemCenter.CollectPerformanceData" />
          <WriteAction ID="WriteToDW" TypeID="SCDW!Microsoft.SystemCenter.DataWarehouse.PublishPerformanceData" />
        </WriteActions>
      </Rule>

The trick is here:

<ObjectName>$Target/Property[Type="MSL!Microsoft.SQLServer.DBEngine"]/PerformanceCounterObject$:Buffer Manager</ObjectName>

Note that I'm using the PerformanceCounterObject property as part of the object name, which makes the rule completely generic, no matter which instance it runs against.
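To make the substitution concrete, here is how that ObjectName typically resolves at runtime, assuming the standard SQL counter naming described earlier:

```xml
<!-- Default instance:  PerformanceCounterObject = "SQLServer"
     so ObjectName resolves to "SQLServer:Buffer Manager".
     Named instance INST1: PerformanceCounterObject = "MSSQL$INST1"
     so ObjectName resolves to "MSSQL$INST1:Buffer Manager". -->
```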

 

Make sure to add display Strings:

<DisplayString ElementID="FEHSE.SQLServer.Rule.Performance.PageReadsPerSec">
  <Name>FEHSE SQL Server Rule Performance Page Reads Per Sec</Name>
  <Description />
</DisplayString>
<DisplayString ElementID="FEHSE.SQLServer.Rule.Performance.PageWritesPerSec">
  <Name>FEHSE SQL Server Rule Performance Page Writes Per Sec</Name>
  <Description />
</DisplayString>

And there you have it. You can download the MP here.

 

Hope this helps!

Authoring SCOM

Using to SCOM to Audit Local Administrators

Published by:

Have you ever wondered or needed to know which users are actually set as local administrators on a server or group of servers? You are in luck if those servers are managed by SCOM. It can do that for you!

For that, we will need to put together a few pieces of OpsMgr magic.

First, we need a working script that does what we need, in this case capturing the local administrator information. In these days of easy content replication, I found this neat script from Richard Mueller, written back in 2007, that seems to do the job. I then added a few SCOM-related lines, to make sure it would return the required information in a way SCOM can understand (a PropertyBag).

To capture the information, I created a new class based on the Windows Computer class (you could say I extended the Windows Computer class) and named it Fehse.AdminInventory.ExtendedComputer.

<ClassType ID="Fehse.AdminInventory.ExtendedComputer" Accessibility="Internal" Abstract="false" Base="Windows!Microsoft.Windows.Computer" Hosted="false" Singleton="false">
         <Property ID="Administrators" Type="string" Key="false" CaseSensitive="false" MaxLength="4000" MinLength="0" />
       </ClassType>

Notice I made the field 4,000 characters long. Longer fields of the string type won't be migrated into the DW. You can experiment with the richtext type as well, if you are using the new SCOM schema.

To discover the class, I'm using the script itself to gather the data and populate the Administrators property. Here are some key parts:

image

Notes: this is a lab test, so I have set it to run every 2 minutes. Don’t do that in production!
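Since the screenshot above is hard to read, here is a rough sketch of the shape of such a script-based discovery. The interval, script name and the commented VBScript lines are placeholders for illustration; the enumeration logic itself comes from Richard Mueller's script:

```xml
<Discovery ID="Fehse.AdminInventory.ExtendedComputer.Discovery" Enabled="true"
           Target="Windows!Microsoft.Windows.Computer"
           ConfirmDelivery="false" Remotable="true" Priority="Normal">
  <Category>Discovery</Category>
  <DiscoveryTypes>
    <DiscoveryClass TypeID="Fehse.AdminInventory.ExtendedComputer">
      <Property TypeID="Fehse.AdminInventory.ExtendedComputer" PropertyID="Administrators" />
    </DiscoveryClass>
  </DiscoveryTypes>
  <DataSource ID="DS" TypeID="Windows!Microsoft.Windows.TimedScript.DiscoveryProvider">
    <IntervalSeconds>86400</IntervalSeconds>
    <SyncTime />
    <ScriptName>GetLocalAdmins.vbs</ScriptName>
    <Arguments>$MPElement$ $Target/Id$ $Target/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Arguments>
    <ScriptBody><![CDATA[
' ... Richard Mueller's local administrator enumeration logic goes here ...
' SCOM-specific lines (sketch):
' Set oAPI = CreateObject("MOM.ScriptAPI")
' Set oDiscData = oAPI.CreateDiscoveryData(0, SourceId, ManagedEntityId)
' Set oInst = oDiscData.CreateClassInstance("$MPElement[Name='Fehse.AdminInventory.ExtendedComputer']$")
' oInst.AddProperty "$MPElement[Name='Windows!Microsoft.Windows.Computer']/PrincipalName$", strComputer
' oInst.AddProperty "$MPElement[Name='Fehse.AdminInventory.ExtendedComputer']/Administrators$", strAdmins
' oDiscData.AddInstance oInst
' oAPI.Return oDiscData
    ]]></ScriptBody>
    <TimeoutSeconds>300</TimeoutSeconds>
  </DataSource>
</Discovery>
```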

Once the discovery runs, you should be able to see something like this:

image

Well, you can't actually see a lot in this pic, right? Remember to add the Local Administrators column by personalizing the view. I have noticed the script returns data for DCs, although DCs have no local administrators. You can likely ignore it.

You can also report on that. The data is stored in a table in the DW DB. I will come back here to add the proper instructions shortly (SCOM works in mysterious ways).

You can find the MP here.

Hope this helps!