SCOM – Watching a log file for changes w/ PowerShell

Dunno what it is, but a lot of SCOM information STILL persists on the web in VBScript and referring to SCOM 2007, when 2012 R2 has been out for YEARS now!

In this blog post, I’ll walk you through monitoring a log file for changes, and throwing an alert if the log file hasn’t changed.  And we’ll do it in PowerShell, as the good saint Snover intended.

How to make a two-state monitor in PowerShell

We’re gonna make a monitor, so launch SCOM, and go to Authoring->Management Pack Objects, Monitors.  If you don’t see the Authoring tab, you’ve got a baby account and need some added perms.

Right click Monitors and choose add new Unit Monitor.


Pick Scripting – Generic, Timed Script Two State Monitor and find a Management Pack to put this bad boy in.


Give it a name and a description, and then target it to Windows Server Operating System.

FOR GOD’s SAKE DON’T CLICK MONITOR IS ENABLED.  If you do this, every instance of Windows Server OS in your company is going to start running this script.  You probably don’t want that, and instead only want one or two machines to run the script.


Instead, we’re going to create this in a disabled state, then override it to on in order to pick a single Server OS to be our watcher.  This means we’ll pick a PC, and it will run this script for us at whatever frequency we specify later.

Since SCOM is ancient, it still defaults to an example VBScript in the script body, which is pretty silly, actually.

You can put in your own name up top, I’ll call this “MonitorForTextFileChanges.ps1”

First, we can provide a param to the script to execute. This is a better idea than hard-coding the path into the body of the script, because we can create overrides later to re-use this monitor and target other systems, watching other log files for changes!

Here’s how I’m providing the path to my file. Click Parameters at the bottom:


$FileNetLog = "\\someserver\Somefile.txt"
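For that value to land in the script, the script itself needs to declare a matching parameter up top. A minimal sketch of what that looks like (the default path is just the placeholder from above, and the Write-Output line is only there so you can eyeball it when testing by hand):

```powershell
param(
    # UNC path of the log file to watch; the monitor's Parameters
    # field overrides this default when SCOM runs the script
    [string]$FileNetLog = "\\someserver\Somefile.txt"
)

# Quick sanity output for manual testing; SCOM ignores plain output
Write-Output "Watching: $FileNetLog"
```
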

Now, for the actual code.

$API = New-Object -ComObject "MOM.ScriptAPI"
$PropertyBag = $API.CreatePropertyBag()
$LogFile = Get-Item $FileNetLog

if ($LogFile.LastWriteTime -ge (Get-Date).AddMinutes(-15)){
    #good, the file has changed, let's emit a positive state
    $PropertyBag.AddValue("State", "OK")
}
else {
    #no good, the file hasn't changed, let's emit a fail
    $PropertyBag.AddValue("State", "ERROR")
}

#hand the property bag back to SCOM so it knows we're done
$PropertyBag

We hook into the SCOM Scripting host to make a property bag, which is a gross sounding thing that SCOM counts on to understand what’s happening when a monitor is running. This bag will return one property, called State, which gets set in one branch or the other of the if statement to a value of OK or ERROR.

The code is simple. Create a reference to a file and call it $LogFile, then use a simple If Greater or Equal to check whether $LogFile.LastWriteTime falls within the last 15 minutes. If it IS, return STATE:OK; if it ain’t, return STATE:ERROR. Finally, output the $PropertyBag so that SCOM knows we’re done with the script.
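If you want to sanity-check the freshness logic outside of SCOM (the MOM.ScriptAPI COM object only exists on boxes with a SCOM agent installed), here’s a minimal sketch against a throwaway file. The temp file name is just something I made up for the test:

```powershell
# Create a throwaway file so its LastWriteTime is "right now"
$testPath = Join-Path ([System.IO.Path]::GetTempPath()) 'scom-freshness-test.txt'
Set-Content -Path $testPath -Value 'hello'

# Same comparison the monitor script uses, minus the property bag
$LogFile = Get-Item $testPath
$state = if ($LogFile.LastWriteTime -ge (Get-Date).AddMinutes(-15)) { 'OK' } else { 'ERROR' }

# A file written moments ago should report OK
Write-Output $state
```
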

Finally, let’s tell SCOM how to interpret the values this will be throwing. Proceed on to ‘Unhealthy Expression’.

In this window, we need to tell SCOM which value will be an unhealthy state for this monitor. The syntax is a bit odd. Click Parameter Name and put in Property[@Name='State'], then set the Operator to Equals, and finally for the value, put in ERROR, as seen below.


Next, do the same for Healthy Expression, but set the value to be OK, as seen here.


Finally, verify that the Health Icons make sense, and then proceed to Alerts if you want to throw alerts when this monitor fails. You probably do. Even though we created the monitor disabled, we DO want to configure alerts now, so that when we override it to ‘ON’ for certain servers, we’ll get alerts without having to come back and revisit this process.


And that’s it! The final step is to simply find your monitor in the list, and then override it to enabled for a single instance of the Windows Server Operating System class, and just pick the server you’d like to monitor for changes.

Make sure to pick ‘a Specific Object of this class’

This will let us pick a single computer to run this monitor.

Ugliest ever way to censor computer names FOUND!


Make sure to check the box to turn it on!

Then click the box for ‘Override: enabled’, and pick the override value of True. Finally, save it in a management pack and you’re golden!


Couldn’t have figured this out without the help of these posts!



13 thoughts on “SCOM – Watching a log file for changes w/ PowerShell”

  1. skatterbrainzz February 19, 2016 / 5:53 pm

    Nice article! The two most common reasons I’ve seen for lack of 2012 R2 updated articles is either customers have been slow to leave 2007, or have upgraded but still use older methods like VBscript.

  2. Nithin KG April 22, 2016 / 7:56 am

    This method does not work for me, I have configured a simple two state monitor. I get this error: “The process started at 6:33:49 AM failed to create System.PropertyBagData, no errors detected in the output. The process exited with 1 ” . Do I need to do any additional setting for Datasourse to get Powershell to execute.

    By default the script runs as vbs in scom on cscript.exe process.

    I have imported the following custom MP and successfully configured this. but why scom is not allowing Powershell script to execute….

    • FoxDeploy April 22, 2016 / 8:05 am

      Make sure you don’t have any smart quotes (they would be ” symbols which are slanted).

      • Nithin KG April 22, 2016 / 9:14 am

        Thank you for the response, I did not understand the smart quotes you are referring to. Should I not use any smart quotes in Powershell script?

        I belive SCOM takes the script as vbs by default though I named my scriptname as myscript.ps1.

        I’m using SCOM 2012 R2 UR8.

    • skatterbrainzz April 22, 2016 / 4:38 pm

      Part of the reason for the delay in catching up with regards to PowerShell script for scom is SMA and Azure automation coming down the pipe.

  3. Michael July 25, 2016 / 8:30 am

    Hi Guys

    Anyone had the following error messages in the logs before?
    failed to create System.PropertyBagData, no errors detected in the output. The process exited with 1


  4. skatterbrainzz July 29, 2016 / 8:42 pm

    After thinking about this scenario again, I would approach it conditionally. If the customer owns a System Center license, or they don’t. If they don’t, then this approach is fantastic. If they do, I’d recommend at least giving Orchestrator a try, since this scenario is one of the common built-in actions which can be dragged onto the workspace to make a Runbook. And given the additional tools Orchestrator provides (as-is and via Integration Packs) it’s woefully underused in most shops.

    • FoxDeploy July 29, 2016 / 8:49 pm

      Orchestrator seemed awesome… But it’s debugging and development story is absolute garbage. I’ve spent weeks of time tracking down issues in scorch which I could have pinned down easily in a script in a day.

      Its a zombie product and worst kind of zombie product, as everyone knows it’s dead but it won’t admit it.

      SMA is a powerful alternative, but it eschews tooling entirely. In the end, I feel that I’m almost better off writing a workflow and deploying that to a cluster.

      • skatterbrainzz July 29, 2016 / 9:02 pm

        It really comes down to scale. For example, reacting to a log file change with a single task or a linear task sequence. Or, triggering a complex, conditional branched asynchronous parallel chain of sequences (multiple products, databases, environments, etc.) For fairly simple tasks scripting is great and I use it all the time. Orchestrator isn’t dead however. SMA is nice, but not quite there yet on usability side. When the cloud and on-prem worlds are more tightly integrated, and the cost models are simplified more, and customers finally get over that FUD about the whole platform shift (some have, many more have not) then SMA (or whatever it evolves into) will take over I agree. Most shops today don’t even use PS WF or DSC beyond a lab or proof of concept. The problem stems from staffing constraints, which rippled into fire-extinguishing mode, which means staff can’t relax long enough to learn new tricks. They spend more time updating their resumes. I digress. I need another beer.

        • FoxDeploy July 30, 2016 / 12:48 am

          Nah man, preach it! This should be the theme of a whole post.

  5. David D. October 12, 2016 / 9:08 am

    I am attempting to follow your instructions to run a .ps1 file, but I’m getting an error as shown below. My question is can cscript.exe run a PowerShell script? Here is the command being run in evenvwr:

    Command executed: “C:\WINDOWS\system32\cscript.exe” /nologo “SQLGuideSearchQuery.ps1”
    Working Directory: C:\Program Files\Microsoft Monitoring Agent\Agent\Health Service State\Monitoring Host Temporary Files 1\4767\

    • FoxDeploy October 12, 2016 / 9:26 am

      CScript is for running VBScript files. You should call Powershell.exe -file c:\pathto\YourScript.ps1 instead

      • David D. October 12, 2016 / 2:35 pm

        I have search for anything to point me in the right direction to call PowerShell. Do you have any information on how to accomplish this? I opened up the Monitor I had created and I do not see a place to change from cscript to PowerShell.

        Thank you,
        David D.
