Russ Slaten is an MP authoring expert who created a custom URL monitoring solution. This is my favorite solution as it addresses one of the most common issues that I have often heard from customers: "Don't alert me if the website is down for only one second. Alert me if the website has been down for, say, five minutes." The solution is a little more difficult to set up initially, but once you have it working, adding and removing websites is as easy as updating an Excel spreadsheet. You can even have the spreadsheet sit on a common file server so that people who have no clue how SCOM works can set up URL monitors.
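That "alert only after five minutes" behavior can be thought of as requiring several consecutive failed samples before raising an alert. Here is a minimal conceptual sketch of that idea (this is my own illustration, not the management pack's actual code, and the function name and sample format are assumptions):

```python
def should_alert(samples, consecutive_failures_needed):
    """Alert only when the last N probes were all failures.

    samples: list of booleans, True = probe succeeded, newest sample last.
    (Hypothetical helper for illustration, not part of the MP.)
    """
    if len(samples) < consecutive_failures_needed:
        return False
    # Alert only if none of the most recent N samples succeeded.
    return not any(samples[-consecutive_failures_needed:])

# With 60-second probes, 5 consecutive failures ~= down for five minutes.
print(should_alert([True, False, True], 5))                         # False
print(should_alert([True, False, False, False, False, False], 5))   # True
```

A single blip produces one failed sample and no alert; a sustained outage accumulates enough consecutive failures to fire.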
I recommend installing this in your test lab first to get familiar with how it works.
So let's get started. First, I go to Russ's blog, read the documentation, and download the Management Pack.
**Note: Russ has updated the solution since this post to target Resource Pools in SCOM 2012.** Most of the configuration steps documented here still apply.
Download Updated MP Supporting Resource Pools: URLMonitoring.zip
I extract the files to see what's in the zip file.
There are two files in the zip package. The first one is the management pack. The second one is the CSV file for listing the websites to be monitored.
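To get a feel for how the CSV drives the solution, here is a small parsing sketch. The column names and layout below are my assumptions for illustration only; check the header row of the actual `URLMonitoringList.csv` in the zip for the real format (we do know from later in the walkthrough that one column holds the URL and the second column holds the watcher server):

```python
import csv
import io

# Hypothetical sample mirroring the assumed layout: URL, then watcher node.
sample = """URL,WatcherNode
http://intranet.contoso.com,RMS01.contoso.com
http://www.example.com,RMS01.contoso.com
"""

# Parse each row into a dict keyed by the header columns.
rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    print(row["URL"], "->", row["WatcherNode"])
```

Because it is just a CSV, anyone who can edit a spreadsheet can add or remove monitored sites without touching the SCOM console.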
Now let's take a look at the management pack in MP Viewer.
There are three monitors. One "monitors whether a website is up or down". The second one "checks for a string in the response from an http request". The third one "monitors the performance of a website". Seems easy enough.
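Conceptually, all three monitors boil down to one HTTP probe that yields three facts: did it respond, did the body contain the expected string, and how long did it take. Here is a self-contained sketch of that idea (again my own illustration, not the management pack's implementation), demoed against a throwaway local server:

```python
import http.server
import threading
import time
import urllib.request

def check_url(url, expected_string=None, timeout=10):
    """Return (is_up, string_found, response_seconds) for one probe.

    Hypothetical helper for illustration only.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            elapsed = time.monotonic() - start
            found = expected_string in body if expected_string else None
            return resp.status == 200, found, elapsed
    except Exception:
        return False, False, time.monotonic() - start

# Throwaway local web server so the sketch runs anywhere.
class _Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Hello from the test site")

    def log_message(self, *args):
        pass  # silence per-request console logging

server = http.server.HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
up, found, seconds = check_url(url, expected_string="Hello")
print(up, found)   # True True
server.shutdown()
```

The first return value maps to the up/down monitor, the second to the string-match monitor, and the elapsed time to the performance monitor.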
Let's get this puppy installed.
I open the Administration workspace in the console and go to Management Packs. I right-click and select Import Management Packs.
Then I go to Add, select Add from disk, and click No when asked to search for dependencies.
Now I browse to the folder that I extracted the files to and select the "Custom.Example.WebsiteWatcher.xml" file.
I click Install, and when it's finished, click Close.
I have the Management Pack installed. Now I need to configure it for my environment. To keep it simple, I am going to configure all my websites to be monitored from my RMS server. This way I can avoid messing with Run As accounts.
First, I am going to copy the CSV file to a simple location on the RMS. I created a folder on the RMS called c:\websitemonitoring and copied the "URLMonitoringList.csv" file there.
I open up the Operations Manager console, go to the Authoring workspace, and select Object Discoveries.
Now I need to change my scope.
I select Clear All at the bottom and then select View all targets. Now I type in Website Watcher, click Select All, and then OK.
As I saw before with the MP Viewer tool, there are two discoveries in this management pack.
I double-click on the Discover Websites discovery.
I go to the Configuration tab, where I can see the discovery contains the interval that controls how often websites are discovered and the path to the CSV file.
I click the Edit button and paste in the location of my URLMonitoringList.csv, which is "C:\websitemonitoring\URLMonitoringList.csv".
I am also going to change the interval to every 300 seconds for testing purposes. Once I have everything working, I will change it back to one day (86400 seconds).
I click OK and then OK again.
Now I need to configure which server I will be monitoring the websites from. In this case, I will just be using the RMS.
I remote desktop into the RMS and create a registry key (not a value) named "WebsiteWatcher" under HKLM\Software.
I open the registry by typing regedit and navigate to HKEY_LOCAL_MACHINE\Software.
I right click and select New, Key
Then type in “WebsiteWatcher”
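Equivalently, the same empty key can be created by importing a small .reg file with regedit instead of clicking through the steps above (just the key, no values, per the step above):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\WebsiteWatcher]
```

This is handy if you later move watcher duties to other servers and want a repeatable way to tag them.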
The discovery for this is set to once a day, so while on the RMS I also cycled the System Center Management service to speed up the discovery process. (I don't recommend this in production.)
Now I need to open the URLMonitoringList.csv and edit it with Excel. I set the second column to my RMS server and save it.
Now I go to the Monitoring workspace, open the Discovered Inventory view, select View all targets, and type in website.
I can see all the websites have been discovered and are now being monitored.