Monday, November 26, 2012

SCOM 2012 - Database Grooming

So I get asked fairly regularly what database grooming settings should be, and how long to keep Operational and Data Warehouse data. Configuring the Ops database is fairly simple and can be done from the management console: in the Administration space, under Settings, double-click Database Grooming. The Global Management Group Settings window will appear, showing the default settings provided during the SCOM installation.

When you do your initial research with the client and determine what settings they want for data retention, I would urge them to maintain 7 days of operational retention, with the Performance signature set to no more than 2 days. Data loses none of its granularity when it is transferred to the data warehouse, so there is no reason not to purge the Ops database frequently. This will help the overall health of SCOM as well as improve performance.
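If you want to double-check what those console settings look like under the hood, the Operational database keeps them in a grooming table. This is a hedged sketch from memory of the SCOM 2012 OperationsManager database — verify the table and column names (PartitionAndGroomingSettings, ObjectName, DaysToKeep) against your own environment before relying on it:

```sql
-- Run against the OperationsManager (Ops) database, not the data warehouse.
-- Lists each groomed object and how many days of data it retains.
SELECT ObjectName, DaysToKeep, GroomingSproc
FROM PartitionAndGroomingSettings
ORDER BY ObjectName
```

If an object still shows a long retention here after you change the console settings, re-check the Database Grooming dialog.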

As far as the data warehouse goes, you are limited only by the size of the database in how long you can maintain retention. You can check your current data warehouse settings by running the following SQL query on your DW instance:

SELECT AggregationIntervalDurationMinutes,
       BuildAggregationStoredProcedureName,
       GroomStoredProcedureName,
       MaxDataAgeDays,
       GroomingIntervalMinutes
FROM StandardDatasetAggregation
In the output, the column to focus on is MaxDataAgeDays: several of the datasets default to 400 days, or roughly 13 months. Originally, if you wanted to adjust these settings you had to edit the SQL tables directly. Microsoft has since released a command-line tool, the Data Warehouse Data Retention Policy tool (dwdatarp.exe), which is available for download from Microsoft.
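Before pointing the tool at a production warehouse, it helps to see its two modes of use. This is a hedged sketch of typical invocations — the server, instance, and database names are placeholders, and the switches shown (-s, -d, -ds, -a, -m) should be confirmed against the tool's own help output:

```bat
REM List every dataset, its aggregations, current retention, and space used.
REM Running with only the server and database switches is read-only:
dwdatarp.exe -s SQLSERVER\INSTANCE -d OperationsManagerDW

REM Change retention for one dataset/aggregation pair, e.g. trim hourly
REM performance data from the 400-day default down to 90 days:
dwdatarp.exe -s SQLSERVER\INSTANCE -d OperationsManagerDW -ds "Performance data set" -a "Hourly aggregations" -m 90
```

Running the read-only form first gives you the current-size numbers you will want in front of the client during the retention discussion.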

I would spend some time getting familiar with this tool and with the different datasets that exist in SCOM. You will want to take this information and discuss with your client what data is of value to them. You may get the answer "all of it is important." If that is the case, be clear with them about what each dataset is, and that capturing all of it for X days can dramatically change the amount of space required. Kevin Holman has a great blog post on everything this tool can do.

This is a good example of how a little planning up front with the client will save them a lot of headaches down the road, long after you are gone.

More to come!
