
Fun and games with the Management Data Warehouse (MDW and Data Collectors)

When you first come across it, the SQL Server Management Data Warehouse seems to promise a great deal, if the verbiage from Microsoft and some other websites is to be believed. But once you install it you may find it is not as useful as it could be. This is a shame, but we are currently only on v2 of the product with SQL Server 2012, so one hopes it will improve in subsequent versions.

However, it is probably worth playing with if you have never used it before - at least you can show your boss some reports on general server health when he asks for them and you have nothing else in place.

There is one big problem with it, though: if you decide that you don't want to use it any more, uninstalling it is not supported! Mad, I know. But, as usual, some very helpful people in the community have worked out what seems to me a pretty safe way of doing it.

I had a problem with my MDW. The data collector jobs were causing a lot of deadlocking on some production servers and impacting performance. It looks like there may be a workaround for it now, but due to time constraints I didn't have the opportunity to investigate further, so I disabled the associated SQL Agent jobs on the monitored servers. I thought I would revisit MDW in the future, as it looked unlikely that my department was going to buy a 3rd party application offering similar functionality.
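For anyone wanting to do the same, a sketch of the two options is below - stopping the collection set itself, or disabling its upload job directly. The collection set and job names shown are illustrative; check msdb on your own monitored servers for the actual names.

```sql
-- Run on each monitored server, in msdb.
-- Option 1: stop a collection set by name (the three system sets
-- are 'Disk Usage', 'Server Activity' and 'Query Statistics').
EXEC msdb.dbo.sp_syscollector_stop_collection_set
     @name = N'Server Activity';

-- Option 2: disable the associated SQL Agent upload job directly.
-- The job name below is illustrative and may vary by instance.
EXEC msdb.dbo.sp_update_job
     @job_name = N'collection_set_2_upload',
     @enabled  = 0;
```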

Some time later I noticed that the MDW database had grown very large, about 136 GB, and the server I had it running on was struggling for space. This was odd because I thought data wasn't being uploaded to it.
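A quick way to confirm the size (and how much of it is actually allocated) is sp_spaceused with no arguments:

```sql
-- Report the MDW database's total size and unallocated space.
USE [MDW];
EXEC sp_spaceused;
```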

There is a purge job installed by default called mdw_purge_data_[MDW], but although it was running each day it didn't seem to clear out enough data for me to shrink the file. Looking at the stored procedure run by this job, core.sp_purge_data, I saw that it takes some parameters, so I thought I would try executing it for each monitored instance. The parameters are:

@retention_days
@instance_name
@collection_set_uid
@duration

Making an educated guess that I could set @retention_days and @duration to 1, all I had to do was find the values for @instance_name and @collection_set_uid. To do so I ran this query against the MDW database:

--Show the databases that have been configured for MDW
SELECT DISTINCT [instance_name]
       ,[collection_set_uid]
FROM [MDW].[core].[snapshots]
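With those values in hand, a call per monitored instance might look like the sketch below. The instance name and collection set UID are placeholders - substitute the values returned by the query above (and note that, as I understand it, @duration caps how long the purge is allowed to run).

```sql
-- Illustrative only: purge data older than a day for one monitored
-- instance. Replace the instance name and UID with real values.
EXEC [MDW].[core].[sp_purge_data]
     @retention_days     = 1,
     @instance_name      = N'MYSERVER\PROD01',
     @collection_set_uid = '2DC02BD6-E230-4C05-8516-4E8C0EF21F95',
     @duration           = 1;
```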

I used the results as parameters for the core.sp_purge_data stored procedure, but even though it appeared to delete several thousand rows, not much space was released. Meanwhile, the database kept growing. There was only one thing for it - I had to "uninstall" MDW and set it up again. The MSSQLTips.com article did the job for me on this 2008 R2 server.

I have now reinstalled MDW and am only monitoring one server for the moment, while keeping an eye on the amount of data being collected. But to help matters I have enabled page compression on some tables I had previously identified as having grown very large. They are:

snapshots.active_sessions_and_requests
snapshots.io_virtual_file_stats
snapshots.notable_query_plan
snapshots.notable_query_text
snapshots.os_memory_clerks
snapshots.os_wait_stats
snapshots.performance_counter_values
snapshots.query_stats
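Enabling page compression on each of those tables is a one-liner per table; a sketch for one of them is below. Note that an ALTER TABLE ... REBUILD compresses only the heap or clustered index - nonclustered indexes must be rebuilt separately.

```sql
-- Rebuild the table (heap or clustered index) with page compression.
-- Repeat for each of the large snapshot tables listed above.
ALTER TABLE [snapshots].[os_wait_stats]
REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Nonclustered indexes need their own rebuild, e.g.:
ALTER INDEX ALL ON [snapshots].[os_wait_stats]
REBUILD WITH (DATA_COMPRESSION = PAGE);
```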

To check which tables and indexes now have compression enabled I used the following query:

SELECT SCHEMA_NAME(o.schema_id) AS [SchemaName]
       ,OBJECT_NAME(p.object_id) AS [ObjectName]
       ,p.[rows]
       ,p.[data_compression_desc]
       ,p.[index_id] AS [IndexID_on_Table]
FROM sys.partitions AS p
INNER JOIN sys.objects AS o
       ON p.object_id = o.object_id
WHERE p.data_compression > 0
       AND SCHEMA_NAME(o.schema_id) <> 'sys'
ORDER BY SchemaName
       ,ObjectName
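To identify which tables have grown very large in the first place, a sketch along these lines (using sys.dm_db_partition_stats, run in the MDW database) does the job; the row count is restricted to the heap or clustered index so rows aren't double-counted across nonclustered indexes.

```sql
-- Top 10 user tables in the current database by reserved space.
SELECT TOP (10)
       SCHEMA_NAME(o.schema_id) AS [SchemaName]
       ,o.name AS [TableName]
       ,SUM(CASE WHEN ps.index_id IN (0, 1)
                 THEN ps.row_count ELSE 0 END) AS [Rows]
       ,SUM(ps.reserved_page_count) * 8 / 1024 AS [ReservedMB]
FROM sys.dm_db_partition_stats AS ps
INNER JOIN sys.objects AS o
       ON o.object_id = ps.object_id
WHERE o.is_ms_shipped = 0
GROUP BY o.schema_id, o.name
ORDER BY [ReservedMB] DESC;
```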

We are now rolling out SCOM so it will be interesting to see what sort of statistics I will be able to generate from it. The reports won't be as useful as those from Confio Ignite, for instance, but it will have to do for now.

Comments

  1. Hey Paulie,

    Excellent article and, hopefully, better things to come for MDW in 2014+

