An easier way to work with the SQL Server Error Log

There are three main ways to access the SQL Server error log: the SSMS GUI, the system stored procedure sp_readerrorlog, and the extended stored procedure xp_readerrorlog. I keep the code below close to hand in a snippet and use it almost every day. It copies the contents of the SQL Server error log, and optionally the SQL Agent error log, into a temporary table; from there you can query it for specific information. The same filtering can be done with parameters 3 and 4 of sp_readerrorlog, but for me this approach is a little more flexible.

/*
sp_readerrorlog takes 4 optional parameters:
    1. The number of the error log file to read: 0 = current, 1 = Archive #1, 2 = Archive #2, etc.
    2. Log file type: 1 or NULL = SQL Server error log, 2 = SQL Agent log
    3. Search string 1: the first string to search for
    4. Search string 2: a second string to further refine the results
*/
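
For comparison, here is the equivalent direct call using parameters 3 and 4; a row is returned only when it contains both search strings (the strings below are just an illustration):

```sql
-- Read the current SQL Server error log, returning only rows
-- that contain both "Login" and "failed"
EXEC sp_readerrorlog 0, 1, N'Login', N'failed'
```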
-----------------
-- Current SQL Error log
-----------------
IF EXISTS (SELECT 1 FROM tempdb..sysobjects WHERE id=OBJECT_ID('tempdb..#ErrorLog'))
    DROP TABLE #ErrorLog

CREATE TABLE #ErrorLog (
       logdate DATETIME
       ,processinfo VARCHAR(200)
       ,info VARCHAR(8000)
       )
GO

INSERT INTO #ErrorLog
EXEC sp_readerrorlog 0
       ,1
GO

CREATE CLUSTERED INDEX ix_date ON #ErrorLog (logdate)
GO
   
-- Show up to the last 10,000 log entries from the last 24 hours
SELECT TOP 10000
    logdate
    ,processinfo
    ,info
FROM #ErrorLog
WHERE logdate > DATEADD(HOUR, -24, GETDATE())
ORDER BY logdate DESC
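
Once the log is in a temporary table, any WHERE clause works against it. As one example (the search string 'Login failed' is only an illustration; substitute whatever you are hunting for):

```sql
-- Example: failed logins in the last 24 hours
SELECT logdate, processinfo, info
FROM #ErrorLog
WHERE info LIKE '%Login failed%'
  AND logdate > DATEADD(HOUR, -24, GETDATE())
ORDER BY logdate DESC
```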

-----------------
-- Current SQL Agent log
-----------------
IF EXISTS (SELECT 1 FROM tempdb..sysobjects WHERE id=OBJECT_ID('tempdb..#ErrorLogAgent'))
    DROP TABLE #ErrorLogAgent

CREATE TABLE #ErrorLogAgent (
       logdate DATETIME
       ,processinfo VARCHAR(200)
       ,info VARCHAR(8000)
       )
GO

INSERT INTO #ErrorLogAgent
EXEC sp_readerrorlog 0
       ,2
GO

CREATE CLUSTERED INDEX ix_date ON #ErrorLogAgent (logdate)
GO

-- Show up to the last 10,000 log entries from the last 24 hours
SELECT TOP 10000
    logdate
    ,processinfo
    ,info
FROM #ErrorLogAgent
WHERE logdate > DATEADD(HOUR, -24, GETDATE())
ORDER BY logdate DESC
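
Parameter 1 also lets you reach further back than the current file. A sketch, assuming at least one archived error log exists on the instance:

```sql
-- Append Archive #1 of the SQL Server error log to the same table,
-- widening the search window beyond the current log file
INSERT INTO #ErrorLog
EXEC sp_readerrorlog 1, 1
```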


