tag:blogger.com,1999:blog-254072762024-02-08T11:27:05.714+01:00Killspid's blog...my own personal experience with SQL Server...killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.comBlogger29125tag:blogger.com,1999:blog-25407276.post-77610973552553699352009-04-15T20:05:00.003+02:002009-04-15T20:14:57.735+02:00"Troubleshooting Performance Problems in SQL Server 2008"After the excellent white papers that were written for <a href="http://killspid.blogspot.com/2006/05/performance-tuning-guide-for-sql.html">the previous versions</a> and with all the improvements that have been made in SQL Server 2008 in this area, I'm looking forward to reading the new edition in the Troubleshooting Performance Problems series. You can find a link to download it <a href="http://msdn.microsoft.com/en-us/library/dd672789.aspx">here</a>.killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-64881936749801202302009-03-31T23:26:00.004+02:002009-03-31T23:45:39.318+02:00SSRS configuring a rendering extensionReporting Services has a default rendering extension to create CSV files with a comma as the default field delimiter. But I needed the possibility to create a CSV with semicolon-delimited fields. It's easy to change this behaviour - and many others - when calling the Render method or by adding some settings to the URL.
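For on-demand rendering, for instance, the delimiter can be overridden straight in a report URL by passing a device information setting. A sketch (server name and report path are just placeholders; %3B is the URL-encoded semicolon):

```
http://myserver/ReportServer?/SomeFolder/SomeReport&rs:Format=CSV&rc:FieldDelimiter=%3B
```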
But I wanted that file delivered through a subscription and that makes it a lot more difficult.<br /><br />Just add the following snippet to the Render section of the RSReportServer.config file and you'll have a new option to render your subscriptions.<br /><br /><pre style="font-family: Andale Mono, Lucida Console, Monaco, fixed, monospace; color: #000000; background-color: #eee;font-size: 12px;border: 1px dashed #999999;line-height: 14px;padding: 5px; overflow: auto; width: 100%"><code><Extension Name="CSVSemicolon" Type="Microsoft.ReportingServices.Rendering.CsvRenderer.CsvReport,Microsoft.ReportingServices.CsvRendering"><br /> <OverrideNames><br /> <Name Language="en-US">CSV (semicolon delimited)</Name><br /> </OverrideNames><br /> <Configuration><br /> <DeviceInfo><br /> <FieldDelimiter>;</FieldDelimiter><br /> </DeviceInfo><br /> </Configuration><br /></Extension><br /><br /></code></pre>killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-7799094793997606782008-05-22T20:19:00.004+02:002008-05-22T20:43:50.435+02:00Arrivederci W.<a href="http://dis4ea.blogspot.com/">Dis4ea</a> is dead, long live <a href="http://blogs.msdn.com/wesleyb/">WesleyB</a>!<br /><br />Last week my favorite colleague - you know the one :) - left his company to launch his career in a whole new, interesting direction.<br />His old blog tells the story of his four years at our company as a database architect. It won't get updated anymore, but many posts are still as relevant now as at the time they were posted - it's his long-term memory after all. And there is the excellent blogroll of course.<br />The content of his new blog will surely treat topics differently (the host's always watching), but I hope he will continue to produce quality.<br /><br />Enough said... 
success!killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.com2tag:blogger.com,1999:blog-25407276.post-30950936685773530112008-02-20T20:43:00.009+01:002008-02-25T12:26:38.950+01:00Dive into msdb: BackupSet & RestoreHistorySimple question today: can we find out how recent the backup is that has been used to restore our production copy?<br /><br /><blockquote>select top 1 b.backup_finish_date<br />from msdb..backupset b<br />join msdb..restorehistory r on b.backup_set_id = r.backup_set_id<br />where r.destination_database_name = 'MyDb'<br />order by r.restore_date desc</blockquote>killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-85627782859700204562008-01-08T22:16:00.000+01:002008-01-08T22:55:50.127+01:00Transaction log files after reverting from snapshotWatch out with the transaction log file(s) when you revert a database from a snapshot! A snapshot database has no transaction log itself, and when you revert the original database to the snapshot, it is documented that SQL Server rebuilds the log. It does not, however, reuse the characteristics of the original transaction log file(s) as one could expect. The actual and maximum size, the autogrowth property and even the number of files are not necessarily the same as before reverting.
This could easily get you into trouble.<br /><br />Let's illustrate this with an example by creating a database SnapshotTest with two transaction log files - which will probably not be the same number as for the model database. A snapshot of SnapshotTest with the name SnapshotTest_ss gets created.<br />Now for the interesting stuff: we revert the database from the snapshot and we investigate the transaction log files as we did earlier. Only the first transaction log file will be present for the SnapshotTest database, but not with the same characteristics as before, and not even those of model's transaction log. The second file is no longer part of the database. Is this desired behavior?<br /><br /><blockquote>USE [master]<br />GO<br />CREATE DATABASE SnapshotTest<br /> ON (NAME = SnapshotTest, FILENAME = 'C:\TEMP\SnapshotTest.MDF')<br /> LOG ON (NAME = SnapshotTestLog1, FILENAME = 'C:\TEMP\SnapshotTestLog1.LDF')<br /> , (NAME = SnapshotTestLog2, FILENAME = 'C:\TEMP\SnapshotTestLog2.LDF')<br />GO<br />EXEC sp_helpdb SnapshotTest<br />GO<br />CREATE DATABASE SnapshotTest_ss ON (NAME = SnapshotTest, FILENAME = 'C:\TEMP\SnapshotTest_ss.MDF')<br /> AS SNAPSHOT OF SnapshotTest<br />GO<br /><br />RESTORE DATABASE SnapshotTest <br /> FROM DATABASE_SNAPSHOT = 'SnapshotTest_ss'<br />GO<br />EXEC sp_helpdb SnapshotTest<br />GO<br />USE [master]<br />GO<br />DROP DATABASE SnapshotTest_ss<br />GO<br />DROP DATABASE SnapshotTest</blockquote><br />Don't forget to clean up the database file that is left behind on the file system - it's no longer part of a database, so it's not removed when you drop the database.killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-60204799639067505282007-12-05T23:32:00.000+01:002007-12-05T23:54:21.001+01:00Focusing on SSISSince our migration to SQL Server 2005 a couple of months ago, Integration Services has been a constant point of attention for our database 
development team. That's mainly because of the necessary and upcoming migration of the existing DTS packages, but also because of the constant trouble we experience with the technology.<br /><br />Yep, we like it just as much as we hate it at times. Some random stuff from the past weeks...<br /><br />* SSIS is in a lot of ways much stricter than DTS, certainly when it comes to data transfer tasks. Encoding of files didn't bother us in the past, but now we have to be very aware of how we handle and manipulate our imports. A data conversion task gave me some hassle when it discovered a special character in an import, which caused an error about truncation of data for that field. Although the content didn't exceed the length of the field, it appeared that the task couldn't correctly determine the length, so I had to ignore the so-called truncation in the error handling options.<br /><br />* On our main production cluster we had to deploy a package. We store our packages in SQL Server (msdb) at this time, but we just did not succeed in the import on the server (with all the SPs and CUs installed). It produced the following error:<br /><br /><blockquote>Element not found. (Exception from HRESULT: 0x8002802B<br />(TYPE_E_ELEMENTNOTFOUND)) (Microsoft.SqlServer.DTSRuntimeWrap) </blockquote><br />In the end, we had to import the package using the client (SSMS) from another server.killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-88628179553281949032007-11-27T01:00:00.000+01:002007-12-06T19:03:41.255+01:00Database snapshot filesBe aware of the fact that a database snapshot contains all the data of the referenced database at a certain moment.
Not only the actual data, but also the metadata - the bookkeeping of SQL Server - is present in the snapshot.<br /><br />This can give some unexpected results when you query views like sys.database_files, for example. Instead of getting the set of files the snapshot consists of, you get the info that was originally in the referenced database: the files of that database, not the snapshot. That information and much more is available through master.sys.master_files.killspidhttp://www.blogger.com/profile/12595181397763193101noreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-6594978052579979332007-10-16T00:21:00.000+02:002007-10-16T00:48:25.028+02:00SSIS, 64-bit and a breakpointDespite two service packs and the numerous hotfixes we installed since the release of SQL Server 2005, we still discover problems with SSIS much too often. This time, it was some weird message in our acceptance environment telling me that the "Script files failed to load". I had thoroughly tested the package in our development environment, so I was amazed to hear it had failed in another environment.<br /><br /> A quick search on the specialized forums returned some hits - mostly related to the Script task the package contained - but nothing really described the conditions I faced. The only difference I could think of was the fact that the acceptance environment was a 64-bit edition, unlike development. In such a case, the settings of the Script task are quite simple: the PrecompileScriptIntoBinaryCode flag should be on.<br /><br />There was, however, an inactive breakpoint that caught my attention. I hadn't noticed it before because the Script task was not part of my changes to the package and the tests had never halted at that breakpoint.
After removing the breakpoint, the package effectively ran without a problem.<br /><br />It's a shame we had to discover this bug the hard way, but it proves that an acceptance environment should stick as close to production as possible. In this case, it paid off.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-80957336669928126422007-10-15T19:08:00.000+02:002007-10-15T19:12:06.877+02:00SSIS's interaction with DTCI encountered some locking problems while developing an SSIS package today. The purpose of the package is to upload new content from a flat file to a table. It basically contains two control flow tasks - one clearing the table (TRUNCATE TABLE) and another one filling it with a data flow task - that require a transaction. But, while executing, the package blocked at the start of the data flow task.<br /><br />Investigating the blocking, I encountered some SPID -2 owned locks in sys.dm_tran_locks, thus blocking the data flow task from retrieving the necessary metadata from the database at the PreExecute event. I fiddled around with the TransactionOption and IsolationLevel settings of the different components, but nothing resolved my problem adequately. With the help of Profiler, however, I could determine that SSIS did not propagate the DTC transaction - started for the SQL task - to the Data Flow task.<br /><br />The blocking can be resolved by either changing the ValidateExternalMetadata attribute of the OLE DB Destination or changing the TRUNCATE TABLE of the SQL task into a DELETE statement.<br /><br />The reason why both of these solutions have such an impact remains unclear to me.
It's true that each of these modifications affects the locks that are required by the package, but that doesn't explain the fact that the Data Flow task was not enlisted appropriately into MSDTC.Unknownnoreply@blogger.com2tag:blogger.com,1999:blog-25407276.post-56873602121425378142007-09-13T21:38:00.000+02:002007-09-13T21:48:11.314+02:00sys.indexes and table valued functionsToday, while lending a hand in the creation of a new operational reindexing job, we discovered the existence of primary keys on table-valued functions. I had written a little script to gather all the clustered and non-clustered indexes with their necessary attributes from sys.indexes. I assumed that indexes are only present on physical tables, so I didn't make any checks on the type of object they were associated with. In our first test run, that turned out to be a mistake.<br /><br />In the definition of the return value of a table-valued function, you can define the primary key for that table, and although it isn't an actual table - just a table variable created at run-time - metadata about the index is kept just as for any user table.<br /><br /><blockquote>USE tempdb<br />GO<br />CREATE FUNCTION dbo.udf_TableValuedFunction ()<br />RETURNS @Table TABLE ( id int PRIMARY KEY , comment varchar(200) )<br />AS<br />BEGIN<br /> INSERT @Table VALUES ( 1, 'comment 1' )<br /> INSERT @Table VALUES ( 2, 'comment 2' )<br /> RETURN<br />END<br />GO<br />SELECT *<br />FROM sys.objects o<br />JOIN sys.indexes i ON i.object_id = o.object_id<br />WHERE o.name = 'udf_TableValuedFunction'<br />GO<br />DROP FUNCTION dbo.udf_TableValuedFunction<br />GO</blockquote>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-80705982804858143372007-05-24T23:19:00.000+02:002007-05-24T23:25:23.367+02:00Unexpected dumpHad some fun with SQL Server 2000 (SP4 + hotfixes) today. While tuning a certain stored procedure, I made a stupid variation with an obvious error (a wrong default value for a procedure parameter).
It caused no problem on creation; it just dumps the connection when executing (and thus compiling) the procedure in a very nasty way. Don’t try this on production systems ;-). Luckily, this problem gets intercepted at runtime in SQL Server 2005.<br /><br />USE TempDB<br />GO<br />CREATE PROC proc_error<br /> @Date DATETIME = blabla --no good!<br />AS<br /> SELECT @Date + 1<br />GO<br />EXEC proc_error<br />GO<br />DROP PROC proc_error<br />GO<br /><br /><span style="color: rgb(153, 0, 0);">Msg 0, Level 19, State 0, Procedure proc_error, Line 4</span><br /><span style="color: rgb(153, 0, 0);">SqlDumpExceptionHandler: Process 65 generated fatal exception c0000005 EXCEPTION_ACCESS_VIOLATION. SQL Server is terminating this process.</span>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-80657274763118888552007-05-13T18:23:00.000+02:002007-05-23T13:29:50.595+02:00Plan GuidesIn our production environment, we had a poorly performing stored procedure in terms of reads and CPU usage. A quick peek using the available DMVs gave us insight into the plan and the parameters that were used to obtain that plan. That way, it was easy to simulate the behavior in a development environment.<br /><br />Apparently, every morning, with a clean procedure cache, the procedure got called with certain parameters which resulted in a not very intuitive, but good plan with a cost of 3 reads. Subsequent calls with different parameters reused the cached plan, but that resulted in > 8000 logical reads and quite some CPU usage.
A more intuitive plan resulting from typical parameters had a constant cost of about 8 logical reads for both types of parameters, which can give a really big performance boost, given the fact that the procedure gets called more than 40,000 times a day and delays a vital business process.<br /><br />Because I'm not allowed to modify anything in the stored procedure (yep, it's CRM), I had to use a plan guide to 'guide' the engine to the better performing plan. There are several options: specifying the whole plan, adding the parameters for the compilation, forcing a recompile... I chose the parameter hint.<br /><br />--solution, add plan guide<br />exec sp_create_plan_guide<br />@name = N'Plan for procedure p_GrantInheritedAccess UPDATE'<br />, @stmt = N'UPDATE poa1<br />SET InheritedAccessRightsMask = poa1.InheritedAccessRightsMask | poa2.AccessRightsMask | poa2.InheritedAccessRightsMask | 0x08000000<br />FROM PrincipalObjectAccess poa1 JOIN PrincipalObjectAccess poa2 ON (poa1.PrincipalId = poa2.PrincipalId)<br />WHERE poa1.ObjectId = @ReferencingObjectId<br />AND poa1.ObjectTypeCode = @ReferencingObjectTypeCode<br />AND poa2.ObjectId = @ReferencedObjectId<br />AND poa2.ObjectTypeCode = @ReferencedObjectTypeCode'<br />, @type = N'OBJECT'<br />, @module_or_batch = N'p_GrantInheritedAccess'<br />, @params = NULL<br />, @hints = N'OPTION(OPTIMIZE FOR(@ReferencingObjectId=''D4F70FD5-41FA-DB11-9668-0011258C53FD'',@ReferencingObjectTypeCode=4200,@ReferencedObjectId=''2052793D-41FA-DB11-9668-0011258C53FD'',@ReferencedObjectTypeCode=4200))'<br />SELECT * FROM sys.plan_guides<br /><br />The result of this simple operation is a clean and transparent solution.
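A plan guide can also be switched off or removed again without ever touching the procedure; sp_control_plan_guide takes the name that was used at creation time:

```sql
-- Disable the plan guide temporarily
EXEC sp_control_plan_guide @operation = N'DISABLE', @name = N'Plan for procedure p_GrantInheritedAccess UPDATE'
-- Or drop it for good
EXEC sp_control_plan_guide @operation = N'DROP', @name = N'Plan for procedure p_GrantInheritedAccess UPDATE'
```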
I haven't put it into production though, because it remains a last resort, but it's always good to know your options.Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-25407276.post-32012126147655137542007-05-09T12:40:00.000+02:002007-05-10T19:08:04.567+02:00MS CRM, from a database perspectiveSome time ago, I was appointed as the database dude for our customized Microsoft CRM 3.0 implementation. The project had already been put into production, but there was a major concern about the performance of the application, mainly on the database side. And that's when I came into the picture.<br /><br />We started the cleanup by resolving some major issues in the custom code. That code was written by rather inexperienced developers with little supervision of their creations. That resulted in some very interesting behavior in our production environment from time to time - at least if you're in for a challenge. The problems can be roughly categorized into two groups: locking issues and suboptimal plan usage, mostly due to wrong datatype usage.<br /><br />After the resolution of the most urgent issues in the customization, I got the chance to take a look at the actual MSCRM database. And it isn't always a pretty sight. Sure, the system of a controlling Metabase and a generic customizable CRM database is pretty neat, but the statements that I see passing in the profiler traces are often quite unmanageable. In exchange, the application itself offers very flexible functionality, but it doesn't always seem to scale that well to the data volumes we are handling in our company.<br /><br />Well, working with MS CRM 3.0 is proving to be an enriching experience. You actually shouldn't have to look at the database and you should definitely not tamper with it, but SQL Server 2005 offers very interesting features to pinpoint the culprits and to help the engine a little in the right direction.
More to come, hopefully in somewhat more technical posts ;-).Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-1169762372532156992007-01-25T22:57:00.000+01:002007-01-25T23:20:58.653+01:00SSIS BypassPrepare<p class="MsoNormal">Yesterday, we stumbled upon an annoying bug in the RTM version of SQL Server 2005. A new version of a previous project of mine was being released with a modified SSIS package. One of the modifications is the use of a mapped input parameter in an Execute SQL Task. It requires the BypassPrepare flag of that task to be changed to True to make this construct work. It was successfully tested in development and acceptance, and then, in production, we suddenly got the following error.<br /></p><p class="MsoNormal"><blockquote>Executing the query "…" failed with the following error: "Syntax error, permission violation, or other nonspecific error". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.</blockquote></p> <p class="MsoNormal">This error lists almost everything that can possibly go wrong with an Execute SQL Task without giving any indication of what actually went wrong. Sigh.</p><p class="MsoNormal">At first I thought that the problem was caused by the fact that our production environment didn't have SP1 and the necessary hotfixes like it should have. That's an essential factor indeed, but there was more to the problem. The trouble with the RTM version is that it doesn’t take this BypassPrepare into account. Whether you set this flag to true or false in the development environment, if you look at the actual XML that represents the package, you won't find a trace of this setting.
This was fixed in SP1 (as correctly answered on the <a href="http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=376422&SiteID=1">forums</a>).</p><p class="MsoNormal">What I didn't know was that our operational DBA had exported and imported the package with his local client (SSMS), on which he didn't have SP1 either. Similar to the behavior in Visual Studio, the setting gets lost with SSMS in the RTM version. After a quick install and verification, everything worked smoothly.</p><p class="MsoNormal">For the upcoming SP2 it is <a href="http://download.microsoft.com/download/2/b/5/2b5e5d37-9b17-423d-bc8f-b11ecd4195b4/WhatsNewSQL2005SP2.htm">announced</a> that the default of this setting will be changed to True, meaning that SSIS won't prepare the statement unless you explicitly tell it to do so. It could have saved us some trouble. The valuable lesson learned is to keep all environments, and also the clients, up to date.<br /></p>Unknownnoreply@blogger.com2tag:blogger.com,1999:blog-25407276.post-1166728186119546082006-12-21T19:49:00.000+01:002006-12-21T20:09:46.136+01:00Difference between @@IDENTITY and SCOPE_IDENTITY()A colleague asked me today if he could use @@IDENTITY in the context of an insert stored procedure to return the generated value for an identity column. I recalled there was something about this function, so I consulted Books Online.<br />It states:<br /><blockquote><p xmlns="">@@IDENTITY, SCOPE_IDENTITY, and IDENT_CURRENT are similar functions because they all return the last value inserted into the IDENTITY column of a table.</p> <p xmlns="">@@IDENTITY and SCOPE_IDENTITY return the last identity value generated in any table in the current session.
However, SCOPE_IDENTITY returns the value only within the current scope; @@IDENTITY is not limited to a specific scope.</p></blockquote>It took me a minute to find a good example to illustrate the difference between the two, but with a trigger I created the following example:<br /><br />USE TempDB<br />GO<br />CREATE TABLE tst<br />( a int identity(1,1), s varchar(10))<br />GO<br />CREATE TABLE tst2<br />( a int identity(1000,1), s varchar(10))<br />GO<br />CREATE TRIGGER dbo.trgTst<br /> ON tst<br /> AFTER INSERT<br />AS INSERT tst2 SELECT inserted.s FROM inserted<br />GO<br />INSERT tst VALUES('a')<br />SELECT<br />@@IDENTITY AS [@@IDENTITY],<br />SCOPE_IDENTITY() AS [SCOPE_IDENTITY()]<br />GO<br />DROP TABLE tst2<br />DROP TABLE tst<br /><br />SCOPE_IDENTITY() will give you the value of tst.$IDENTITY, ignoring the identity value that is generated for table tst2 by the trigger after the insert into tst. @@IDENTITY will give you that value from tst2.$IDENTITY.<br /><br />So the function we actually need in our case is not @@IDENTITY but SCOPE_IDENTITY().Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-25407276.post-1164218734123153672006-11-22T18:41:00.000+01:002006-11-22T19:05:34.140+01:00$ROWGUIDWe are in the middle of optimizing a big staging database that contains literally billions of uniqueidentifier-integer pairs over several tables. The database is used in a process to convert the nice little GUIDs of our production environment into keys that our enterprise's Oracle reporting environment can more easily deal with. Don't ask why we do it - I guess we lost that strategic battle :(<br /><br />Anyway, when I was looking at some of the generic legacy code to identify the ROWGUID of a table, I remembered something about $ROWGUID. This automatically returns the ROWGUIDCOL of that table.
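On instances without $ROWGUID, the same information can be dug up through metadata. A minimal sketch of such a lookup on SQL Server 2000 (the table name is just an example):

```sql
-- Find the name of the ROWGUIDCOL column of a table via COLUMNPROPERTY
SELECT c.name
FROM dbo.syscolumns c
WHERE c.id = OBJECT_ID('SomeTable')
AND COLUMNPROPERTY(c.id, c.name, 'IsRowGuidCol') = 1
```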
Sadly, it's new in SQL Server 2005, and because we haven't had the chance to migrate that database yet, we still need to query COLUMNPROPERTY to get the name of the column, which can then be used in dynamic SQL to mimic the behavior of $ROWGUID:<br /><br />USE TempDB<br />GO<br />CREATE TABLE SomeTable<br />(<br />ID uniqueidentifier ROWGUIDCOL<br />)<br />GO<br />SELECT $ROWGUID FROM SomeTable<br />GO<br />DROP TABLE SomeTableUnknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-1161293281146499162006-10-19T23:20:00.000+02:002006-10-19T23:28:01.156+02:00SQL Server 2005 System Views MapI always seem to forget the link to the great <a href="http://www.microsoft.com/downloads/details.aspx?FamilyID=2ec9e842-40be-4321-9b56-92fd3860fb32&DisplayLang=en">SQL Server 2005 System Views Map</a>, so I might as well dedicate a little post to it. The map gives an overview that seems quite chaotic at first, but when you zoom in you can find valuable information on how to combine several related DMVs.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-1158094925861904932006-09-12T22:55:00.000+02:002006-10-02T20:52:59.096+02:00Be creative with BCPMy favorite colleague - you know the one :) - left me with a simple task before he left on holiday: I had to finish an export for him that we had decided to rewrite. He had to export several quite large recordsets to multiple XML files. We agreed to use FOR XML EXPLICIT instead of a custom-made SSIS component to be able to deploy it on our current production environment, still being SQL Server 2000.<br /><br />When using FOR XML EXPLICIT, you don't get that many options to export the result to a file because of the special nature of that result. But with BCP, everything is possible - at least I thought so. Because of the complexity of the export, some data had to be staged temporarily in the stored procedure that retrieved the data and formatted the XML.
And that's where the trouble started.<br />With the code below you can simulate the core of the problem - which isn't FOR XML, by the way.<br /><br />USE TempDB<br />GO<br /><br />CREATE PROC usp_bcp_test<br />AS<br />SET NOCOUNT ON<br />IF 1=0<br />BEGIN<br />DECLARE @Tbl TABLE (Id INT)<br />INSERT @Tbl VALUES (1)<br />END<br />SELECT 1 as ColA, 'test' as ColB<br />GO<br /><br />DECLARE @Stmt VARCHAR(1024)<br />SET @Stmt = 'BCP "EXEC TempDB.dbo.usp_bcp_test" QUERYOUT C:\TEMP\tst.txt -c -S ' + @@SERVERNAME + ' -T'<br />EXEC master.dbo.xp_cmdshell @Stmt<br />GO<br /><br />DROP PROC usp_bcp_test<br />GO<br /><br />I try to use the output of the stored procedure as the 'query' for BCP, but this fails because of the INSERT statement in the stored procedure, even though it's unreachable code. BCP gives the following feedback:<br /><br />SQLState = HY000, NativeError = 0<br />Error = [Microsoft][SQL Native Client]BCP host-files must contain at least one column<br />NULL<br /><br />Using Profiler, I discovered that BCP issued the following statement to determine the result set it can expect.<br /><br />SET FMTONLY ON EXEC TempDB.dbo.usp_bcp_test SET FMTONLY OFF<br /><br />In the messages pane of SSMS you can see the following:<br /><br />(0 row(s) affected)<br /><br />(0 row(s) affected)<br /><br />That explains why BCP complains about the stored proc: it expects the INSERT statement to produce a column list for its result set, which it of course doesn't.
Moving the SELECT clause to the beginning of the stored proc makes this construct work, because BCP only looks at the first result set.<br /><br />The solution to this problem is straightforward: split the retrieval and the formatting over multiple stored procs, so that the stored proc used for the BCP only contains a simple SELECT statement.<br /><br />Nice little detail: I was only able to simulate this behavior in SQL Server 2005, not with SQL Server 2000.Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-25407276.post-1156846120131638972006-08-29T12:02:00.000+02:002006-09-07T20:36:02.730+02:00Plan cache gets cleared"Blog this!", my colleague said to me, so I am. Just a simple note this time to mention that when you change the max degree of parallelism on an instance, the plan cache gets cleared. Kind of logical when you think about it, because you are changing the rules for the stored plans, but nevertheless he thought it was useful info.<br /><br />EXEC sp_configure 'max degree of parallelism', 0<br />RECONFIGUREUnknownnoreply@blogger.com2tag:blogger.com,1999:blog-25407276.post-1155032183169660662006-08-08T12:05:00.000+02:002006-08-08T12:20:10.626+02:00Behavior of Logins/secWe are in the middle of migrating our production environment from SQL Server 2000 to SQL Server 2005, but during the stress tests this week I noticed something scary. The performance counter "SQL Server:General Statistics - Logins/sec" indicated a factor of 1000 more for SQL 2005. I suspected a severe problem with our connection pooling, but in spite of that very high number of logins, there were very few logouts/sec and the number of user connections stayed stable. Also, Profiler didn't capture those logins (actual and failed ones), and the familiar sp_reset_connection showed up whenever a connection was reused.<br /><br />Nothing wrong with the connection pooling, luckily, but it seems that the behavior of the performance counter has changed between versions.
It seems to indicate not only the actual logins, but also the reuse of connections because of connection pooling. A colleague put the matter on a forum for confirmation. Tbc?Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-1152214902663633722006-07-06T20:55:00.000+02:002006-07-06T22:00:44.170+02:00Using fn_dblogI like the fn_dblog function even if it is undocumented. It enables you to read from your transaction log, which contains very valuable information about stuff that is happening in your database. It is for example the only way I know of at this time to determine which object experienced a page split if you search for the LOP_DELETE_SPLIT operation.<br /><br />SELECT *<br />FROM ::fn_dblog(NULL, NULL)<br />WHERE operation = 'LOP_DELETE_SPLIT'<br /><br />This is the usage of that system function you see everywhere: just passing NULLs for the @start and @end parameters, which could limit the LSNs (Log Sequence Numbers) to be searched. It took me some time to figure out how to use them, because just passing the output of the function as a parameter simply doesn't work. You can't use an LSN like for example 00000073:000020fa:0001 as a value for the parameters; you have to translate the hexadecimal values into their numeric equivalents (115:8442:1).<br /><br />Let's show this with some code; because there's no easy way to make this conversion, I came up with something quick & dirty.
At last, an example of the actual usage of the parameters of fn_dblog on the web ;-)<br /><br />SET NOCOUNT ON<br />DECLARE @LSN NVARCHAR(46)<br />DECLARE @LSN_HEX NVARCHAR(25)<br />DECLARE @tbl TABLE (id INT identity(1,1), i VARCHAR(10))<br />DECLARE @stmt VARCHAR(256)<br /><br />-- grab a current LSN in its hexadecimal notation, e.g. 00000073:000020fa:0001<br />SET @LSN = (SELECT TOP 1 [Current LSN] FROM fn_dblog(NULL, NULL))<br />PRINT @LSN<br /><br />-- convert the three hexadecimal parts to their decimal equivalents via dynamic SQL<br />SET @stmt = 'SELECT CAST(0x' + SUBSTRING(@LSN, 1, 8) + ' AS INT)'<br />INSERT @tbl EXEC(@stmt)<br />SET @stmt = 'SELECT CAST(0x' + SUBSTRING(@LSN, 10, 8) + ' AS INT)'<br />INSERT @tbl EXEC(@stmt)<br />SET @stmt = 'SELECT CAST(0x' + SUBSTRING(@LSN, 19, 4) + ' AS INT)'<br />INSERT @tbl EXEC(@stmt)<br /><br />-- the decimal notation (e.g. 115:8442:1) is what the parameters expect<br />SET @LSN_HEX =<br /> (SELECT i FROM @tbl WHERE id = 1) + ':' + (SELECT i FROM @tbl WHERE id = 2) + ':' + (SELECT i FROM @tbl WHERE id = 3)<br />PRINT @LSN_HEX<br /><br />SELECT *<br />FROM ::fn_dblog(@LSN_HEX, NULL)Unknownnoreply@blogger.com8tag:blogger.com,1999:blog-25407276.post-1151527678133039692006-06-28T22:35:00.000+02:002006-06-28T22:47:58.146+02:00Missing checkpoints in ProfilerEarlier on, I reported the new possibility <a href="http://killspid.blogspot.com/2006/05/monitoring-checkpoints-in-profiler.html">to trace the start of a checkpoint from within profiler</a>. I tested that new event with manual CHECKPOINT statements, because that's exactly what happens in the production environment of our application, where we force them.<br /><br />Anyway, we just discovered that with a 'normal' checkpoint, triggered by the database engine itself, the event isn't raised. You can easily simulate this behavior by comparing a Profiler trace with the performance counter MSSQL:Buffer Manager-Checkpoint pages/sec on a database with some load. Compare the periodic checkpoint with one you forced yourself.
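Forcing one yourself for that comparison is just a manual CHECKPOINT in the monitored database (the database name below is hypothetical):<br /><br />USE MyMonitoredDb<br />GO<br /><br />-- a manual checkpoint like this one does raise the Database Operation event in the trace,<br />-- unlike the periodic checkpoints the engine issues itself<br />CHECKPOINT<br />GO<br /><br />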
Weird and disappointing.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-1150836237032595672006-06-20T22:10:00.000+02:002006-06-20T22:43:57.043+02:00Deployed package versionYou develop SSIS packages from within the Business Intelligence Development Studio (or better, just call it Visual Studio) and you use Management Studio for deploying and managing packages on a certain environment. Simple question I got today from a colleague: how do you determine the version that is deployed on a server?<br /><br />In the old days with DTS packages that was straightforward, but now with SSIS, I couldn't find the elegant answer straightaway. Yes, you can export the package to the file system and open it in VS, but that's no decent solution. Querying msdb, like in the solution below, is an option:<br /><br />WITH Folder (FolderID, ParentFolderID, FolderName) AS<br />(<br />SELECT folderid, parentfolderid, cast(foldername as varchar(max))<br />FROM msdb.dbo.sysdtspackagefolders90<br />WHERE parentfolderid IS NULL<br /><br />UNION ALL<br /><br />SELECT f.folderid, f.parentfolderid, cast(pf.FolderName + '\' + f.foldername as varchar(max))<br />FROM msdb.dbo.sysdtspackagefolders90 f<br />INNER JOIN Folder pf on pf.FolderID = f.parentfolderid<br />)<br />SELECT f.FolderName, p.name as PackageName, p.vermajor, p.verminor, p.verbuild<br />FROM msdb.dbo.sysdtspackages90 p<br />INNER JOIN Folder f on f.FolderID = p.folderid<br /><br />That was the best I could come up with at the time, but at home I have more time to look into it. And actually it's quite simple, thanks to the great Summary window (F7) in SSMS. When you connect to SSIS and click on a package, it contains one report (General) that shows all the info you could otherwise consult in the system tables.
Powerful stuff!Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-1148924065063750362006-05-29T19:05:00.000+02:002006-05-29T19:38:32.466+02:00Monitoring checkpoints in ProfilerBack in the days when I was still in the stress-test team of our application, I tried to monitor as much as possible from within the SQL Server Profiler. Ok, you still need to use Perfmon of course to capture the most common Windows performance counters, but I find it very handy to have an indication of when exactly checkpoints occur, along with the statements that are less performant because of it.<br /><br />In SQL Server 2000, I was forced to write a little routine to trigger my own (user defined) events for the Profiler, as there are no standard events that can be monitored. I used the sysperfinfo view to provide me with an indication of when a checkpoint was in progress. I guess that reading the transaction log of the monitored database is a valid alternative, but I haven't actually tried that one (ah... the greatly undocumented ::fn_dblog()).<br />Either way, both methods require custom scripting and are not accurate.<br /><br />Now, in SQL Server 2005, you get of course the option to <a href="http://msdn2.microsoft.com/en-us/library/ms191152.aspx">correlate performance counters</a> with events logged in Profiler, which is a very neat feature, but I like to store all the info in one SQL Trace file.<br />Luckily, you also get the <a href="http://msdn2.microsoft.com/en-us/library/ms190744.aspx">Security Audit Database Operation Event</a>, which is raised when a checkpoint begins. It's a pity that there's no counterpart for the end of a checkpoint :(.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-25407276.post-1147461961275133902006-05-12T21:12:00.000+02:002006-05-30T01:33:21.420+02:00Performance tuning guide for SQL ServerI'm no Google-guru, so I actually had a lot of trouble finding the following documents.
But because this is my long-term memory, I couldn't do without them.<br />So here it is, some indispensable old-school performance tuning stuff: <a href="http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/rdbmspft.mspx">Microsoft SQL Server 2000 RDBMS Performance Tuning Guide for Data Warehousing</a>. And the even more unfindable <a href="http://download.microsoft.com/download/9/4/f/94fc2880-d963-4906-97cb-821d376c9d45/RDBMSPerfTuning.doc">document version</a>.<br /><br />And it also has a nice equivalent for 2005: <a href="http://www.microsoft.com/technet/prodtechnol/sql/2005/tsprfprb.mspx">Troubleshooting Performance Problems in SQL Server 2005</a>.Unknownnoreply@blogger.com0