Tuesday, October 16, 2007
SSIS, 64-bit and a breakpoint
A quick search on the specialized forums returned some hits - mostly related to the Script task the package contained - but nothing really described the conditions I faced. The only difference I could think of was that the acceptance environment ran a 64-bit edition, unlike development. In that case, the Script task settings are quite simple: the PrecompileScriptIntoBinaryCode flag should be set to True.
There was an inactive breakpoint, however, that caught my attention. I hadn't noticed it before, because the Script task was not part of my changes to the package and the tests had never halted at that breakpoint. After removing it, the package effectively ran without a problem.
It's a shame we had to discover this bug the hard way, but it proves that an acceptance environment should stay as close to production as possible. In this case, it paid off.
Monday, October 15, 2007
SSIS's interaction with DTC
Investigating the blocking, I encountered some locks owned by SPID -2 in sys.dm_tran_locks, which blocked the Data Flow task from retrieving the necessary metadata from the database at the PreExecute event. I fiddled around with the TransactionOption and IsolationLevel settings of the different components, but nothing resolved my problem adequately. With the help of Profiler, however, I could determine that SSIS did not propagate the DTC transaction - started for the SQL task - to the Data Flow task.
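For the record, a query along these lines is enough to spot the culprits: locks owned by distributed (DTC) transactions are reported with a session id of -2 in sys.dm_tran_locks.

```sql
-- Locks held on behalf of distributed (DTC) transactions show up with
-- request_session_id = -2 in sys.dm_tran_locks.
SELECT resource_type ,
       resource_database_id ,
       resource_associated_entity_id ,
       request_mode ,
       request_status
FROM sys.dm_tran_locks
WHERE request_session_id = -2
```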
The blocking can be resolved by either changing the ValidateExternalMetadata attribute of the OLE DB Destination or changing the TRUNCATE TABLE of the SQL task into a DELETE statement.
The reason why both of these solutions have such an impact remains unclear to me. It's true that each of these modifications affects the locks that are required by the package, but that doesn't explain why the Data Flow task was not enlisted appropriately into MSDTC.
Thursday, September 13, 2007
sys.indexes and table valued functions
In the definition of the return value of a table-valued function, you can define a primary key for that table. And although it isn't an actual table - just a table variable created at run-time - metadata about the index is kept, similar to any user table.
USE tempdb
GO
CREATE FUNCTION dbo.udf_TableValuedFunction ()
RETURNS @Table TABLE ( id int PRIMARY KEY , comment varchar(200) )
AS
BEGIN
INSERT @Table VALUES ( 1, 'comment 1' )
INSERT @Table VALUES ( 2, 'comment 2' )
RETURN
END
GO
SELECT *
FROM sys.objects o
JOIN sys.indexes i ON i.object_id = o.object_id
WHERE o.name = 'udf_TableValuedFunction'
GO
DROP FUNCTION dbo.udf_TableValuedFunction
GO
Thursday, May 24, 2007
Unexpected dump
USE TempDB
GO
CREATE PROC proc_error
@Date DATETIME = blabla --no good!
AS
SELECT @Date + 1
GO
EXEC proc_error
GO
DROP PROC proc_error
GO
Msg 0, Level 19, State 0, Procedure proc_error, Line 4
SqlDumpExceptionHandler: Process 65 generated fatal exception c0000005 EXCEPTION_ACCESS_VIOLATION. SQL Server is terminating this process.
Sunday, May 13, 2007
Plan Guides
Apparently, every morning, with a clean procedure cache, the procedure got called with certain parameters which resulted in a not very intuitive, but good, plan with a cost of 3 logical reads. Subsequent calls with different parameters reused the cached plan, which resulted in more than 8,000 logical reads and quite some CPU usage. A more intuitive plan, resulting from typical parameters, had a constant cost of about 8 logical reads for both types of parameters. That can give a really big performance boost, given that the procedure gets called more than 40,000 times a day and delays a vital business process.
Because I'm not allowed to modify anything in the stored procedure (yep, it's CRM), I had to use a plan guide to 'guide' the engine towards the more performant plan. There are several options: specifying the whole plan, adding the parameters for the compilation, forcing a recompile... I chose the parameter hint.
--solution, add plan guide
exec sp_create_plan_guide
@name = N'Plan for procedure p_GrantInheritedAccess UPDATE'
, @stmt = N'UPDATE poa1
SET InheritedAccessRightsMask = poa1.InheritedAccessRightsMask | poa2.AccessRightsMask | poa2.InheritedAccessRightsMask | 0x08000000
FROM PrincipalObjectAccess poa1 JOIN PrincipalObjectAccess poa2 ON (poa1.PrincipalId = poa2.PrincipalId)
WHERE poa1.ObjectId = @ReferencingObjectId
AND poa1.ObjectTypeCode = @ReferencingObjectTypeCode
AND poa2.ObjectId = @ReferencedObjectId
AND poa2.ObjectTypeCode = @ReferencedObjectTypeCode'
, @type = N'OBJECT'
, @module_or_batch = N'p_GrantInheritedAccess'
, @params = NULL
, @hints = N'OPTION(OPTIMIZE FOR(@ReferencingObjectId=''D4F70FD5-41FA-DB11-9668-0011258C53FD'',@ReferencingObjectTypeCode=4200,@ReferencedObjectId=''2052793D-41FA-DB11-9668-0011258C53FD'',@ReferencedObjectTypeCode=4200))'
SELECT * FROM sys.plan_guides
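And should the guide ever cause more harm than good, it can be switched off or removed again with sp_control_plan_guide - a quick sketch, reusing the name from above:

```sql
-- Disable the guide without removing it...
EXEC sp_control_plan_guide @operation = N'DISABLE'
    , @name = N'Plan for procedure p_GrantInheritedAccess UPDATE'
-- ...or drop it altogether.
EXEC sp_control_plan_guide @operation = N'DROP'
    , @name = N'Plan for procedure p_GrantInheritedAccess UPDATE'
```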
The result of this simple operation is a clean and transparent solution. I haven't put it into production though, because it remains a last resort, but it's always good to know your options.
Wednesday, May 09, 2007
MS CRM, from a database perspective
We started the cleanup by resolving some major issues in the custom code. That code was written by rather inexperienced developers with little supervision of their creations, which resulted in some very interesting behavior in our production environment from time to time - at least if you're in for a challenge. The problems can be roughly categorized into two groups: locking issues and suboptimal plan usage, mostly due to wrong data type usage.
After the resolution of the most urgent issues in the customization, I got the chance to take a look at the actual MSCRM database. And it isn't always a pretty sight. Sure, the system of a controlling Metabase and a generic, customizable CRM database is pretty neat, but the statements I see passing in the Profiler traces are often quite unmanageable. In exchange, the application itself offers very flexible functionality, but it doesn't always seem to scale that well to the data volumes we are handling in our company.
Well, working with MS CRM 3.0 is proving to be an enriching experience. You actually shouldn't have to look at the database, and you should definitely not tamper with it, but SQL Server 2005 offers very interesting features to pinpoint the culprits and to nudge the engine a little in the right direction. More to come in hopefully somewhat more technical posts ;-).
Thursday, January 25, 2007
SSIS BypassPrepare
Yesterday, we stumbled into an annoying bug in the RTM version of SQL Server 2005. A new version of a previous project of mine was being released with a modified SSIS package. One of the modifications was the use of a mapped input parameter in an Execute SQL Task. That requires the BypassPrepare flag of the task to be set to True to make the construct work. It was successfully tested in development and acceptance, and then, in production, we suddenly got the following error.
Executing the query "…" failed with the following error: "Syntax error, permission violation, or other nonspecific error". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
This error lists almost everything that can possibly go wrong with an Execute SQL Task without giving any indication what actually went wrong. Sigh.
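For reference, the construct in question looks roughly like this - a sketch with hypothetical object names, using the '?' parameter markers of an OLE DB connection:

```sql
-- Statement inside the Execute SQL Task; the '?' is mapped to a package
-- variable on the Parameter Mapping tab. Preparing such a construct
-- fails, hence BypassPrepare must be True for it to run.
DECLARE @LoadDate DATETIME
SET @LoadDate = ?
DELETE FROM dbo.StagingTable WHERE LoadDate < @LoadDate
```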
At first I thought that the problem was caused by the fact that our production environment didn't have SP1 and the necessary hotfixes like it should have. That's an essential factor indeed, but there was more to the problem. The trouble with the RTM version is that it doesn't take BypassPrepare into account: whether you set the flag to True or False in the development environment, if you look at the actual XML that represents the package, you won't find a trace of the setting. This was fixed in SP1 (as correctly answered on the forums).
What I didn't know was that our operational DBA had exported and imported the package with his local client (SSMS), on which he didn't have SP1 either. Similar to the behavior in Visual Studio, the setting gets lost with an RTM-version SSMS. After a quick install and verification, everything worked smoothly.
It has been announced that in the upcoming SP2, the default of this setting will change to True, meaning that SSIS won't prepare the statement unless you explicitly specify it to do so. That could have saved us some trouble. The valuable lesson learned is to keep all environments - and the clients as well - up to date at all times.