Friday, March 30, 2012

Help optimising a stored proc

Hi, I have the following procedure that accepts two CSV lists of values:
the first list contains primary key numbers, and the second the values to
update.
When the list gets over about 200 items, I am getting intermittent
timeout errors.
There are currently just over 500,000 records in the table.
Is there a way to optimise the performance of the update? Or is there an
easier way if the input can be provided in a single list,
e.g. 2245=1,2257=2,3367=2 instead of
2245,2257,3367 and 1,2,2
Thanks!
CREATE Procedure dbo.UpdateResults
@RegistrationIDs Varchar(8000),
@Results Varchar(4000)
AS
UPDATE tblRegistrations
SET Result = (SELECT A.Value
    FROM dbo.Split(@Results, ',') A
    JOIN dbo.Split(@RegistrationIDs, ',') B ON A.id = B.id
    WHERE RegistrationID = B.Value)
WHERE EXISTS (SELECT *
    FROM dbo.Split(@Results, ',') A
    JOIN dbo.Split(@RegistrationIDs, ',') B ON A.id = B.id
    WHERE RegistrationID = B.Value)
CREATE FUNCTION dbo.Split
(
@List varchar(8000),
@SplitOn nvarchar(5)
)
RETURNS @RtnValue table
(
Id int identity(1,1),
Value nvarchar(150)
)
AS
BEGIN
    -- Peel off one token per pass while a delimiter remains
    While (Charindex(@SplitOn, @List) > 0)
    Begin
        Insert Into @RtnValue (Value)
        Select Value = ltrim(rtrim(Substring(@List, 1, Charindex(@SplitOn, @List) - 1)))

        Set @List = Substring(@List, Charindex(@SplitOn, @List) + len(@SplitOn), len(@List))
    End
    -- The remainder after the last delimiter is the final token
    Insert Into @RtnValue (Value)
    Select Value = ltrim(rtrim(@List))
    Return
END

Mark Williams wrote:

I would try using temporary tables instead of table variables. I have seen
some strange stuff happen when trying to join two table variables together,
or joining a table-valued function together with a real base table.
Move the logic of the dbo.Split function inside the stored procedure, take
the input CSV strings and write them into a temporary table, then process
from that temp table.
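
Something along these lines is what I have in mind; a rough sketch only, assuming the tblRegistrations table and the two parameter formats from your post (the #Pairs temp table and the inline parsing loop are just for illustration, untested):

CREATE Procedure dbo.UpdateResults
@RegistrationIDs Varchar(8000),
@Results Varchar(4000)
AS
BEGIN
    -- Materialise both CSV lists into one temp table up front,
    -- so the UPDATE joins a real table instead of table variables
    CREATE TABLE #Pairs
    (
        RegistrationID int PRIMARY KEY,
        Result nvarchar(150)
    )

    DECLARE @pos1 int, @pos2 int
    -- Trailing commas so every token ends with a delimiter
    SET @RegistrationIDs = @RegistrationIDs + ','
    SET @Results = @Results + ','

    WHILE CHARINDEX(',', @RegistrationIDs) > 0
    BEGIN
        SET @pos1 = CHARINDEX(',', @RegistrationIDs)
        SET @pos2 = CHARINDEX(',', @Results)

        -- Take the token in front of each delimiter, paired by position
        INSERT INTO #Pairs (RegistrationID, Result)
        VALUES (CAST(LEFT(@RegistrationIDs, @pos1 - 1) AS int),
                LTRIM(RTRIM(LEFT(@Results, @pos2 - 1))))

        -- Drop the consumed token and its delimiter from both lists
        SET @RegistrationIDs = SUBSTRING(@RegistrationIDs, @pos1 + 1, 8000)
        SET @Results = SUBSTRING(@Results, @pos2 + 1, 8000)
    END

    -- Single set-based update against the temp table
    UPDATE T
    SET T.Result = P.Result
    FROM tblRegistrations T
    JOIN #Pairs P ON T.RegistrationID = P.RegistrationID

    DROP TABLE #Pairs
END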
"hals_left" wrote:

> Hi I have the following procedure that accepts two CSV lists of values,
> the first list contains primary key numbers, and the second values to
> update.
> When the list gets over about 200 items, I am getting intermittent
> Timout errors.
> Currently just over 500,000 records in the table.
> Is there a way to optimise the performace of the update? Or is theer an
> easier way if the input can be provided in a single list
> e.g 2245=1,2257=2,3367=2 instead of
> 2245,2257,3367 and 1,2,2
> Thanks!
>
> CREATE Procedure dbo.UpdateResults
> @.RegistrationIDs Varchar(8000),
> @.Results Varchar(4000)
> AS
> UPDATE tblRegistrations
> SET Result
> = (SELECT A.Value FROM dbo.Split(@.Results,',') A
> JOIN dbo.Split(@.RegistrationIDs,',') B ON A.id = B.id
> WHERE RegistrationID=B.Value )
> WHERE EXISTS (SELECT *
> FROM dbo.Split(@.Results,',') A
> JOIN dbo.Split(@.RegistrationIDs,',') B ON A.id = B.id
> WHERE RegistrationID=B.Value)
>
> CREATE FUNCTION dbo.Split
> (
> @.List varchar(8000),
> @.SplitOn nvarchar(5)
> )
> RETURNS @.RtnValue table
> (
> Id int identity(1,1),
> Value nvarchar(150)
> )
> AS
> BEGIN
> While (Charindex(@.SplitOn,@.List)>0)
> Begin
> Insert Into @.RtnValue (value)
> Select
> Value =
> ltrim(rtrim(Substring(@.List,1,Charindex(
@.SplitOn,@.List)-1)))
> Set @.List =
> Substring(@.List,Charindex(@.SplitOn,@.List
)+len(@.SplitOn),len(@.List))
> End
> Insert Into @.RtnValue (Value)
> Select Value = ltrim(rtrim(@.List))
> Return
> END
Hi There,
What Mark suggested is right, but you may like to try this:
UPDATE T1
SET Result = A.Value
FROM dbo.Split(@Results, ',') A
JOIN dbo.Split(@RegistrationIDs, ',') B ON A.id = B.id
JOIN tblRegistrations T1 ON T1.RegistrationID = B.Value
and remove the split function if possible.
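
For example, with the two sample lists from the question the call would be something like:

EXEC dbo.UpdateResults
    @RegistrationIDs = '2245,2257,3367',
    @Results = '1,2,2'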
With Warm regards
Jatinder Singh
Thanks Jatinder, that works nicely on a small test and is much simpler
SQL. I'll test it on the real database and see how it compares to the
original.
I don't really see how I can remove the split function, unless I pass
the data one record at a time in a loop ...
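
For the single-list format I mentioned (2245=1,2257=2,3367=2), maybe something like this would work: one call to dbo.Split on the commas, then each pair split on '=' inline. A rough, untested sketch (UpdateResultsFromPairs is just a placeholder name):

CREATE Procedure dbo.UpdateResultsFromPairs
@Pairs Varchar(8000) -- e.g. '2245=1,2257=2,3367=2'
AS
UPDATE T
-- everything after the '=' is the new result value
SET T.Result = SUBSTRING(P.Value, CHARINDEX('=', P.Value) + 1, 150)
FROM dbo.Split(@Pairs, ',') P
-- everything before the '=' is the registration key
JOIN tblRegistrations T
    ON T.RegistrationID = LEFT(P.Value, CHARINDEX('=', P.Value) - 1)

That still uses dbo.Split once, but it drops the second split and the join on the generated ids.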
