I have to handle a large number of records in a stored procedure. For that I am using a WHILE loop, as in the following SP, but it is taking a long time to execute. Please suggest an alternative approach that would improve the performance.
create procedure sp_save_user
(
    @a_i_lang_id integer,
    @a_s_data ntext
)
WITH ENCRYPTION
as
begin
    set nocount on
    SET QUOTED_IDENTIFIER ON

    --Declaring local variables
    declare @l_s_USER_ID NVARCHAR(30)
    declare @l_s_USER_NAME NVARCHAR(255)
    declare @l_n_rmStatusCount numeric(10)
    declare @l_n_XMLDoc XML

    set @l_n_XMLDoc = cast(@a_s_data as xml)

    CREATE TABLE #DW_TEMP_TABLE_SAVE(
        [USER_ID] [NVARCHAR](30),
        [USER_NAME] [NVARCHAR](255)
    )

    insert into #DW_TEMP_TABLE_SAVE
    select A.B.value('(USER_ID)[1]', 'nvarchar(30)' ) [USER_ID],
           A.B.value('(USER_NAME)[1]', 'nvarchar(255)' ) [USER_NAME]
    from @l_n_XMLDoc.nodes('//ROW') as A(B)

    --Get total number of records
    select @l_n_rmStatusCount = count(*) from #DW_TEMP_TABLE_SAVE

    --Loop through records and insert/update table
    while (@l_n_rmStatusCount > 0)
    begin
        SELECT @l_s_USER_ID = [USER_ID],
               @l_s_USER_NAME = [USER_NAME]
        FROM ( SELECT ROW_NUMBER() OVER (ORDER BY [USER_ID]) AS rownumber,
                      [USER_ID], [USER_NAME]
               FROM #DW_TEMP_TABLE_SAVE ) as temptablename
        WHERE rownumber = @l_n_rmStatusCount

        if exists(
            select 'X' from table_user_info(nolock)
            where [user_id] = @l_s_USER_ID
        )
        begin
            -- call another SP to do UPDATE on multiple tables
        end
        else
        begin
            -- call another SP to do INSERT into multiple tables
        end

        set @l_n_rmStatusCount = @l_n_rmStatusCount - 1
    end

    drop table #DW_TEMP_TABLE_SAVE

    SET QUOTED_IDENTIFIER OFF
    set nocount off
end
go
First, determine what's taking the bulk of the time. Is it the XML shredding or the pure SQL operations? Have your stored procedure select the result of getdate() before and after the INSERT into #DW_TEMP_TABLE_SAVE, and then again after the end of the loop.
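As a minimal sketch of that instrumentation (the timings here are just selected to the client; you could also log them to a table):

    -- Sketch: capture timestamps around each phase of the procedure
    declare @t0 datetime = getdate()

    -- ... cast @a_s_data to XML and INSERT into #DW_TEMP_TABLE_SAVE ...

    declare @t1 datetime = getdate()

    -- ... the WHILE loop ...

    declare @t2 datetime = getdate()

    select datediff(ms, @t0, @t1) as shred_ms,
           datediff(ms, @t1, @t2) as loop_ms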
If you find that the bulk of the time is in the loop, then the XML is a red herring. See if you can either optimize the INSERT and UPDATE stored procedures or, better yet, find a way to make set-based modifications for all of the records in #DW_TEMP_TABLE_SAVE at one time, rather than looping over each row. If you need help with this, I would suggest creating a new question focused on that aspect of the problem, providing much more detail on what's happening in those other procedures. To help, we'll also need to know what the tables involved are like: how big are they, and how are they indexed?
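To give a feel for the set-based approach: since we can't see what the INSERT/UPDATE procedures do, the MERGE below only covers the two columns visible in your code and assumes table_user_info has matching user_id/user_name columns. The real logic spanning multiple tables would need one such statement per table:

    -- Sketch only: replaces the row-by-row loop with one set-based statement
    -- (requires SQL Server 2008+ for MERGE; an UPDATE-then-INSERT pair works on 2005)
    MERGE table_user_info AS target
    USING #DW_TEMP_TABLE_SAVE AS src
        ON target.[user_id] = src.[USER_ID]
    WHEN MATCHED THEN
        UPDATE SET target.[user_name] = src.[USER_NAME]
    WHEN NOT MATCHED THEN
        INSERT ([user_id], [user_name])
        VALUES (src.[USER_ID], src.[USER_NAME]);

This lets SQL Server process all rows in one pass instead of issuing a separate statement per row.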
If you find that the bulk of the time is in the XML shredding, you have a different problem. First, as a comment suggested, if you could pass the data in not as XML but as a table-valued parameter, you would eliminate the need to deal with XML at all. But assuming you are stuck with XML, the first thing to do is make sure the XML is as slim as possible: try to limit it to only those elements required for your later data modifications. If you can't alter the characteristics of the XML, you are more limited. You could investigate using OPENXML instead of nodes(); there is some indication it might be faster. Another possibility is XML indexes: if you created a temp (or perhaps permanent) table with an appropriate XML index and INSERTed the XML into that table, the knowledge of the associated structure might help SQL Server extract the values more quickly.
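For reference, the table-valued-parameter approach looks roughly like this (type and procedure names here are illustrative, and it requires SQL Server 2008 or later):

    -- Sketch: a table type matching the columns currently shredded from XML
    CREATE TYPE dbo.UserTableType AS TABLE (
        [USER_ID]   NVARCHAR(30),
        [USER_NAME] NVARCHAR(255)
    );
    GO
    CREATE PROCEDURE dbo.sp_save_user_tvp
        @a_i_lang_id integer,
        @a_users dbo.UserTableType READONLY
    AS
    BEGIN
        SET NOCOUNT ON;
        -- No XML shredding needed: @a_users is already a table,
        -- so the set-based insert/update can read from it directly.
        SELECT [USER_ID], [USER_NAME]
        INTO #DW_TEMP_TABLE_SAVE
        FROM @a_users;
        -- ... set-based insert/update against the target tables ...
    END
    GO

The client then passes the rows as a structured parameter (e.g. a DataTable from ADO.NET) instead of serializing them to XML first.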