Fast in SSMS and slow in the application - why does this happen?

Posted 2019-02-19 12:40

I have a dataset which is filled from a query as follows...

SELECT  DISTINCT ColA, ColB, ColC, ColD, ColE, ColF, dbo.CustomFunction(ColA) AS ColG
FROM    TableA
    JOIN ViewA ON ColA = ViewColA 
WHERE   ColB = @P1 AND ColC = @P2 AND ColD = @P3 AND ColE = @P4 
ORDER BY ColB, ColC DESC, ColA

(Query fields etc. have been obfuscated.)

I have profiled this query and it returns around 200 rows in 12 seconds when run in SSMS. Note that I restarted the server and used the required DBCC commands to ensure that an existing execution plan wasn't used.

However, when I run this query from my .Net application it takes over 30 seconds to fill the dataset and times out on the default ADO.Net command timeout of 30 seconds.

If the query runs in 12 seconds, I just cannot see why it should take more than 18 additional seconds to fill 200 rows into a dataset, unless there is something going on here that I don't know about. I imagine that ADO.Net just runs the query, gets the data and populates the dataset.

The population code looks like this (note that I have inherited this from another developer):

DataSet res = new DataSet();

try
{
    using (SqlDataAdapter da = new SqlDataAdapter())
    {
        var cmd = new SqlCommand();

        // FillParameters adds the "default" parameters to cmd and returns the
        // declaration text that is prepended to the SQL statement.
        string paramSql = FillParameters(cmd, _params);
        cmd.CommandText = paramSql + SQL;
        cmd.Connection = conn;
        cmd.Transaction = _transaction;

        if (CommandTimeout.HasValue)
        {
            cmd.CommandTimeout = CommandTimeout.Value;
        }

        da.SelectCommand = cmd;
        da.Fill(res);
        return res;
    }
}
catch
{
    throw;
}

Running this in debug, the Fill method takes around 50 seconds to complete once it is hit; this is proved by setting a high timeout on the ADO.Net command. I am happy with the performance of the query itself, which I can run consistently in around 12 seconds, so why the additional 18+ seconds to populate the dataset?

Is ADO.Net doing something (possibly due to the structure of this code) that means it takes more than 18 seconds to populate the dataset? I have tried setting EnforceConstraints to false and that makes no difference.

One thing to note: due to the design of this program, more than the required number of parameters is fed into the SQL command. The FillParameters method does this. There are 20 or so "default" parameters added to the command, but only e.g. 4 are used by this query.
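To narrow down whether the extra time is spent executing the query or building the DataSet, one option is to time the two steps separately. A minimal sketch, assuming the same cmd set-up as above and an already-open conn (this is not part of the existing code):

var sw = System.Diagnostics.Stopwatch.StartNew();

// 1) Execute the command and drain the rows without building a DataSet.
using (SqlDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read()) { /* discard the values */ }
}
Console.WriteLine("ExecuteReader + drain: " + sw.Elapsed);

// 2) Execute the same command again, this time through the adapter.
sw.Restart();
using (var da = new SqlDataAdapter(cmd))
{
    var timingDs = new DataSet();
    da.Fill(timingDs);
}
Console.WriteLine("DataAdapter.Fill: " + sw.Elapsed);

The second run benefits from a warm cache, so this only gives a rough split, but a large gap between the two numbers would point at DataSet population rather than the query itself.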

So in summary,

  • What could be happening to make it take 18+ seconds to fill the DS?

  • Is ADO.Net doing something "clever" with my dataset rather than just running the query and populating the data set?

  • Could it be the excessive number of parameters passed in that is causing the problem?

Thanks.

4 Answers
够拽才男人
Answer 2 · 2019-02-19 12:58

Try issuing SET ARITHABORT ON in your session; it solved the same problem for me. SSMS runs with ARITHABORT ON by default, while ADO.Net connections default to OFF, so the application can end up with a different cached plan than SSMS.
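If ARITHABORT turns out to be the difference, one way to test it from the application is to issue the SET on the same connection before the real query runs. A minimal sketch, assuming conn and _transaction are the objects from the question's code:

// Sketch: make this connection's ARITHABORT setting match SSMS's default.
using (var setCmd = new SqlCommand("SET ARITHABORT ON;", conn))
{
    setCmd.Transaction = _transaction;   // required if the connection has an open transaction
    setCmd.ExecuteNonQuery();
}
// ...then run the original SELECT through the SqlDataAdapter as before.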

Answer 3 · 2019-02-19 12:59

Assuming your tables have primary keys:

1) Check existing index fragmentation: if fragmentation is more than 30%, rebuild the index; otherwise reorganize it.
2) Check the missing-index suggestions and create non-clustered indexes based on those columns.

Then re-run your SQL script. Performance should generally improve once the proper indexes are in place.
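For step 1, a rough sketch of what that check-and-fix pass could look like from .Net, assuming an open SqlConnection to the target database and the common 5%/30% thresholds (nothing here comes from the question's schema):

// Sketch: rebuild heavily fragmented indexes, reorganize lightly fragmented ones.
// Assumes: using System.Collections.Generic; using System.Data.SqlClient;
// and that conn is an open SqlConnection to the database being tuned.
const string fragQuery = @"
    SELECT s.name AS SchemaName, o.name AS TableName, i.name AS IndexName,
           ps.avg_fragmentation_in_percent
    FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ps
    JOIN   sys.indexes  AS i ON i.object_id = ps.object_id AND i.index_id = ps.index_id
    JOIN   sys.objects  AS o ON o.object_id = ps.object_id
    JOIN   sys.schemas  AS s ON s.schema_id = o.schema_id
    WHERE  i.name IS NOT NULL AND ps.avg_fragmentation_in_percent > 5;";

var maintenance = new List<string>();
using (var cmd = new SqlCommand(fragQuery, conn))
using (var reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        string target = "[" + reader.GetString(2) + "] ON [" +
                        reader.GetString(0) + "].[" + reader.GetString(1) + "]";
        double fragmentation = reader.GetDouble(3);
        maintenance.Add(fragmentation > 30
            ? "ALTER INDEX " + target + " REBUILD;"       // > 30%: rebuild
            : "ALTER INDEX " + target + " REORGANIZE;");  // 5-30%: reorganize
    }
}

foreach (string statement in maintenance)
{
    using (var alter = new SqlCommand(statement, conn))
    {
        alter.ExecuteNonQuery();
    }
}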

Luminary・发光体
Answer 4 · 2019-02-19 13:06

The problem was that the existing code was enforcing a Serializable isolation level.

Using SQL Server Profiler, I compared the commands and execution stats from the query running through SSMS and through the application.

--- SSMS ---
....
....
set transaction isolation level read committed

CPU: 7,797
Reads: 338,425
Writes: 1,685
Duration: 7,912

--- Application ---
....
....
set transaction isolation level serializable 

CPU: 46,531
Reads: 241,202
Writes: 0
Duration: 46,792

I then ran the query in SSMS using both SET TRANSACTION ISOLATION LEVEL SERIALIZABLE and EXEC sp_executesql, so that SQL Server had no hints from SSMS as to what the query contained.

This reproduced the execution time of 30+ seconds in both SSMS and the application.

It was then just a case of modifying the code to use a Read Committed isolation level.
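A minimal sketch of what that change looks like on the ADO.Net side, assuming the transaction is created explicitly in the data access code (identifiers are illustrative, not the actual code):

// Sketch: start the transaction as READ COMMITTED instead of SERIALIZABLE.
// Assumes conn is the open SqlConnection and SQL is the SELECT from the question.
SqlTransaction tx = conn.BeginTransaction(IsolationLevel.ReadCommitted);

var cmd = new SqlCommand(SQL, conn, tx);
// ... add parameters and fill the DataSet as before, then:
tx.Commit();

If the transaction is created through TransactionScope instead, the equivalent setting is TransactionOptions.IsolationLevel, which defaults to Serializable and is a common source of this exact problem.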

References: http://www.sommarskog.se/query-plan-mysteries.html#otherreasons

Root(大扎)
Answer 5 · 2019-02-19 13:06

I think your problem isn't in the code but in the database. If you have a big database, stale statistics are probably the cause; try updating the statistics in your database using this:

EXEC sp_updatestats
