ASP.NET Background Thread Performance Guidance

Posted 2019-06-02 00:04

I am running a background thread in my ASP.NET web service application. This thread's responsibility is to hit the database after a specific interval and update a DataTable in the Cache. The DataTable has around 500K rows. When I look at the processes in Task Manager, the web dev server consumes around 300,000 K the first time, goes to 500,000 K the next time, sometimes reaches above 1,000,000 K, and sometimes drops back to 500,000-600,000 K. Since I am working on my local machine, the data in the database is not changing. Can anyone please guide me on what I am doing wrong in this code:

protected void Application_Start(object sender, EventArgs e)
{
    Thread obj = new Thread(new ThreadStart(AddDataInCache));
    obj.IsBackground = true;
    obj.Start();
}

private void AddDataInCache()
{
    Int32 iCount = 0;
    while (true)
    {
        MyCollection _myCollection = new MyCollection();
        DataTable dtReferences = null;
        DataTable dtMainData = null;
        try
        {
            dtMainData = _myCollection.GetAllDataForCaching(ref dtReferences);

            HttpRuntime.Cache.Insert("DATA_ALL_CACHING", dtMainData, null,
                Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
                CacheItemPriority.Default, null);

            HttpRuntime.Cache.Insert("DATA_REFERENCES_CACHING", dtReferences, null,
                Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable, null);
        }
        catch (Exception ex)
        {
        }
        finally
        {
            if (_myCollection != null)
                _myCollection = null;
        }
        iCount++;
        Thread.Sleep(18000);
    }
}

In GetAllDataForCaching I get a SqlDataReader from my data access layer as follows:

public DataTable GetAllDataForCaching(ref DataTable dReferenceTable)
{
    DataTable dtReturn = new DataTable();
    SqlDataReader dReader = null;
    try
    {
        dReader = SqlHelper.ExecuteReader(CommandType.StoredProcedure, "[GetDataForCaching]", null);
        if (dReader != null && dReader.HasRows)
        {
            dtReturn.Load(dReader);
            dReferenceTable = new DataTable();
            if (dReader.HasRows)
            {
                DataTable dtSchema = dReader.GetSchemaTable();
                List<DataColumn> listCols = new List<DataColumn>();

                if (dtSchema != null)
                {
                    foreach (DataRow drow in dtSchema.Rows)
                    {
                        string columnName = System.Convert.ToString(drow["ColumnName"]);
                        DataColumn column = new DataColumn(columnName, (Type)(drow["DataType"]));
                        column.Unique = (bool)drow["IsUnique"];
                        column.AllowDBNull = (bool)drow["AllowDBNull"];
                        column.AutoIncrement = (bool)drow["IsAutoIncrement"];
                        listCols.Add(column);
                        dReferenceTable.Columns.Add(column);
                    }
                }

                while (dReader.Read())
                {
                    DataRow dataRow = dReferenceTable.NewRow();
                    for (int i = 0; i < listCols.Count; i++)
                    {
                        dataRow[((DataColumn)listCols[i])] = dReader[i];
                    }
                    dReferenceTable.Rows.Add(dataRow);
                }
            }
        }
    }
    finally
    {
        if (dReader != null)
        {
            if (dReader.IsClosed == false)
                dReader.Close();
            dReader = null;
        }
    }
    return dtReturn;
}

I am using Visual Studio 2008.

5 Answers
Answer 1 · 2019-06-02 00:11

Firstly, you should use using blocks (see IDisposable) when working with database connections, commands, readers, etc.
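
For example, here is a minimal sketch of the reader part of GetAllDataForCaching wrapped in a using block (SqlHelper.ExecuteReader is the helper from the question; the reference-table code is elided):

DataTable dtReturn = new DataTable();
// The using block closes and disposes the reader even if Load() or the
// schema/row loop throws, replacing the manual Close()/null in finally.
using (SqlDataReader dReader = SqlHelper.ExecuteReader(
           CommandType.StoredProcedure, "[GetDataForCaching]", null))
{
    if (dReader != null && dReader.HasRows)
    {
        dtReturn.Load(dReader);
        // ... build dReferenceTable from the second result set as before ...
    }
}
return dtReturn;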

Secondly, the web cache can be cleared by an application pool recycle or an IIS reset. That's why you cannot rely on your items staying in the cache forever. This is a safe way to get the data:

private DataTable GetDataWithReferences(out DataTable dtReferences)
{
    dtReferences = (DataTable)HttpRuntime.Cache["DATA_REFERENCES_CACHING"];
    DataTable dtMainData = (DataTable)HttpRuntime.Cache["DATA_ALL_CACHING"];
    if (null == dtMainData)
    {
        dtMainData = _myCollection.GetAllDataForCaching(/* ref - why? out would do */ out dtReferences);
        // cache insert for both tables goes here
    }

    return dtMainData;
}
Answer 2 (三岁会撩人) · 2019-06-02 00:15

You will be a lot more efficient with a timer than having the thread sleep like that. Timers are more memory- and CPU-efficient.
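
A minimal sketch of that approach with System.Threading.Timer, assuming the cache-filling body of AddDataInCache is moved into a hypothetical RefreshCache callback (the 18000 ms period mirrors the original Thread.Sleep):

using System;
using System.Threading;
using System.Web;

public class Global : HttpApplication
{
    // Keep a static reference so the timer itself is not garbage collected.
    private static Timer _cacheRefreshTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Fire once immediately, then every 18 seconds.
        _cacheRefreshTimer = new Timer(RefreshCache, null, 0, 18000);
    }

    private static void RefreshCache(object state)
    {
        // Hypothetical: the cache-filling work from AddDataInCache
        // (without the while/Sleep loop) would go here.
    }
}

Unlike the sleep loop, timer callbacks can overlap if a refresh takes longer than the period, so the callback may need a re-entrancy guard (for example Monitor.TryEnter) around the refresh.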

Answer 3 · 2019-06-02 00:22

I'll start by addressing the follow-up question:

... is that sometimes Cache returns null ...

This is presumably because it takes some time for the background thread to fill the cache. When Application_Start fires, you start up the background thread and then Application_Start finishes. The application can then move on to other tasks, for instance processing a page.

If during the processing of the page, an attempt is made to access the cache before the initial run of AddDataInCache has finished, then the cache will return null.
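
For example, a hypothetical guard in page code (the key mirrors the question's cache key):

// Tolerate the cache not being populated yet on early requests.
DataTable dtMainData = HttpRuntime.Cache["DATA_ALL_CACHING"] as DataTable;
if (dtMainData == null)
{
    // The background thread has not finished its first load yet:
    // either load synchronously as a fallback or handle the "no data" case.
}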

Regarding memory consumption, I don't immediately see how you could improve the situation unless you are able to reduce the number of rows in the cached DataTables.

In the first call to AddDataInCache, the cache is empty to begin with. Then your GetAllDataForCaching creates two DataTables and fills them with data. This causes the process to acquire memory to store the data in the DataTables.

On the second and subsequent calls to AddDataInCache, the cache already holds all the data that was fetched on the previous run. Then you again create two new DataTables and fill them with data. This causes the memory consumption to go up again, since the process now holds both the preexisting data in the cache and the new data in the DataTables created in the second run. Once the second run has finished loading the data, you overwrite the preexisting data in the cache with the data fetched in the second run.

At this point the data that was in the cache from the first run becomes eligible for garbage collection. But that doesn't mean the memory will be reclaimed immediately; it will be reclaimed when the garbage collector comes around and notices that those DataTables are no longer needed in memory.

Note that the cached items from the first run will only become eligible for garbage collection if no "live" objects are holding a reference to them. Make sure that you keep your cache usage short-lived.

And while all of this is going on, your background thread will happily go about its business refreshing the cache. It's therefore possible that a third cache refresh occurs before the garbage collector releases the memory for the DataTables fetched in the first run, causing the memory consumption to increase further.

So in order to decrease the memory consumption, I think you will simply have to reduce the amount of data that you store in the cache (fewer rows, fewer columns). Increasing the time between cache refreshes might also be helpful.

And finally, make sure you are not keeping old versions of the cached objects alive by referencing them in long lived requests/application processes.

Answer 4 (三岁会撩人) · 2019-06-02 00:30

I have done it by putting the following code before Thread.Sleep(18000):

GC.Collect();
GC.WaitForPendingFinalizers();

It has kept the memory under control so far.
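
In context, the end of the while loop in AddDataInCache then looks roughly like this:

iCount++;
GC.Collect();                  // force a full collection of the now-unreferenced tables
GC.WaitForPendingFinalizers(); // wait for pending finalizers before sleeping
Thread.Sleep(18000);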

Answer 5 (干净又极端) · 2019-06-02 00:36

I agree with Peter, and I would recommend that you use System.Threading.Timer. You may find the following link useful:

http://blogs.msdn.com/b/tmarq/archive/2007/07/21/an-ounce-of-prevention-using-system-threading-timer-in-an-asp-net-application.aspx
