I have a Windows service where I am using ThreadPool.QueueUserWorkItem. The service connects to multiple client databases, grabs data, converts it to XLS, and sends the files to the corresponding FTP server.
I have 3 questions regarding the code below:
- Am I using ThreadPool.QueueUserWorkItem correctly?
- Do I need to use lock anywhere in the code to avoid issues? If yes, where and on what object?
- Is there anything in the code that is not correct? If yes, what is it and how do I deal with it?
Code:
private static System.Timers.Timer aTimer = new System.Timers.Timer(50000);

public void OnStart(string[] args)
{
    CLE.WriteToEventLog("Service Started");
    try
    {
        aTimer.Elapsed += new ElapsedEventHandler(PerformTimerOperation);
        aTimer.Enabled = true;
    }
    catch (Exception ex)
    {
        CLE.WriteToEventLog("Error Starting Service: " + ex.Message);
    }
}

private void PerformTimerOperation(object source, ElapsedEventArgs e)
{
    CLE.WriteToEventLog("Timer Operation Started");
    Clients objClient = new Clients();
    List<Clients> objClientList = Clients.GetClientList();
    foreach (var list in objClientList)
    {
        ThreadPool.QueueUserWorkItem(new WaitCallback(SendFilesToClient), list);
    }
}

private void SendFilesToClient(Object stateInfo)
{
    CLE.WriteToEventLog("Send Files To Client Started");
    Clients oClient = (Clients)stateInfo;
    CLE.WriteToEventLog("Start Processing Client: " + oClient.ClientName + ", ClientId: " + oClient.ClientId);
    connectionString = App.Database.PrimaryConnectionString(oClient.ClientId);
    string reports = oClient.Reports;
    string[] values = reports.Split(',').Select(sValue => sValue.Trim()).ToArray();
    foreach (string item in values)
    {
        // Send data to FTP based on client id
    }
    // At this point all reports have been sent to the FTP. We will update the database with LastExecutionDateTime + 1 hour, which will be used as the DateFrom param for all reports on the next execution.
}
The service works fine and I get appropriate results, but I need to make sure I am doing it right and don't run into issues later.
I'm assuming your service is meant to keep running as opposed to being "one-and-done". If so, please note that the AutoReset property of the System.Timers.Timer class is set to true by default. This simply means that the timer will continue to raise the Elapsed event each time the 50-second interval elapses (50000 milliseconds = 50 seconds). If you know for sure that all of the SendFilesToClient operations complete in plenty of time before the next interval elapses, then you should be ok. However, I wouldn't bet on it. What if the database is on the network and the network goes down? What if the service is running on a slower system or one with fewer cores and all the work isn't completed in time?

You could potentially get around this by turning off the AutoReset feature like so.
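Something along these lines, reusing the aTimer field from your code:

// In OnStart, before enabling the timer:
aTimer.AutoReset = false;
aTimer.Elapsed += new ElapsedEventHandler(PerformTimerOperation);
aTimer.Enabled = true;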
This means the Elapsed event will only fire once. Inside PerformTimerOperation, simply reset the Enabled property to true to restart the timer before exiting.

But this is an incomplete solution because the threads may still take too long to finish before the timer triggers another Elapsed event. In this case, you might want to use a ManualResetEvent to signal when each thread is complete and suspend exiting PerformTimerOperation (and resetting the timer) until that occurs. For example,
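a rough sketch might look like this (the lambda and the per-client ManualResetEvent list are just one way to wire it up, not the only way):

private void PerformTimerOperation(object source, ElapsedEventArgs e)
{
    // Requires using System.Threading; for ManualResetEvent / WaitHandle.
    CLE.WriteToEventLog("Timer Operation Started");

    List<Clients> objClientList = Clients.GetClientList();
    var doneEvents = new List<ManualResetEvent>();

    foreach (var client in objClientList)
    {
        var done = new ManualResetEvent(false);
        doneEvents.Add(done);

        Clients current = client; // local copy so the closure captures the right client
        ThreadPool.QueueUserWorkItem(state => SendFilesToClient(current, done));
    }

    if (doneEvents.Count > 0)
    {
        // Block until every worker has signalled completion.
        // Note: WaitAll accepts at most 64 handles; wait in batches if you have more clients.
        WaitHandle.WaitAll(doneEvents.ToArray());
    }

    // All clients processed; re-enable the one-shot timer for the next run.
    aTimer.Enabled = true;
}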
Now update SendFilesToClient.
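Again as a sketch, with the signature changed to take the client and its wait handle directly:

private void SendFilesToClient(Clients oClient, ManualResetEvent done)
{
    try
    {
        CLE.WriteToEventLog("Send Files To Client Started");
        CLE.WriteToEventLog("Start Processing Client: " + oClient.ClientName + ", ClientId: " + oClient.ClientId);

        // ... existing work: build the reports and upload them to the FTP ...
        // ... then update LastExecutionDateTime as before ...
    }
    finally
    {
        // Signal PerformTimerOperation that this client is finished, even if something threw.
        done.Set();
    }
}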
In this fashion, the PerformTimerOperation will block on the WaitHandle.WaitAll() call until all the worker threads, e.g., SendFilesToClient, signal that they are finished. At that point, you reset the timer and repeat on the next interval.

Sorry this is so long. Hope it helps.