It's very hard to find a detailed but simple description of worker and I/O threads in .NET.
What's clear to me regarding this topic (but may not be technically precise):
- Worker threads are threads that should employ the CPU for their work;
- I/O threads (also called "completion port threads") should employ device drivers for their work and essentially "do nothing", only monitoring the completion of non-CPU operations.
What is not clear:
- Although the method ThreadPool.GetAvailableThreads returns the number of available threads of both types, there seems to be no public API for scheduling work onto an I/O thread (see the snippet after this list). Does that mean you can only manually create worker threads in .NET?
- It seems that a single I/O thread can monitor multiple I/O operations. Is that true? If so, why does the ThreadPool have so many available I/O threads by default?
- In some texts I read that the callback triggered after an I/O operation completes is executed by an I/O thread. Is that true? Isn't this a job for a worker thread, considering that the callback is a CPU operation?
- To be more specific: do ASP.NET asynchronous pages use I/O threads? What exactly is the performance benefit of moving I/O work to a separate thread instead of increasing the maximum number of worker threads? Is it because a single I/O thread monitors multiple operations? Or does Windows do more efficient context switching when using I/O threads?
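For reference, a minimal sketch that queries both pools via ThreadPool.GetAvailableThreads (the numbers it prints vary by machine and runtime version):

```csharp
using System;
using System.Threading;

class ThreadPoolCounts
{
    static void Main()
    {
        // The pool reports its two kinds of threads separately:
        // "worker" threads and "completion port" (I/O) threads.
        ThreadPool.GetAvailableThreads(out int availableWorker, out int availableIo);

        Console.WriteLine("Available worker threads: " + availableWorker);
        Console.WriteLine("Available I/O (completion port) threads: " + availableIo);
    }
}
```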
I'll begin with a description of how asynchronous I/O is used by programs in NT.
You may be familiar with the Win32 API function ReadFile (as an example), which is a wrapper around the Native API function NtReadFile. For asynchronous I/O, this function gives you two ways of being notified that the operation has completed:
- you can supply an event object that the kernel signals when the operation completes; or
- you can supply an APC (asynchronous procedure call) routine that the kernel queues back to the issuing thread when the operation completes.
There is however a third way of being notified when an I/O operation completes. You can create an I/O completion port object and associate file handles with it. Whenever an operation completes on a file associated with the I/O completion port, the results of the operation (such as the I/O status) are queued to the I/O completion port. You can then set up a dedicated thread to remove results from the queue and perform the appropriate tasks, such as calling callback functions. This is essentially what an "I/O worker thread" is.
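To make the shape of that concrete, here is a rough C# analogy (not the real I/O completion port API; the CompletionResult type and the queue are invented for illustration) of a dedicated thread draining a completion queue and invoking the associated callbacks:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Invented type standing in for what a completion port entry would carry.
class CompletionResult
{
    public int BytesTransferred;
    public Action<CompletionResult> Callback;
}

class CompletionPortAnalogy
{
    // Stand-in for the completion port's internal queue.
    static readonly BlockingCollection<CompletionResult> Queue =
        new BlockingCollection<CompletionResult>();

    static void Main()
    {
        // The dedicated "I/O worker thread": it blocks on the queue and calls
        // the callback associated with each completed operation.
        var ioThread = new Thread(() =>
        {
            foreach (var result in Queue.GetConsumingEnumerable())
                result.Callback(result);
        });
        ioThread.Start();

        // Something else (the kernel, in the real case) posts a completed operation.
        Queue.Add(new CompletionResult
        {
            BytesTransferred = 42,
            Callback = r => Console.WriteLine("Completed: " + r.BytesTransferred + " bytes")
        });

        Queue.CompleteAdding(); // no more completions in this demo
        ioThread.Join();        // let the thread drain the queue and exit
    }
}
```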
A normal "worker thread" is very similar; instead of removing I/O results from a queue, it removes work items from a queue. You can queue work items (QueueUserWorkItem) and have the worker threads execute them. This prevents you from having to spawn a thread every single time you want to perform a task asynchronously.
The term 'worker thread' in .net/CLR typically just refers to any thread other than the Main thread that does some 'work' on behalf of the application that spawned the thread. 'Work' could really mean anything, including waiting for some I/O to complete. The ThreadPool keeps a cache of worker threads because threads are expensive to create.
The term 'I/O thread' in .net/CLR refers to the threads the ThreadPool reserves in order to dispatch NativeOverlapped callbacks from "overlapped" win32 calls (also known as "completion port I/O"). The CLR maintains its own I/O completion port, and can bind any handle to it (via the ThreadPool.BindHandle API). Example here: http://blogs.msdn.com/junfeng/archive/2008/12/01/threadpool-bindhandle.aspx. Many .net APIs use this mechanism internally to receive NativeOverlapped callbacks, though the typical .net developer won't ever use it directly.
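As an illustration, here is a sketch (it assumes a file named test.dat exists) of an asynchronous FileStream read; because the handle is opened for overlapped I/O, the CLR binds it to its completion port internally and, on the .NET Framework, the callback is typically dispatched on a completion port thread:

```csharp
using System;
using System.IO;
using System.Threading;

class AsyncReadDemo
{
    static void Main()
    {
        var done = new ManualResetEvent(false);

        // FileOptions.Asynchronous opens the handle for overlapped I/O, so the
        // CLR binds it to its completion port behind the scenes.
        var stream = new FileStream("test.dat", FileMode.Open, FileAccess.Read,
                                    FileShare.Read, 4096, FileOptions.Asynchronous);
        var buffer = new byte[4096];

        stream.BeginRead(buffer, 0, buffer.Length, ar =>
        {
            int bytesRead = stream.EndRead(ar);
            Console.WriteLine("Read " + bytesRead + " bytes; on a pool thread: "
                              + Thread.CurrentThread.IsThreadPoolThread);
            done.Set();
        }, null);

        done.WaitOne();
        stream.Dispose();
    }
}
```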
There is really no technical difference between 'worker thread' and 'I/O thread' -- they are both just normal threads. But the CLR ThreadPool keeps separate pools of each simply to avoid a situation where high demand on worker threads exhausts all the threads available to dispatch native I/O callbacks, potentially leading to deadlock. (Imagine an application using all 250 worker threads, where each one is waiting for some I/O to complete).
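You can observe the two pools and their independent limits directly; a small sketch (the default numbers depend on the runtime version and CPU count):

```csharp
using System;
using System.Threading;

class PoolLimits
{
    static void Main()
    {
        // The worker pool and the completion port (I/O) pool are sized independently.
        ThreadPool.GetMinThreads(out int minWorker, out int minIo);
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);

        Console.WriteLine("Worker threads: min " + minWorker + ", max " + maxWorker);
        Console.WriteLine("Completion port threads: min " + minIo + ", max " + maxIo);
    }
}
```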
The developer does need to take some care when handling an I/O callback in order to ensure that the I/O thread is returned to the ThreadPool -- that is, I/O callback code should do the minimum work required to service the callback and then return control of the thread to the CLR threadpool. If more work is required, that work should be scheduled on a worker thread. Otherwise, the application risks 'hijacking' the CLR's pool of reserved I/O completion threads for use as normal worker threads, leading to the deadlock situation described above.
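A sketch of that pattern, again assuming a test.dat file; ProcessBuffer is a hypothetical CPU-heavy routine that stands in for "more work":

```csharp
using System;
using System.IO;
using System.Threading;

class CallbackOffloadDemo
{
    // Hypothetical CPU-heavy routine that should NOT run on an I/O thread.
    static void ProcessBuffer(byte[] buffer, int count)
    {
        Console.WriteLine("Processing " + count + " bytes on thread "
                          + Thread.CurrentThread.ManagedThreadId);
    }

    static void Main()
    {
        var done = new ManualResetEvent(false);
        var stream = new FileStream("test.dat", FileMode.Open, FileAccess.Read,
                                    FileShare.Read, 4096, FileOptions.Asynchronous);
        var buffer = new byte[4096];

        stream.BeginRead(buffer, 0, buffer.Length, ar =>
        {
            // Minimal work on the I/O completion thread: just harvest the result...
            int bytesRead = stream.EndRead(ar);

            // ...then hand the heavy lifting to a worker thread and return promptly,
            // so the I/O thread goes straight back to the pool.
            ThreadPool.QueueUserWorkItem(_ =>
            {
                ProcessBuffer(buffer, bytesRead);
                done.Set();
            });
        }, null);

        done.WaitOne();
        stream.Dispose();
    }
}
```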
Some good references for further reading:
- Win32 I/O completion ports: http://msdn.microsoft.com/en-us/library/aa365198(VS.85).aspx
- Managed thread pool: http://msdn.microsoft.com/en-us/library/0ka9477y.aspx
- Example of BindHandle: http://blogs.msdn.com/junfeng/archive/2008/12/01/threadpool-bindhandle.aspx
Someone with more skills than me is going to jump in here to help out.
Worker threads carry a lot of state, are scheduled onto processors by the operating system, and you control everything they do.
I/O completion ports are provided by the operating system for very specific tasks involving little shared state, and are therefore faster to use. A good example in .Net is the WCF framework. Every "call" to a WCF service is actually dispatched via an I/O completion port, because that is the fastest way to launch the work and the OS looks after it for you.
Simply put, a worker thread is meant to perform a short piece of work and exits when it has completed it. A callback may be used to notify the parent process that it has finished, or to pass back data.
An I/O thread will perform the same operation, or series of operations, continuously until stopped by the parent process. It is so called because device drivers typically run continuously, monitoring the device port. An I/O thread will typically raise events whenever it needs to communicate with other threads.
Every process runs at least one thread; your application's code runs on a thread, and any thread may spawn worker threads or I/O threads (as you call them).
There is always a fine balance between performance and the number or type of threads used. Too many callbacks or events handled by a process will severely degrade its performance because of the number of interruptions to its main processing loop as it handles them.
Examples of work for a worker thread would be adding data to a database after user interaction, performing a long mathematical calculation, or writing data to a file. By using a worker thread you free up the main application; this is most useful for GUIs, since the interface doesn't freeze while the task is being performed.
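For instance, here is a sketch of that idea in a hypothetical WinForms form (SlowCalculation is an invented stand-in for the long task); Task.Run hands the calculation to a thread pool worker thread so the UI thread keeps pumping messages and the window doesn't freeze:

```csharp
using System;
using System.Threading.Tasks;
using System.Windows.Forms;

class MainForm : Form
{
    public MainForm()
    {
        var button = new Button { Text = "Compute", Dock = DockStyle.Fill };
        button.Click += async (sender, e) =>
        {
            button.Enabled = false;
            // The long calculation runs on a thread pool worker thread, so the
            // UI thread keeps pumping messages and the window stays responsive.
            long result = await Task.Run(() => SlowCalculation(500000000));
            button.Text = "Result: " + result;
            button.Enabled = true;
        };
        Controls.Add(button);
    }

    // Invented stand-in for a long, CPU-bound task.
    static long SlowCalculation(int iterations)
    {
        long sum = 0;
        for (int i = 0; i < iterations; i++) sum += i % 7;
        return sum;
    }

    [STAThread]
    static void Main() => Application.Run(new MainForm());
}
```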