Having researched the concept of asynchronous web development, specifically from this source, I created a sample application to prove the concept.
The solution is composed of two ASP.NET Web API applications. The first is a simulated slow endpoint; it waits for 1000 ms before returning a list of a custom class called Student:
public IEnumerable<Student> Get()
{
    Thread.Sleep(1000);
    return new List<Student>
    {
        new Student { Name = @"Paul" },
        new Student { Name = @"Steve" },
        new Student { Name = @"Dave" },
        new Student { Name = @"Sue" }
    };
}
Here is the Student class:
public class Student
{
    public string Name { get; set; }
}
This endpoint is hosted in IIS 7 on localhost:4002.
The second application contacts the first via two endpoints, one synchronous, the other asynchronous:
public IEnumerable<Student> Get()
{
    var proxy = WebRequest.Create(@"http://localhost:4002/api/values");
    var response = proxy.GetResponse();
    var reader = new StreamReader(response.GetResponseStream());
    return JsonConvert.DeserializeObject<IEnumerable<Student>>(reader.ReadToEnd());
}
public async Task<IEnumerable<Student>> Get(int id)
{
    var proxy = new HttpClient();
    var getStudents = proxy.GetStreamAsync(@"http://localhost:4002/api/values");
    var stream = await getStudents;
    var reader = new StreamReader(stream);
    return JsonConvert.DeserializeObject<IEnumerable<Student>>(reader.ReadToEnd());
}
It's hosted in IIS 7 on localhost:4001.
Both endpoints work as expected, and return in approximately 1 second. Based on the video in the link above at 13:25, the asynchronous method should release its thread, minimizing contention.
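The Apache Bench invocations for the tests below would look something like this (the total request count of 100 is an assumption; the URL matches the synchronous endpoint described above):

```shell
# 100 requests total, 10 concurrent, against the synchronous endpoint
ab -n 100 -c 10 http://localhost:4001/api/values
```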
I'm running performance tests on the application using Apache Bench. Here are the response times for the synchronous method with 10 concurrent requests:
This is much as I'd expect; more concurrent connections increase contention and extend the response times. However, here are the asynchronous response times:
As you can see, there still seems to be some contention. I would have expected the average response times to be more balanced. If I run the tests on both endpoints with 50 concurrent requests, I still get similar results.
Based on this, it seems both the asynchronous and synchronous methods run at more or less the same speed (expected, not taking into account the overhead of asynchronous methods), but also that the asynchronous method doesn't seem to be releasing threads back to the ThreadPool. I'd welcome any comments or clarifications, thanks.
I think there's a pretty good chance you're not testing what you think you're testing. From what I can gather, you're trying to detect releases back to the thread pool by comparing timings and deducing thread injection.
For one thing, the default settings for the thread pool on .NET 4.5 are extremely high. You're not going to hit them with just 10 or 100 simultaneous requests.
Step back for a second and think of what you want to test: does an async method return its thread to the thread pool?
I have a demo that I use to show this. I didn't want to create a heavy load test for my demo (which runs on my presentation laptop), so I pulled a little trick: I artificially restrict the thread pool to a more reasonable value.
Once you do that, your test is quite simple: perform that many simultaneous connections, and then perform that many plus one. The synchronous implementation will have to wait for one to complete before starting the last one, while the asynchronous implementation will be able to start them all.
On the server side, first restrict the thread pool threads to the number of processors in the system:
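The original snippet isn't reproduced here; a minimal sketch of that restriction, placed in `Application_Start` (the location is an assumption), might look like this:

```csharp
protected void Application_Start()
{
    // Cap worker threads at the processor count; leave the
    // I/O completion port limit at its default.
    int workerThreads, completionPortThreads;
    ThreadPool.GetMaxThreads(out workerThreads, out completionPortThreads);
    ThreadPool.SetMaxThreads(Environment.ProcessorCount, completionPortThreads);

    // ... the usual Web API startup code follows ...
}
```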
Then do the synchronous and asynchronous implementations:
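The implementations aren't shown here; based on the description, they could be as simple as two actions that differ only in how they wait (the controller name and return values are assumptions):

```csharp
public class ValuesController : ApiController
{
    // Synchronous: blocks a thread pool thread for the full second.
    public string GetSync()
    {
        Thread.Sleep(1000);
        return "sync";
    }

    // Asynchronous: awaits a timer, so the thread returns to the
    // pool for the duration of the delay.
    public async Task<string> GetAsync()
    {
        await Task.Delay(1000);
        return "async";
    }
}
```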
And finally the client testing code:
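The client code isn't reproduced either; a sketch along these lines would exercise the idea (the URL, route names, and request count are assumptions; ProcessorCount + 1 is the interesting case, one more simultaneous request than the server has worker threads):

```csharp
static void Main()
{
    // Allow more than the default number of simultaneous
    // connections to a single host.
    ServicePointManager.DefaultConnectionLimit = int.MaxValue;
    var requests = Environment.ProcessorCount + 1;

    using (var client = new HttpClient())
    {
        var stopwatch = Stopwatch.StartNew();
        var tasks = Enumerable.Range(0, requests)
            .Select(_ => client.GetStringAsync("http://localhost:4002/api/values/sync"))
            .ToArray();
        Task.WaitAll(tasks);
        Console.WriteLine("Synchronous:  {0}", stopwatch.Elapsed);

        stopwatch.Restart();
        tasks = Enumerable.Range(0, requests)
            .Select(_ => client.GetStringAsync("http://localhost:4002/api/values/async"))
            .ToArray();
        Task.WaitAll(tasks);
        Console.WriteLine("Asynchronous: {0}", stopwatch.Elapsed);
    }
}
```

With the pool capped at the processor count, the synchronous run should take roughly two seconds (the last request waits for a free thread), while the asynchronous run should stay close to one.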
On my (8-logical-core) machine, I see output like this:
Which clearly shows that the asynchronous method is returning its thread to the thread pool.