Benchmarking ASP.NET concurrent requests - poor results

Published 2020-07-04 08:22

Question:

I have the following code that I benchmark with JMeter, and I get about 3,000 requests per second on my localhost machine (the await is intentionally missing so it runs synchronously):

public async Task<HttpResponseMessage> Get()
{
    var resp = new HttpResponseMessage(HttpStatusCode.OK);
    resp.Content = new StringContent(Thread.CurrentThread.ManagedThreadId.ToString(), Encoding.UTF8, "text/plain");
    return resp;
}

The problem is that when I pause the request for one second, as below, the throughput drops to 10 requests per second for each w3wp.exe process for some reason (again, the await is intentionally missing so it runs synchronously):

public async Task<HttpResponseMessage> Get()
{
    Task.Delay(1000).Wait();
    var resp = new HttpResponseMessage(HttpStatusCode.OK);
    resp.Content = new StringContent(Thread.CurrentThread.ManagedThreadId.ToString(), Encoding.UTF8, "text/plain");
    return resp;
}

Even when I do use await, there is no difference; the throughput stays at 10 requests per second:

public async Task<HttpResponseMessage> Get()
{
    await Task.Delay(1000);
    var resp = new HttpResponseMessage(HttpStatusCode.OK);
    resp.Content = new StringContent(Thread.CurrentThread.ManagedThreadId.ToString(), Encoding.UTF8, "text/plain");
    return resp;
}
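As an aside, the difference between the .Wait() and await versions can be observed by logging the managed thread ID around the delay. A minimal console sketch (the class and method names are illustrative, not from the original code): with .Wait() the calling thread is held for the whole second, while with await the thread is released and the continuation often resumes on a different pool thread.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

static class DelayDemo
{
    // Blocking delay: the calling thread is occupied for the whole duration.
    public static (int before, int after) BlockingDelay(int ms)
    {
        int before = Thread.CurrentThread.ManagedThreadId;
        Task.Delay(ms).Wait();        // blocks the current thread
        int after = Thread.CurrentThread.ManagedThreadId;
        return (before, after);
    }

    // Awaited delay: the thread is released while the timer runs, and the
    // continuation may resume on a different thread-pool thread.
    public static async Task<(int before, int after)> AwaitedDelay(int ms)
    {
        int before = Thread.CurrentThread.ManagedThreadId;
        await Task.Delay(ms);
        int after = Thread.CurrentThread.ManagedThreadId;
        return (before, after);
    }

    static async Task Main()
    {
        var blocking = BlockingDelay(1000);
        Console.WriteLine($"Blocking: thread {blocking.before} -> {blocking.after} (always the same)");

        var awaited = await AwaitedDelay(1000);
        Console.WriteLine($"Awaited:  thread {awaited.before} -> {awaited.after} (often different)");
    }
}
```

Note that the awaited version releasing its thread only helps overall throughput if something upstream (the benchmark or the server) isn't capping concurrency elsewhere, which is exactly what the question is about.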

I tried all the config settings below, and none of them makes any difference:

web.config

  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="65400" />
    </connectionManagement>
  </system.net>

aspnet.config

  <system.web>
    <applicationPool 
        maxConcurrentThreadsPerCPU="100" />
  </system.web>

machine.config

 <processModel
 autoConfig="false"
 memoryLimit="70"
 maxWorkerThreads="100"
 maxIoThreads="100" />

The configs are set for both x86 and x64.

I have 32 GB of RAM and 4 physical cores, running Windows 10.

The CPU doesn't go over 10% load when benchmarking at 10 requests per second.

The above code uses Web API, but of course I reproduce the same results using an HTTP handler.
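For completeness, here is what the HTTP handler variant would look like (a sketch assuming classic ASP.NET / System.Web; the handler class name is illustrative and it would still need to be registered in web.config):

```csharp
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Web;

// Illustrative synchronous handler reproducing the same 1-second blocking pause.
public class DelayHandler : IHttpHandler
{
    public bool IsReusable => true;

    public void ProcessRequest(HttpContext context)
    {
        Task.Delay(1000).Wait(); // same intentional blocking delay as the Web API action
        context.Response.ContentType = "text/plain";
        context.Response.ContentEncoding = Encoding.UTF8;
        context.Response.Write(Thread.CurrentThread.ManagedThreadId.ToString());
    }
}
```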

Answer 1:

Here's a possible explanation; one worth investigating, anyway.

Task.Delay() creates a new task whose job is to pause. If I understand right, tasks often get dispatched to the .NET thread pool, which has a limited size (you can check it with ThreadPool.GetMaxThreads). When you try to put too much in, code will 'back up' as it waits for the thread pool to have space.

So let's say you have a thread pool of size 40. Once you've dispatched 40 tasks, all waiting a second, you've maxed out the thread pool. Your bottleneck would be the tasks gumming up the thread pool, not yielding space.
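One quick way to check this theory is to read the pool limits and current usage directly; a small console sketch (the printed numbers vary by machine and runtime):

```csharp
using System;
using System.Threading;

class PoolInfo
{
    static void Main()
    {
        // Upper bound on pool threads; work queues up once this is exhausted.
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);

        // Threads currently free; (max - available) = threads in use right now.
        ThreadPool.GetAvailableThreads(out int availWorker, out int availIo);

        Console.WriteLine($"Max worker threads:    {maxWorker}");
        Console.WriteLine($"Max IO threads:        {maxIo}");
        Console.WriteLine($"Worker threads in use: {maxWorker - availWorker}");
        Console.WriteLine($"IO threads in use:     {maxIo - availIo}");
    }
}
```

Running this inside the web app under load (rather than in a fresh console) would show whether the worker threads are actually being exhausted during the benchmark.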

Normally, tasks that do expensive I/O, like database queries or file I/O, yield control while they wait for the work to be done. I wonder if Task.Delay is more 'clingy'.

Try swapping Task.Delay() for System.Threading.Thread.Sleep() and see if that changes anything.



Answer 2:

I know that on Windows 8 there is a limit of 10 concurrent requests, to stop people from using consumer OSes to run server workloads. I see no reason why Windows 10 would be any different.

http://blogs.iis.net/owscott/windows-8-iis-8-concurrent-requests-limit