
What’s the Truth about Running ASP.NET WEBAPI Asynchronously?

When Node.js started getting popular, one of the major benefits people proclaimed was that web servers running under Node.js processed every request asynchronously.  This is how a single-threaded environment was able to handle a significant load without falling over.  Cool!  So, you might wonder, how does ASP.NET process requests?  It processes code synchronously.  One might assume, then, that if there were a way to run the code asynchronously, we could improve the performance of our applications.  But can we?  And if we can, is it worth it?



The general idea of an asynchronous request is that one thread handles the incoming request and immediately hands it off to another thread for processing.  When the processing thread is done, it invokes a callback to notify the request thread that it is ready to send data back to the client.  Meanwhile, the request thread has been free to handle additional requests from additional clients.  In contrast, a synchronous request is processed entirely on the request thread, so that thread is unable to handle any additional requests until it finishes.
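The difference can be sketched in a pair of Web API actions. This is a minimal illustration of the two models, not the test code used later in the article:

using System.Threading;
using System.Threading.Tasks;

public class SketchController
{
    // Synchronous: the request thread is blocked for the entire wait.
    public string GetSync()
    {
        Thread.Sleep(1000);     // the request thread sits idle here
        return "done";
    }

    // Asynchronous: the request thread is released at the await, and the
    // continuation runs on a pool thread once the delay completes.
    public async Task<string> GetAsync()
    {
        await Task.Delay(1000); // no thread is blocked during the wait
        return "done";
    }
}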

In Node.js, this is important because there is only one thread.  It handles all of the requests.  If it blocked, there would be no way to process additional incoming requests.

In ASP.NET, we are running in a system that can run multiple threads per process.  So we are able to handle multiple requests because each request is handled by a different thread.  The problem, however, is that there is a limited number of threads we can spawn per process, and eventually we are handling so much traffic that we can’t handle any more.  The longer our requests take to respond, the more likely we are to hit that limit.
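You can see the per-process limits being described from any .NET program. This is a quick diagnostic, not part of the benchmark:

using System;
using System.Threading;

class ThreadPoolLimits
{
    static void Main()
    {
        // The pool will not grow past these per-process maximums; once all
        // worker threads are tied up in slow synchronous requests, new
        // requests queue up and eventually time out.
        int maxWorker, maxIo, freeWorker, freeIo;
        ThreadPool.GetMaxThreads(out maxWorker, out maxIo);
        ThreadPool.GetAvailableThreads(out freeWorker, out freeIo);
        Console.WriteLine($"Worker threads: {freeWorker} free of {maxWorker}");
        Console.WriteLine($"I/O threads:    {freeIo} free of {maxIo}");
    }
}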

Imagine the kind of throughput we could get on an ASP.NET application if we could make it run asynchronously, more like Node.js does.

So What’s the Problem?

So, all of this sounds really good.  But when I went to find out how it works, I stumbled on a Stack Overflow question that basically indicated it shouldn’t work.  This, along with a project I was working on, led me down the road of testing it for myself.  Lesson: just because someone says something is true doesn’t mean it is!

The Test


WestWind Web Surge (WWWS)

A simple load-testing tool.  Add a URL, set the number of threads to run concurrently, run, and check the report.  There is a command-line version you can use as well.


Strathweb CacheOutput

There is a version for Web API v1 and v2.  This allows us to add server- and client-side caching information to REST endpoints.

Web Server

I’m using the web server built into Visual Studio 2015 (IIS Express).  I’m assuming IIS proper performs similarly.

All of the tests were run on an 8-core computer with 16 GB of RAM.  The numbers are for relative comparison only; your tests may show slightly different results.


For a baseline, I ran WWWS against this endpoint:

public IEnumerable<string> Get() {
    Thread.Sleep(1000); // simulate one second of processing time
    return new string[] { "value1", "value2" };
}
Notice that while I am returning static information, I am simulating processing time with the Sleep() method. This makes it look like it took a second to process the information before returning it from the server.

I could run WWWS against the endpoint for 20 seconds at a rate of 100 threads at a time without any errors. However, I was only able to achieve 28 requests per second with an average response time of 4 seconds. Pretty slow, as you’ll see once we apply the optimizations.

Asynchronous Optimization

The next test was to see what performance improvement I could get by running the endpoint asynchronously.

public async Task<IEnumerable<string>> Get() {
    await Task.Delay(1000);
    return new string[] { "value1", "value2" };
}

Note: you cannot just add the async keyword to the method; you must also have code you are await-ing inside the method. Otherwise the code falls back to behaving like the baseline. This is why I replaced Sleep() with Task.Delay(): it gives me something to await while still waiting for a second before returning from the request.
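For clarity, here is what that fallback looks like. Adding async without an await compiles, but the compiler warns (CS1998, "This async method lacks 'await' operators and will run synchronously"), and the method still blocks the request thread exactly like the baseline:

// Warning CS1998: this async method lacks 'await' operators and will
// run synchronously -- the request thread is still blocked for the
// full second, so nothing is gained over the baseline.
public async Task<IEnumerable<string>> Get() {
    Thread.Sleep(1000);  // nothing is awaited; no thread is freed
    return new string[] { "value1", "value2" };
}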

Keeping all of the parameters the same and running this code, requests per second improved to 97. Average response time was just over 1 second.

I wasn’t able to get any failures within 20 seconds until I set simultaneous threads to 2000. At 1900 simultaneous threads the average response time was 9 seconds and 444 requests were processed per second. So while the response time was longer, more requests were able to be processed simply by making the call asynchronous.

So, that seems to prove that making the request asynchronous improves the performance.

Even Better Performance

But wait!  There’s more.

What would it be like if we added caching onto this?

By using the CacheOutput mechanism, we gain further benefit. While it might be difficult to add this in every situation, at least for static output it is something to consider. I used the Strathweb implementation because it most closely matches what had previously been available for WebForms, and it is now available for .NET Core.

The code:

[CacheOutput(ClientTimeSpan = 50, ServerTimeSpan = 50)] // CacheOutput timespans are in seconds
public async Task<IEnumerable<string>> Get() {
    await Task.Delay(1000);
    return new string[] { "value1", "value2" };
}

While WWWS doesn’t allow us to test client-side caching, it does allow us to test server-side caching. In the code above I set the cache time to 50 seconds.

Using the same 1900 threads as before, the average response time was 6 seconds, with an average of 347 requests processed per second.

I gave up trying to max out the system at 3200 threads. At that point, results were returned on average 10 seconds after the request, with 344 requests processed per second.

If you were able to implement smart caching, so that the server only regenerated data when something had actually changed, I’m sure you could achieve similar results with data that changes more frequently than your basic lookup tables.
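As one possible sketch of that idea with the same Strathweb package: its InvalidateCacheOutput attribute evicts the cached response when a write occurs, so the cache only misses when the data has actually changed. The controller actions below are hypothetical, for illustration only:

[CacheOutput(ClientTimeSpan = 50, ServerTimeSpan = 50)]
public async Task<IEnumerable<string>> Get() {
    await Task.Delay(1000);                      // stand-in for real work
    return new string[] { "value1", "value2" };
}

// Any successful Post evicts the cached Get response, so clients see
// fresh data on the next request instead of waiting for the timeout.
[InvalidateCacheOutput("Get")]
public void Post(string value) {
    // save the new value here
}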




About Dave Bush

Dave Bush is a Full Stack ASP.NET developer focusing on ASP.NET, C#, Node.js, JavaScript, HTML, CSS, BootStrap, and Angular.JS. Does your team need additional help in any of the above? Contact Dave today.