Do asynchronous operations in ASP.NET MVC use a thread from ThreadPool on .NET 4

C# | .NET | ASP.NET MVC | ASP.NET MVC 3 | Asynchronous

C# Problem Overview


> After this question, I feel comfortable using async operations in ASP.NET MVC. So, I wrote two blog posts on that:
>
> - My Take on Task-based Asynchronous Programming in C# 5.0 and ASP.NET MVC Web Applications
> - Asynchronous Database Calls With Task-based Asynchronous Programming Model (TAP) in ASP.NET MVC 4

I have many misunderstandings about asynchronous operations in ASP.NET MVC.

I always hear this sentence: "An application can scale better if operations run asynchronously."

And I have heard this kind of sentence a lot as well: "If you have a huge volume of traffic, you may be better off not performing your queries asynchronously - consuming 2 extra threads to service one request takes resources away from other incoming requests."

I think those two sentences are inconsistent.

I do not have much information about how the thread pool works in ASP.NET, but I know that it has a limited number of threads. So, the second sentence has to be related to this limit.
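As an aside, a minimal console sketch (not part of the original question, purely illustrative) can show those limits directly via ThreadPool.GetMaxThreads and ThreadPool.GetAvailableThreads:

using System;
using System.Threading;

class ThreadPoolLimits
{
    static void Main()
    {
        int maxWorkers, maxIocp;
        int freeWorkers, freeIocp;

        // Ask the runtime for the pool's configured ceilings and what is currently free.
        ThreadPool.GetMaxThreads(out maxWorkers, out maxIocp);
        ThreadPool.GetAvailableThreads(out freeWorkers, out freeIocp);

        Console.WriteLine("Max worker threads: " + maxWorkers);
        Console.WriteLine("Max I/O completion threads: " + maxIocp);
        Console.WriteLine("Available worker threads: " + freeWorkers);
        Console.WriteLine("Available I/O completion threads: " + freeIocp);
    }
}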

And I would like to know whether asynchronous operations in ASP.NET MVC use a thread from the ThreadPool on .NET 4.

For example, when we implement an AsyncController, how is the app structured? If I get heavy traffic, is it a good idea to implement an AsyncController?

Is there anybody out there who can take this black curtain away from in front of my eyes and explain the deal about asynchrony in ASP.NET MVC 3 (.NET 4)?

Edit:

I have read the document below nearly a hundred times and I understand the main idea, but I am still confused because there are too many inconsistent comments out there.

Using an Asynchronous Controller in ASP.NET MVC

Edit:

Let's assume I have a controller action like the one below (not an implementation of AsyncController, though):

public ViewResult Index() {

    Task.Factory.StartNew(() => {
        //Do some advanced logging here which takes a while
    });

    return View();
}

As you see here, I fire off an operation and forget about it. Then, I return immediately without waiting for it to complete.

In this case, does this have to use a thread from the thread pool? If so, after it completes, what happens to that thread? Does the GC come in and clean up right after it completes?

Edit:

In response to @Darin's answer, here is a sample of async code which talks to the database:

public class FooController : AsyncController {

    //EF 4.2 DbContext instance
    MyContext _context = new MyContext();

    public void IndexAsync() {

        AsyncManager.OutstandingOperations.Increment(3);

        Task<IEnumerable<Foo>>.Factory.StartNew(() => {
            return _context.Foos;
        }).ContinueWith(t => {
            AsyncManager.Parameters["foos"] = t.Result;
            AsyncManager.OutstandingOperations.Decrement();
        });

        Task<IEnumerable<Bar>>.Factory.StartNew(() => {
            return _context.Bars;
        }).ContinueWith(t => {
            AsyncManager.Parameters["bars"] = t.Result;
            AsyncManager.OutstandingOperations.Decrement();
        });

        Task<IEnumerable<FooBar>>.Factory.StartNew(() => {
            return _context.FooBars;
        }).ContinueWith(t => {
            AsyncManager.Parameters["foobars"] = t.Result;
            AsyncManager.OutstandingOperations.Decrement();
        });
    }

    public ViewResult IndexCompleted(
        IEnumerable<Foo> foos,
        IEnumerable<Bar> bars,
        IEnumerable<FooBar> foobars) {

        //Do the regular stuff and return
        return View();
    }
}

C# Solutions


Solution 1 - C#

Here's an excellent article I would recommend you read to better understand asynchronous processing in ASP.NET (which is what asynchronous controllers basically represent).

Let's first consider a standard synchronous action:

public ActionResult Index()
{
    // some processing
    return View();
}

When a request is made to this action, a thread is drawn from the thread pool and the body of this action is executed on that thread. So if the processing inside this action is slow, you are blocking this thread for the entire processing, and the thread cannot be reused to process other requests. At the end of the request, the thread is returned to the thread pool.

Now let's take an example of the asynchronous pattern:

public void IndexAsync()
{
    // perform some processing
}

public ActionResult IndexCompleted(object result)
{
    return View();
}

When a request is sent to the Index action, a thread is drawn from the thread pool and the body of the IndexAsync method is executed. Once the body of this method finishes executing, the thread is returned to the thread pool. Then, using the standard AsyncManager.OutstandingOperations, once you signal the completion of the async operation, another thread is drawn from the thread pool and the body of the IndexCompleted action is executed on it and the result rendered to the client.

So what we can see in this pattern is that a single client HTTP request could be executed by two different threads.

Now the interesting part happens inside the IndexAsync method. If you have a blocking operation inside it, you are completely defeating the purpose of asynchronous controllers, because you are blocking the worker thread (remember that the body of this action is executed on a thread drawn from the thread pool).

So when can we take real advantage of asynchronous controllers you might ask?

IMHO we can gain most when we have I/O intensive operations (such as database and network calls to remote services). If you have a CPU intensive operation, asynchronous actions won't bring you much benefit.

So why can we gain benefit from I/O intensive operations? Because we can use I/O Completion Ports. IOCPs are extremely powerful because you do not consume any threads or resources on the server during the execution of the entire operation.

How do they work?

Suppose that we want to download the contents of a remote web page using the WebClient.DownloadStringAsync method. You call this method which will register an IOCP within the operating system and return immediately. During the processing of the entire request, no threads are consumed on your server. Everything happens on the remote server. This could take lots of time but you don't care as you are not jeopardizing your worker threads. Once a response is received the IOCP is signaled, a thread is drawn from the thread pool and the callback is executed on this thread. But as you can see, during the entire process, we have not monopolized any threads.
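To make that concrete, here is a minimal sketch of the idea in an asynchronous controller (the URL is only an example, and WebClient lives in the System.Net namespace); no worker thread is blocked while the download is in flight:

public class PageController : AsyncController
{
    public void IndexAsync()
    {
        AsyncManager.OutstandingOperations.Increment();

        var client = new WebClient();
        client.DownloadStringCompleted += (sender, e) =>
        {
            // This callback runs on a pool thread only once the response has arrived.
            AsyncManager.Parameters["html"] = e.Result;
            AsyncManager.OutstandingOperations.Decrement();
        };

        // Registers the I/O operation and returns immediately;
        // the worker thread goes back to the pool while the download runs.
        client.DownloadStringAsync(new Uri("http://example.com"));
    }

    public ActionResult IndexCompleted(string html)
    {
        return Content(html);
    }
}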

The same holds true for methods such as FileStream.BeginRead, SqlCommand.BeginExecuteReader, and so on.

What about parallelizing multiple database calls? Suppose that you had a synchronous controller action in which you performed 4 blocking database calls in sequence. It's easy to calculate that if each database call takes 200ms, your controller action will take roughly 800ms to execute.

If you don't need to run those calls sequentially, would parallelizing them improve performance?

That's the big question, which is not easy to answer. Maybe yes, maybe no. It will entirely depend on how you implement those database calls. If you use async controllers and I/O Completion Ports as discussed previously you will boost the performance of this controller action and of other actions as well, as you won't be monopolizing worker threads.

On the other hand, if you implement them poorly (with blocking database calls performed on threads drawn from the thread pool), you will lower the total execution time of this action to roughly 200ms, but you will have consumed 4 worker threads, so you might degrade the performance of other requests, which may end up starved because there are no free threads left in the pool to process them.
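For illustration only, the poorly implemented variant might look like the sketch below (the repository calls and the view model are hypothetical): the action finishes in roughly the time of the slowest call, but four extra worker threads are blocked, and the request's own thread waits on them as well.

public ActionResult Index()
{
    // Each StartNew draws a worker thread from the pool and blocks it
    // for the duration of a synchronous (blocking) database call.
    var foosTask    = Task.Factory.StartNew(() => repository.GetFoos());
    var barsTask    = Task.Factory.StartNew(() => repository.GetBars());
    var foobarsTask = Task.Factory.StartNew(() => repository.GetFooBars());
    var bazsTask    = Task.Factory.StartNew(() => repository.GetBazs());

    // The request's own thread now blocks as well until all four complete.
    Task.WaitAll(foosTask, barsTask, foobarsTask, bazsTask);

    // Five threads were occupied just to shave latency off a single request.
    return View(new IndexViewModel(
        foosTask.Result, barsTask.Result, foobarsTask.Result, bazsTask.Result));
}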

So it is very difficult and if you don't feel ready to perform extensive tests on your application, do not implement asynchronous controllers, as chances are that you will do more damage than benefit. Implement them only if you have a reason to do so: for example you have identified that standard synchronous controller actions are a bottleneck to your application (after performing extensive load tests and measurements of course).

Now let's consider your example:

public ViewResult Index() {

    Task.Factory.StartNew(() => {
        //Do some advanced logging here which takes a while
    });

    return View();
}

When a request is received for the Index action, a thread is drawn from the thread pool to execute its body, but its body only schedules a new task using the TPL. So the action execution ends and the thread is returned to the thread pool. Except that the TPL uses threads from the thread pool to perform its processing. So even though the original thread was returned to the pool, you have drawn another thread from this pool to execute the body of the task. So you have jeopardized 2 threads from your precious pool.

Now let's consider the following:

public ViewResult Index() {

    new Thread(() => {
        //Do some advanced logging here which takes a while
    }).Start();

    return View();
}

In this case we are manually spawning a thread. The execution of the body of the Index action might take slightly longer (because spawning a new thread is more expensive than drawing one from an existing pool), but the advanced logging operation is performed on a thread that is not part of the pool. So we are not jeopardizing threads from the pool, which remain free to serve other requests.

Solution 2 - C#

Yes - all threads come from the thread pool. Your MVC app is already multi-threaded; when a request comes in, a new thread is taken from the pool and used to service the request. That thread is 'locked' (away from other requests) until the request is fully serviced and completed. If there is no thread available in the pool, the request has to wait until one becomes available.

If you have async controllers, they still get a thread from the pool, but while servicing the request they can give up the thread while waiting for something to happen (and that thread can be given to another request); when the original request needs a thread again, it gets one from the pool.

The difference is that if you have a lot of long-running requests (where the thread is waiting for a response from something), you might run out of threads in the pool to service even basic requests. If you have async controllers, you don't gain any more threads, but the threads that would otherwise be waiting are returned to the pool and can service other requests.

A nearly real-life example... Think of it like getting on a bus: there are five people waiting to get on. The first gets on, pays and sits down (the driver serviced their request). You get on (the driver is servicing your request) but you can't find your money; as you fumble in your pockets, the driver gives up on you and gets the next two people on (servicing their requests). When you find your money, the driver starts dealing with you again (completing your request). The fifth person has to wait until you are done, but the third and fourth people got served while you were halfway through being served. In this picture, the driver is the one and only thread from the pool and the passengers are the requests. It would be too complicated to describe how it would work with two drivers, but you can imagine...

Without an async controller, the passengers behind you would have to wait ages while you looked for your money, and meanwhile the bus driver would be doing no work.

So the conclusion is: if lots of people don't know where their money is (i.e. take a long time to respond to something the driver has asked), async controllers could well improve the throughput of requests, speeding things up for some. Without an async controller, everyone waits until the person in front has been completely dealt with. BUT don't forget that in MVC you have a lot of bus drivers on a single bus, so async is not an automatic choice.

Solution 3 - C#

There are two concepts at play here. First of all, we can make our code run in parallel so it executes faster, or we can schedule code on another thread to avoid making the user wait. The example you had

public ViewResult Index() {

    Task.Factory.StartNew(() => {
        //Do some advanced logging here which takes a while
    });

    return View();
}

belongs to the second category. The user will get a faster response, but the total workload on the server is higher because it has to do the same work plus handle the threading.

Another example of this would be:

public ViewResult Index() {

    var twitterTask = Task.Factory.StartNew(() => {
        //Make web request to twitter with the blocking WebClient.DownloadString()
    });

    var facebookTask = Task.Factory.StartNew(() => {
        //Make web request to facebook with the blocking WebClient.DownloadString()
    });

    //wait for both to be ready and merge the results
    Task.WaitAll(twitterTask, facebookTask);

    return View();
}

Because the requests run in parallel, the user won't have to wait as long as if they were made serially. But you should realize that we use up more resources here than if we ran serially, because we run the code on several threads while we also have one thread waiting.

This is perfectly fine in a client scenario. It is quite common there to wrap synchronous long-running code in a new task (run it on another thread) to keep the UI responsive, or to parallelize to make it faster. A thread is still used for the whole duration, though. On a server with high load this can backfire because you actually use more resources. This is what people have warned you about.

Async controllers in MVC have another goal, though. The point here is to avoid having threads sitting around doing nothing (which hurts scalability). It really only matters if the APIs you are calling have async methods, like WebClient.DownloadStringAsync().

The point is that your thread can be returned to handle new requests until the web request is finished, at which point your callback is invoked; it gets the same or a new thread and finishes the request.

I hope you understand the difference between asynchronous and parallel. Think of parallel code as code where your thread sits around and waits for the result, while asynchronous code is code where you are notified when the work is done and can get back to it; in the meantime the thread can do other work.
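A compact console sketch of that distinction (illustrative only; ComputeReport just simulates slow work): in the first case a thread sits blocked on Result, in the second a continuation is scheduled and no thread waits in between.

using System;
using System.Threading;
using System.Threading.Tasks;

class ParallelVsAsync
{
    // Stand-in for some slow work; not from the original answer.
    static int ComputeReport()
    {
        Thread.Sleep(1000);
        return 42;
    }

    static void Main()
    {
        // Parallel-but-blocking: the main thread sits and waits for the result.
        var blocking = Task.Factory.StartNew(() => ComputeReport());
        Console.WriteLine("Blocking result: " + blocking.Result);   // Result blocks this thread

        // Asynchronous: register a continuation and keep going; a pool thread
        // picks the continuation up only once the task has finished.
        var notified = Task.Factory.StartNew(() => ComputeReport())
                           .ContinueWith(t => Console.WriteLine("Continuation result: " + t.Result));

        Console.WriteLine("Main thread is free to do other work here...");
        notified.Wait();   // only so the console app does not exit early
    }
}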

Solution 4 - C#

Applications can scale better if operations run asynchronously, but only if there are resources available to service the additional operations.

Asynchronous operations ensure that you're never blocking an action because an existing one is in progress. ASP.NET has an asynchronous model that allows multiple requests to execute side by side. It would be possible to queue the requests up and process them FIFO, but this would not scale well when you have hundreds of requests queued and each request takes 100ms to process.

If you have a huge volume of traffic, you may be better off not performing your queries asynchronously, as there may be no additional resources to service the requests. If there are no spare resources, your requests are forced to queue up, take exponentially longer, or fail outright, in which case the asynchronous overhead (mutexes and context switches) isn't buying you anything.

As far as ASP.NET goes, you don't have a choice - it uses an asynchronous model, because that's what makes sense for the client-server model. If you were to write your own code internally using an async pattern to try to scale better, then unless you're managing a resource shared between all requests, you won't actually see any improvement, because requests are already wrapped in an asynchronous process that doesn't block anything else.

Ultimately, it's all subjective until you actually look at what's causing a bottleneck in your system. Sometimes it's obvious where an asynchronous pattern will help (by preventing a queued resource from blocking). In the end, only measuring and analysing the system can show where you can gain efficiencies.

Edit:

In your example, the Task.Factory.StartNew call will queue an operation on the .NET thread pool. Thread pool threads are re-used by design (to avoid the cost of creating/destroying lots of threads). Once the operation completes, the thread is released back to the pool to be re-used by another request (the garbage collector doesn't actually get involved unless you created some objects in your operation, in which case they're collected as per normal scoping rules).
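A small console sketch (illustrative, not from the original answer) shows Task.Factory.StartNew work landing on pool threads and those threads being re-used rather than destroyed:

using System;
using System.Threading;
using System.Threading.Tasks;

class PoolReuseDemo
{
    static void Main()
    {
        for (int i = 0; i < 5; i++)
        {
            Task.Factory.StartNew(() =>
            {
                // The same managed thread ids tend to repeat, showing reuse,
                // and IsThreadPoolThread confirms where the work runs.
                Console.WriteLine("Thread {0}, pool thread: {1}",
                    Thread.CurrentThread.ManagedThreadId,
                    Thread.CurrentThread.IsThreadPoolThread);
            }).Wait();   // wait so each task finishes before the next starts
        }
    }
}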

As far as ASP.NET is concerned, there is no special operation here. The ASP.NET request completes without respect to the asynchronous task. The only concern might be if your thread pool is saturated (i.e. there are no threads available to service the request right now and the pool's settings don't allow more threads to be created), in which case the request is blocked waiting to start the task until a pool thread becomes available.

Solution 5 - C#

Yes, they use a thread from the thread pool. There is actually a pretty excellent guide from MSDN that will tackle all of your questions and more. I have found it to be quite useful in the past. Check it out!

http://msdn.microsoft.com/en-us/library/ee728598.aspx

Meanwhile, the comments + suggestions that you hear about asynchronous code should be taken with a grain of salt. For starters, just making something async doesn't necessarily make it scale better, and in some cases can make your application scale worse. The other comment you posted about "a huge volume of traffic..." is also only correct in certain contexts. It really depends on what your operations are doing, and how they interact with other parts of the system.

In short, lots of people have lots of opinions about async, but they may not be correct out of context. I'd say focus on your exact problems, and do basic performance testing to see what async controllers, etc. actually handle with your application.

Solution 6 - C#

First of all, it is not MVC but IIS that maintains the thread pool. So any request which comes to an MVC or ASP.NET application is served by threads maintained in that thread pool. Only by making the app asynchronous is the action invoked on a different thread, and the original thread is released immediately so that other requests can be served.

I have explained the same in a detailed video (http://www.youtube.com/watch?v=wvg13n5V0V0 "MVC async controllers and thread starvation"), which shows how thread starvation happens in MVC and how it is minimized by using MVC async controllers. I have also measured the request queues using perfmon, so you can see how the request queue shrinks with MVC async and how bad it is for synchronous operations.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | tugberk | View Question on Stackoverflow
Solution 1 - C# | Darin Dimitrov | View Answer on Stackoverflow
Solution 2 - C# | K. Bob | View Answer on Stackoverflow
Solution 3 - C# | Mikael Eliasson | View Answer on Stackoverflow
Solution 4 - C# | Paul Turner | View Answer on Stackoverflow
Solution 5 - C# | A.R. | View Answer on Stackoverflow
Solution 6 - C# | Shivprasad Koirala | View Answer on Stackoverflow