Can tasks always run in parallel in ASP.Net WebAPI applications?

This is the first post on this blog, and the topic is an interesting one.

A couple of days ago, I did a code review for one of our developers, where I made a comment about how tasks are scheduled in ASP.Net applications (not ASP.Net Core). I claimed in the code review that multiple tasks cannot run in parallel because of the special AspNetSynchronizationContext. Today I was asked the same question by another developer, and I later figured out I was not 100% correct. So I decided to spend some time digging a bit deeper into the magic synchronization context and post what I have found.

The source code of AspNetSynchronizationContext can be found here. For those who are not familiar with synchronization contexts, I would recommend you read this stackoverflow post before continuing with the rest of the content.
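As background, here is a minimal sketch (my own toy example, not ASP.Net code) of the mechanism that matters for this post: `await` captures `SynchronizationContext.Current`, and the continuation after the await is delivered back to that context via its `Post` method. This toy context simply counts the callbacks it receives.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// A toy context that counts how many continuations are posted to it.
public class CountingContext : SynchronizationContext
{
    public int Posted; // number of continuations delivered via Post

    public override void Post(SendOrPostCallback d, object state)
    {
        Interlocked.Increment(ref Posted);
        // Just run the continuation on the thread pool; a real context
        // (like ASP.Net's) would marshal it somewhere specific instead.
        ThreadPool.QueueUserWorkItem(_ => d(state));
    }
}

public class Demo
{
    public static async Task Work()
    {
        // The continuation after this await is routed through Post,
        // because the awaiting thread has a current SynchronizationContext.
        await Task.Delay(100);
    }

    public static void Main()
    {
        var ctx = new CountingContext();
        SynchronizationContext.SetSynchronizationContext(ctx);
        Work().GetAwaiter().GetResult();
        Console.WriteLine($"Continuations posted: {ctx.Posted}");
    }
}
```

The single `await` in `Work` produces exactly one `Post` call, which is the hook AspNetSynchronizationContext uses to control how continuations are scheduled.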

To run some tests, I wrote a very simple synchronization context class which schedules tasks the same way AspNetSynchronizationContext does:
// A synchronization context which simulates AspNetSynchronizationContext.
class SimpleAspNetContext : SynchronizationContext
{
    private Task _lastTask = Task.CompletedTask;
    private readonly object _lock = new object();
    
    public override void Post(SendOrPostCallback d, object state)
    {
        lock (_lock)
        {
            // Tasks are chained so that the context guarantees first come first complete.
            // This also guarantees that a new task can be scheduled after the previous task has been completed.
            var newTask = _lastTask.ContinueWith(_ => RunCallback(() => d(state)), TaskScheduler.Default);
            _lastTask = newTask;
        }
    }

    private void RunCallback(Action action)
    {
        // Here the action gets executed in a blocking manner, so if the action is blocking,
        // the continuation created in the Post method is blocked.
        action();
    }
}
So, similar to AspNetSynchronizationContext, my dummy version also uses task continuations to chain tasks together, so that it guarantees "first come, first complete" and no task overlapping. I believe the ASP.Net version was implemented that way because it needs to restore HttpContext.Current and other thread-static variables.

The interesting point in the Post method of the synchronization context is that the callback is invoked synchronously in a newly created task continuation on a thread pool thread. That means the callback runs synchronously until it hits its first await point, where it yields the thread back to the thread pool, giving another task a chance to start running. Therefore, in theory, a blocking task can block other tasks from running until it completes.

To prove that, I made some tests:
async Task Main()
{
    var sw = new Stopwatch();
    
    // Separate non-blocking.
    sw.Restart();
    await SeparateAwait(false);
    sw.Stop();
    $"Separate non-blocking: {sw.Elapsed}".Dump();

    // Separate blocking.
    sw.Restart();
    await SeparateAwait(true);
    sw.Stop();
    $"Separate blocking: {sw.Elapsed}".Dump();

    // WhenAll non-blocking.
    sw.Restart();
    await WhenAll(false);
    sw.Stop();
    $"WhenAll non-blocking: {sw.Elapsed}".Dump();

    // WhenAll blocking.
    sw.Restart();
    await WhenAll(true);
    sw.Stop();
    $"WhenAll blocking: {sw.Elapsed}".Dump();
}

// Do something. If isBlocking is true, the task is blocking, otherwise the task is non-blocking.
private async Task DoTask(bool isBlocking)
{
    await Task.Yield();

    if (isBlocking)
    {
        Thread.Sleep(5000);
    }
    else
    {
        await Task.Delay(5000);
    }
}

// A test where 3 tasks are created and each of which is awaited separately.
private async Task SeparateAwait(bool isBlocking)
{
    SynchronizationContext.SetSynchronizationContext(new SimpleAspNetContext());
    
    await DoTask(isBlocking);
    await DoTask(isBlocking);
    await DoTask(isBlocking);
}

// A test where 3 tasks are created and all 3 tasks are awaited collectively.
private async Task WhenAll(bool isBlocking)
{
    SynchronizationContext.SetSynchronizationContext(new SimpleAspNetContext());
    
    var task1 = DoTask(isBlocking);
    var task2 = DoTask(isBlocking);
    var task3 = DoTask(isBlocking);
    
    await Task.WhenAll(task1, task2, task3);
    
    await task1;
    await task2;
    await task3;
}
Separate non-blocking: 00:00:15.0177434
Separate blocking: 00:00:15.0019056
WhenAll non-blocking: 00:00:05.0065793
WhenAll blocking: 00:00:15.0014165
The first two tests await tasks separately, whereas the last two tests await tasks collectively. Each group tests both blocking and non-blocking tasks.

The result of the tests was interesting: the only test where tasks were seemingly running in parallel was the one where all 3 tasks were non-blocking and awaited collectively.

Back to my original intent. The code review comment was about whether kicking off multiple HTTP requests and awaiting them jointly could improve performance. Based on the test results, for a single request the answer seems to be a definite yes, since HTTP requests are non-blocking tasks. For the application as a whole, however, the answer is still to be determined, because the performance gain comes at the cost of server resources (e.g. TCP connections), but that is beyond the scope of this post.
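For completeness, here is a minimal sketch of the pattern the code review was about: starting several HTTP requests first and then awaiting them jointly, so the I/O overlaps instead of running back to back. This is a standalone console-app sketch, and the URLs are placeholders, not real endpoints.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static readonly HttpClient Client = new HttpClient();

    static async Task Main()
    {
        // Kick off all requests before awaiting any of them.
        Task<string> a = Client.GetStringAsync("https://example.com/a");
        Task<string> b = Client.GetStringAsync("https://example.com/b");
        Task<string> c = Client.GetStringAsync("https://example.com/c");

        // Awaiting collectively lets all three requests be in flight at once,
        // like the "WhenAll non-blocking" test above.
        string[] bodies = await Task.WhenAll(a, b, c);
        Console.WriteLine($"Fetched {bodies.Length} responses");
    }
}
```

Note that each in-flight request still consumes a connection from the pool, which is exactly the resource trade-off mentioned above.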
