In C# would it be better to use Queue.Synchronized or lock() for thread safety?

Tags: C#, Multithreading, Queue

C# Problem Overview


I have a Queue object that I need to ensure is thread-safe. Would it be better to use a lock object like this:

lock(myLockObject)
{
//do stuff with the queue
}

Or is it recommended to use Queue.Synchronized like this:

Queue.Synchronized(myQueue).whatever_i_want_to_do();

The MSDN docs say I should use Queue.Synchronized to make it thread-safe, but then they give an example using a lock object. From the MSDN article:

> To guarantee the thread safety of the Queue, all operations must be done through this wrapper only.
>
> Enumerating through a collection is intrinsically not a thread-safe procedure. Even when a collection is synchronized, other threads can still modify the collection, which causes the enumerator to throw an exception. To guarantee thread safety during enumeration, you can either lock the collection during the entire enumeration or catch the exceptions resulting from changes made by other threads.

If calling Synchronized() doesn't ensure thread-safety what's the point of it? Am I missing something here?

C# Solutions


Solution 1 - C#

Personally I always prefer locking. It means that you get to decide the granularity. If you just rely on the Synchronized wrapper, each individual operation is synchronized but if you ever need to do more than one thing (e.g. iterating over the whole collection) you need to lock anyway. In the interests of simplicity, I prefer to just have one thing to remember - lock appropriately!
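As a sketch of what "lock appropriately" means for a multi-step operation like enumeration (reusing the question's `myLockObject`; the queue contents here are illustrative):

```csharp
using System;
using System.Collections.Generic;

var queue = new Queue<string>();
var myLockObject = new object();

// Individual operations go under the lock...
lock (myLockObject)
{
    queue.Enqueue("a");
    queue.Enqueue("b");
}

// ...and the SAME lock is held for the entire enumeration, because
// per-operation synchronization cannot protect a multi-step iteration.
lock (myLockObject)
{
    foreach (string item in queue)
    {
        Console.WriteLine(item);
    }
}
```

The key point is that one lock object guards both the single operations and the whole-collection operations, at whatever granularity you choose.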

EDIT: As noted in comments, if you can use higher level abstractions, that's great. And if you do use locking, be careful with it - document what you expect to be locked where, and acquire/release locks for as short a period as possible (more for correctness than performance). Avoid calling into unknown code while holding a lock, avoid nested locks etc.

In .NET 4 there's a lot more support for higher-level abstractions (including lock-free code). Either way, I still wouldn't recommend using the synchronized wrappers.
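For example, `ConcurrentQueue<T>` in `System.Collections.Concurrent` (available from .NET 4) folds the check-then-act pattern into single atomic methods, so no manual locking is needed for the basic operations:

```csharp
using System;
using System.Collections.Concurrent;

var queue = new ConcurrentQueue<int>();
queue.Enqueue(1);
queue.Enqueue(2);

// TryDequeue combines the emptiness check and the dequeue into one
// atomic operation, so there is no check-then-act race to lock around.
if (queue.TryDequeue(out int item))
{
    Console.WriteLine(item); // prints 1
}
```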

Solution 2 - C#

There's a major problem with the Synchronized methods in the old collection library, in that they synchronize at too low a level of granularity (per method rather than per unit-of-work).

There's a classic race condition with a synchronized queue, shown below where you check the Count to see if it is safe to dequeue, but then the Dequeue method throws an exception indicating the queue is empty. This occurs because each individual operation is thread-safe, but the value of Count can change between when you query it and when you use the value.

object item;
if (queue.Count > 0)
{
    // at this point another thread dequeues the last item, and then
    // the next line will throw an InvalidOperationException...
    item = queue.Dequeue();
}

You can safely write this using a manual lock around the entire unit-of-work (i.e. checking the count and dequeueing the item) as follows:

object item;
lock (queue)
{
    if (queue.Count > 0)
    {
        item = queue.Dequeue();
    }
}

Since you can't even safely dequeue from a synchronized queue, I wouldn't bother with it and would just use manual locking.

.NET 4.0 should have a whole bunch of properly implemented thread-safe collections, but that's still nearly a year away unfortunately.

Solution 3 - C#

There's frequently a tension between demands for 'thread safe collections' and the requirement to perform multiple operations on the collection in an atomic fashion.

So Synchronized() gives you a collection which won't smash itself up if multiple threads add items to it simultaneously, but it doesn't magically give you a collection that knows that during an enumeration, nobody else must touch it.

As well as enumeration, common operations like "is this item already in the queue? No, then I'll add it" also require synchronisation which is wider than just the queue.
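A minimal sketch of that "check then add" case (the `queueLock` object and `EnqueueIfAbsent` helper are illustrative names, not part of any library):

```csharp
using System;
using System.Collections.Generic;

var queue = new Queue<int>();
var queueLock = new object();

void EnqueueIfAbsent(int value)
{
    // Contains and Enqueue must happen under the same lock; otherwise
    // another thread could add an equal value between the two calls.
    lock (queueLock)
    {
        if (!queue.Contains(value))
        {
            queue.Enqueue(value);
        }
    }
}

EnqueueIfAbsent(42);
EnqueueIfAbsent(42); // second call is a no-op
Console.WriteLine(queue.Count); // prints 1
```

A per-method Synchronized wrapper cannot express this, because the atomicity the caller needs spans two method calls.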

Solution 4 - C#

With a double-check, we avoid taking the lock just to find out the queue was empty:

object item;
if (queue.Count > 0)
{
  lock (queue)
  {
    if (queue.Count > 0)
    {
       item = queue.Dequeue();
    }
  }
}

Solution 5 - C#

It seems clear to me that using a lock(...) { ... } block is the right answer.

>To guarantee the thread safety of the Queue, all operations must be done through this wrapper only.

If other threads access the queue without using .Synchronized(), then you'll be up a creek - unless all access to the queue goes through the same lock.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
|---|---|---|
| Question | Jon Tackabury | View Question on Stackoverflow |
| Solution 1 - C# | Jon Skeet | View Answer on Stackoverflow |
| Solution 2 - C# | Greg Beech | View Answer on Stackoverflow |
| Solution 3 - C# | Will Dean | View Answer on Stackoverflow |
| Solution 4 - C# | Leonidius | View Answer on Stackoverflow |
| Solution 5 - C# | Greg Hurlman | View Answer on Stackoverflow |