Limit size of Queue<T> in .NET?

.Net, Collections, Queue

.Net Problem Overview


I have a Queue<T> object that I have initialised to a capacity of 2, but obviously that is just the capacity and it keeps expanding as I add items. Is there already an object that automatically dequeues an item when the limit is reached, or is the best solution to create my own inherited class?

.Net Solutions


Solution 1 - .Net

I've knocked up a basic version of what I'm looking for; it's not perfect, but it'll do the job until something better comes along.

using System.Collections.Generic;

public class LimitedQueue<T> : Queue<T>
{
	public int Limit { get; set; }

	public LimitedQueue(int limit) : base(limit)
	{
		Limit = limit;
	}

	public new void Enqueue(T item)
	{
		while (Count >= Limit)
		{
			Dequeue();
		}
		base.Enqueue(item);
	}
}
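
For example, with a limit of 2, enqueueing a third item pushes out the oldest one (values are illustrative):

var queue = new LimitedQueue<int>(2);
queue.Enqueue(1);
queue.Enqueue(2);
queue.Enqueue(3);            // 1 is dequeued automatically to make room
// queue now contains 2 and 3, and queue.Count == 2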

Solution 2 - .Net

I would recommend that you pull in the C5 Library. Unlike SCG (System.Collections.Generic), C5 is programmed to interfaces and designed to be subclassed. Most public methods are virtual and none of the classes are sealed. This way, you won't have to use that icky "new" keyword, which wouldn't be invoked if your LimitedQueue<T> were cast to an SCG.Queue<T>. With C5, and using close to the same code as you had before, you would derive from CircularQueue<T>. CircularQueue<T> actually implements both a stack and a queue, so you can get both options with a limit nearly for free. I've rewritten it below with some 3.5 constructs:

using C5;

public class LimitedQueue<T> : CircularQueue<T>
{
    public int Limit { get; set; }

    public LimitedQueue(int limit) : base(limit)
    {
        this.Limit = limit;
    }

    public override void Push(T item)
    {
        CheckLimit(false);
        base.Push(item);
    }

    public override void Enqueue(T item)
    {
        CheckLimit(true);
        base.Enqueue(item);
    }

    protected virtual void CheckLimit(bool enqueue)
    {
        while (this.Count >= this.Limit)
        {
            if (enqueue)
            {
                this.Dequeue();
            }
            else
            {
                this.Pop();
            }
        }
    }
}

I think that this code should do exactly what you were looking for.
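
For contrast, here is a minimal sketch of the method-hiding pitfall described above, using the SCG-based LimitedQueue<T> from Solution 1: when the instance is used through a Queue<T> reference, the base Enqueue runs and the limit is silently ignored.

Queue<int> q = new LimitedQueue<int>(2);   // static type is Queue<T>
q.Enqueue(1);
q.Enqueue(2);
q.Enqueue(3);                // calls Queue<T>.Enqueue, not the hiding method
// q.Count == 3 here; the limit was not enforced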

Solution 3 - .Net

Well, I hope this class helps you:
Internally, the circular FIFO buffer uses a Queue<T> with the specified size. Once the size of the buffer is reached, it replaces older items with new ones.

NOTE: You can't remove items arbitrarily. I set the method Remove(T item) to always return false. If you want, you can modify it to remove specific items.

using System;
using System.Collections;
using System.Collections.Generic;

public class CircularFIFO<T> : ICollection<T>, IDisposable
{
    public Queue<T> CircularBuffer;

    /// <summary>
    /// The default initial capacity.
    /// </summary>
    private int capacity = 32;

    /// <summary>
    /// Gets the actual capacity of the FIFO.
    /// </summary>
    public int Capacity
    {
        get { return capacity; }          
    }

    /// <summary>
    /// Initializes a new instance of the FIFO class that is empty and has the default initial capacity.
    /// </summary>
    public CircularFIFO()
    {            
        CircularBuffer = new Queue<T>();
    }

    /// <summary>
    /// Initializes a new instance of the FIFO class that is empty and has the specified initial capacity.
    /// </summary>
    /// <param name="size"> The initial capacity of the FIFO. </param>
    public CircularFIFO(int size)
    {
        capacity = size;
        CircularBuffer = new Queue<T>(capacity);
    }

    /// <summary>
    /// Adds an item to the end of the FIFO.
    /// </summary>
    /// <param name="item"> The item to add to the end of the FIFO. </param>
    public void Add(T item)
    {
        if (this.Count >= this.Capacity)
            Remove();

        CircularBuffer.Enqueue(item);
    }

    /// <summary>
    /// Adds an array of items to the end of the FIFO.
    /// </summary>
    /// <param name="item"> The array of items to add to the end of the FIFO. </param>
    public void Add(T[] item)
    {
        // Add the items one at a time; each Add evicts the oldest item once the
        // capacity is reached, so only the newest Capacity items are kept.
        for (int i = 0; i < item.Length; i++)
            Add(item[i]);
    }
    
    /// <summary>
    /// Removes and Returns an item from the FIFO.
    /// </summary>
    /// <returns> Item removed. </returns>
    public T Remove()
    {
        T removedItem = CircularBuffer.Peek();
        CircularBuffer.Dequeue();

        return removedItem;
    }

    /// <summary>
    /// Removes and returns an array of items from the FIFO.
    /// </summary>
    /// <param name="size"> The number of items to remove from the FIFO. </param>
    /// <returns> The removed array of items. </returns>
    public T[] Remove(int size)
    {
        if (size > CircularBuffer.Count)
            size = CircularBuffer.Count;

        T[] removedItems = new T[size];

        for (int i = 0; i < size; i++)
        {
            removedItems[i] = CircularBuffer.Peek();
            CircularBuffer.Dequeue();
        }

        return removedItems;
    }

    /// <summary>
    /// Returns the item at the beginning of the FIFO without removing it.
    /// </summary>
    /// <returns> The peeked item. </returns>
    public T Peek()
    {
        return CircularBuffer.Peek();
    }

    /// <summary>
    /// Returns an array of items from the beginning of the FIFO without removing them.
    /// </summary>
    /// <param name="size"> The number of items to peek. </param>
    /// <returns> Array of peeked items. </returns>
    public T[] Peek(int size)
    {
        T[] arrayItems = new T[CircularBuffer.Count];
        CircularBuffer.CopyTo(arrayItems, 0);

        if (size > CircularBuffer.Count)
            size = CircularBuffer.Count;

        T[] peekedItems = new T[size];

        Array.Copy(arrayItems, 0, peekedItems, 0, size);

        return peekedItems;
    }
            
    /// <summary>
    /// Gets the actual number of items present in the FIFO.
    /// </summary>
    public int Count
    {
        get
        {
            return CircularBuffer.Count;
        }
    }
    
    /// <summary>
    /// Removes all the contents of the FIFO.
    /// </summary>
    public void Clear()
    {
        CircularBuffer.Clear();
    }

    /// <summary>
    /// Resets and reinitializes the FIFO instance so that it is empty and has its configured capacity.
    /// </summary>
    public void Reset()
    {
        Dispose();
        CircularBuffer = new Queue<T>(capacity);
    }

    #region ICollection<T> Members
    
    /// <summary>
    /// Determines whether an element is in the FIFO.
    /// </summary>
    /// <param name="item"> The item to locate in the FIFO. </param>
    /// <returns></returns>
    public bool Contains(T item)
    {
        return CircularBuffer.Contains(item);
    }

    /// <summary>
    /// Copies the FIFO elements to an existing one-dimensional array.
    /// </summary>
    /// <param name="array"> The one-dimensional array that has at least the size of the FIFO. </param>
    /// <param name="arrayIndex"> The zero-based index in the array at which copying begins. </param>
    public void CopyTo(T[] array, int arrayIndex)
    {
        // Copy the elements starting at arrayIndex, provided the destination has enough room.
        if (array.Length - arrayIndex >= CircularBuffer.Count)
            CircularBuffer.CopyTo(array, arrayIndex);
    }

    public bool IsReadOnly
    {
        get { return false; }
    }

    public bool Remove(T item)
    {
        return false; 
    }

    #endregion

    #region IEnumerable<T> Members

    public IEnumerator<T> GetEnumerator()
    {
       return CircularBuffer.GetEnumerator();
    }

    #endregion

    #region IEnumerable Members

    IEnumerator IEnumerable.GetEnumerator()
    {
        return CircularBuffer.GetEnumerator();
    }

    #endregion
       
    #region IDisposable Members
    
    /// <summary>
    /// Releases all the resource used by the FIFO.
    /// </summary>
    public void Dispose()
    {          
        CircularBuffer.Clear();
        CircularBuffer = null;
        GC.Collect();
    }

    #endregion
}
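
A brief usage example of the class above (values are illustrative):

var fifo = new CircularFIFO<int>(3);
fifo.Add(1);
fifo.Add(2);
fifo.Add(3);
fifo.Add(4);                 // capacity reached, so 1 is removed first
// fifo now holds 2, 3, 4 and fifo.Count == 3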

Solution 4 - .Net

Concurrent Solution

using System.Collections.Concurrent;

public class LimitedConcurrentQueue<ELEMENT> : ConcurrentQueue<ELEMENT>
{
    public readonly int Limit;

    public LimitedConcurrentQueue(int limit)
    {
        Limit = limit;
    }

    public new void Enqueue(ELEMENT element)
    {
        base.Enqueue(element);
        if (Count > Limit)
        {
            TryDequeue(out ELEMENT discard);
        }
    }
}

Note: Since Enqueue controls the addition of elements and does so one at a time, there is no need to run a while loop around TryDequeue.
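
A quick single-threaded sanity check (illustrative values; note that, as with Solution 1, the new Enqueue only runs when the call is made through a LimitedConcurrentQueue reference):

var queue = new LimitedConcurrentQueue<string>(2);
queue.Enqueue("a");
queue.Enqueue("b");
queue.Enqueue("c");          // "a" is discarded after the new element is added
// queue now contains "b" and "c"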

Solution 5 - .Net

You should create your own class; a ring buffer would probably fit your needs.

The data structures in .NET that allow you to specify a capacity, except for array, use it to build the internal data structure that holds the data.

For instance, for a list, the capacity is used to size an internal array. When you start adding elements to the list, it starts filling this array from index 0 and up, and when it reaches the capacity, it grows the array to a new, higher capacity and continues filling it up.
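
For illustration, a minimal fixed-size ring buffer along those lines might look like the sketch below (the name RingBuffer<T> and its members are just for illustration, not an existing .NET type):

using System;

public class RingBuffer<T>
{
    private readonly T[] _items;   // fixed backing array, sized once
    private int _head;             // index of the oldest item
    private int _count;

    public RingBuffer(int capacity)
    {
        _items = new T[capacity];
    }

    public int Count { get { return _count; } }

    public void Enqueue(T item)
    {
        // Write just past the newest item; when the buffer is full this slot
        // holds the oldest item, which is overwritten while the head advances.
        int tail = (_head + _count) % _items.Length;
        _items[tail] = item;

        if (_count == _items.Length)
            _head = (_head + 1) % _items.Length;
        else
            _count++;
    }

    public T Dequeue()
    {
        if (_count == 0)
            throw new InvalidOperationException("Buffer is empty.");

        T item = _items[_head];
        _head = (_head + 1) % _items.Length;
        _count--;
        return item;
    }
}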

Solution 6 - .Net

Why wouldn't you just use an array with a size of 2? A queue is supposed to be able to grow and shrink dynamically.

Or create a wrapper class around an instance of Queue<T>, and each time you enqueue a T, check the size of the queue. If it is larger than 2, dequeue the first item. A minimal sketch of such a wrapper follows.
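
Here is that wrapper approach sketched out (the name FixedSizeQueue<T> is illustrative):

using System.Collections.Generic;

public class FixedSizeQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly int _limit;

    public FixedSizeQueue(int limit)
    {
        _limit = limit;
    }

    public int Count { get { return _queue.Count; } }

    public void Enqueue(T item)
    {
        _queue.Enqueue(item);

        // Check the size after each enqueue; if the limit is exceeded,
        // drop items from the front until it fits again.
        while (_queue.Count > _limit)
            _queue.Dequeue();
    }

    public T Dequeue()
    {
        return _queue.Dequeue();
    }
}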

Solution 7 - .Net

If it's of any use to anyone, I made a LimitedStack<T>.

using System.Collections.Generic;

public class LimitedStack<T>
{
    public readonly int Limit;
    private readonly List<T> _stack;

    public LimitedStack(int limit = 32)
    {
        Limit = limit;
        _stack = new List<T>(limit);
    }

    public void Push(T item)
    {
        if (_stack.Count == Limit) _stack.RemoveAt(0);
        _stack.Add(item);
    }

    public T Peek()
    {
        return _stack[_stack.Count - 1];
    }

    public void Pop()
    {
        _stack.RemoveAt(_stack.Count - 1);
    }

    public int Count
    {
        get { return _stack.Count; }
    }
}

It removes the oldest item (bottom of stack) when it gets too big.
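
For example (illustrative values):

var stack = new LimitedStack<int>(2);
stack.Push(1);
stack.Push(2);
stack.Push(3);               // 1, the bottom of the stack, is removed first
// stack.Peek() == 3 and stack.Count == 2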

(This question was the top Google result for "C# limit stack size")

Solution 8 - .Net

You can use a LinkedList<T> and add thread safety:

using System.Collections.Generic;

public class Buffer<T> : LinkedList<T>
{
	private int capacity;
	
	public Buffer(int capacity)
	{
		this.capacity = capacity;	
	}
	
	public void Enqueue(T item)
	{
		// todo: add synchronization mechanism
		if (Count == capacity) RemoveLast();
		AddFirst(item);
	}
	
	public T Dequeue()
	{
		// todo: add synchronization mechanism
		var last = Last.Value;
		RemoveLast();
		return last;
	}
}

One thing to note is that the default enumeration order will be LIFO in this example, because Enqueue adds new items at the front of the list. But that can be changed if necessary.
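
If oldest-first (FIFO) enumeration is wanted instead, one option, sketched under the assumption that it is added inside the Buffer<T> class above, is a helper that walks the nodes from Last back to First:

// Added inside Buffer<T>: enumerates oldest-to-newest (FIFO order).
// New items go to the front of the list via AddFirst, so the oldest node is Last.
public IEnumerable<T> InQueueOrder()
{
    for (var node = Last; node != null; node = node.Previous)
        yield return node.Value;
}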

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type      | Original Author       | Original Content on Stackoverflow
Question          | tags2k                | View Question on Stackoverflow
Solution 1 - .Net | tags2k                | View Answer on Stackoverflow
Solution 2 - .Net | Marcus Griep          | View Answer on Stackoverflow
Solution 3 - .Net | Robel.E               | View Answer on Stackoverflow
Solution 4 - .Net | SwiftArchitect        | View Answer on Stackoverflow
Solution 5 - .Net | Lasse V. Karlsen      | View Answer on Stackoverflow
Solution 6 - .Net | CodingWithoutComments | View Answer on Stackoverflow
Solution 7 - .Net | mpen                  | View Answer on Stackoverflow
Solution 8 - .Net | Brandon               | View Answer on Stackoverflow