C# collections optimized for inserting at index 0?

Which C# collections are best optimized for inserting at index 0 as well as adding at the end?

My first thought is a plain LinkedList, but since I'm working with large byte buffers I suspect the per-node memory cost would be too expensive for my needs.

I understand that a normal List<T> has a backing buffer whose capacity is usually larger than the actual data. When the data fills the buffer, a new, larger buffer is allocated and the contents of the old one are copied into it. This is great for adding at the end, but terrible for inserting at the beginning, since every insert has to shift all existing elements. My tests, adding/inserting a million random numbers into a List:

  • List Add time: 0.014 sec
  • List Insert-at-zero time: 173.343 sec
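For reference, a minimal sketch of the kind of test that produces numbers like these (an illustrative harness, not the asker's actual code):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class InsertBenchmark
{
    static void Main()
    {
        const int N = 1000000;
        var rng = new Random(42);

        // Adding at the end is amortized O(1).
        var addList = new List<int>();
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < N; i++)
            addList.Add(rng.Next());
        Console.WriteLine($"Add at end:        {sw.Elapsed.TotalSeconds:F3} s");

        // Inserting at index 0 shifts every existing element: O(n) per call, O(n^2) total.
        var insertList = new List<int>();
        sw.Restart();
        for (int i = 0; i < N; i++)
            insertList.Insert(0, rng.Next());
        Console.WriteLine($"Insert at index 0: {sw.Elapsed.TotalSeconds:F3} s");
    }
}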
+4
5 answers

My solution, as Eric Lippert hinted: keep the actual data floating in the middle of the backing buffer rather than at the start. However, it is still a list, not a queue; you can still add/remove/replace anywhere.

using System;
using System.Collections.Generic;

public class BList<T> : IList<T>, IReadOnlyList<T>
{
    private const int InitialCapacity = 16;
    private const int InitialOffset = 8;

    private int _size;
    private int _offset;
    private int _capacity;
    private int _version;

    private T[] _items;

    public BList()
    {
        _version = 0;
        _size = 0;
        _offset = InitialOffset;
        _capacity = InitialCapacity;
        _items = new T[InitialCapacity];
    }

    public BList(int initialCapacity)
    {
        _size = 0;
        _version = 0;
        _offset = initialCapacity/2;
        _capacity = initialCapacity;
        _items = new T[initialCapacity];
    }

    public void Insert(int insertIndex, T item)
    {
        if (insertIndex < 0 || insertIndex > _size)
            throw new ArgumentOutOfRangeException(nameof(insertIndex));

        // Extra slots needed on each side of the occupied region for this insert.
        var padRight = Math.Max(0, (insertIndex == 0) ? 0 : (insertIndex + 1) - _size);
        var padLeft = insertIndex == 0 ? 1 : 0;

        // Resize only when the occupied region would run off either end of the buffer.
        var requiresResize = _offset - padLeft <= 0 ||
                             _offset + _size + padRight >= _capacity;
        if (requiresResize)
        {
            var newSize = _size + 1;
            var newCapacity = Math.Max(newSize, _capacity * 2);
            var newOffset = (newCapacity / 2) - (newSize / 2) - padLeft;
            var newItems = new T[newCapacity];

            // Copy the elements before the insertion point, leave a one-slot gap,
            // then copy the elements after the insertion point.
            Array.Copy(_items, _offset, newItems, newOffset, insertIndex);
            Array.Copy(_items, _offset + insertIndex, newItems, newOffset + insertIndex + 1, _size - insertIndex);

            newItems[newOffset + insertIndex] = item;

            _items = newItems;
            _offset = newOffset;
            _size = newSize;
            _capacity = newCapacity;
        }
        else
        {
            if (insertIndex == 0)
                _offset = _offset - 1;   // grow toward the front: just move the offset, no copying
            else if (insertIndex < _size)
                Array.Copy(_items, _offset + insertIndex, _items, _offset + insertIndex + 1, _size - insertIndex);

            _items[_offset + insertIndex] = item;

            _size = _size + 1;
        }

        _version++;
    }

    // Indexer, Add, RemoveAt and the remaining IList<T> / IReadOnlyList<T> members
    // are omitted here; see the full code link below.
}

Full code

Insert test, item count: 131072

[benchmark chart from the original post omitted]
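A hypothetical usage sketch of the BList<T> above (driver code written for illustration, not part of the answer):

var list = new BList<int>(1024);

// Repeated front inserts just move the internal offset until a resize is needed.
for (int i = 0; i < 131072; i++)
    list.Insert(0, i);

// Middle inserts still work, but shift elements like a normal list.
list.Insert(5, -1);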

+2

The data structure that is list-like but optimized for insertion and deletion at both ends, though not in the middle, is called a deque, which is short for "double-ended queue". I don't know whether the standard library contains a deque these days.

If you are interested in techniques for constructing immutable deques, I suggest reading about them in order of increasing complexity.

A mutable deque, by contrast, is usually built over a plain array treated as a circular "ring" buffer: the logical start of the collection can move within (and wrap around) the backing array, so pushing or popping at either end never shifts the other elements. When the buffer fills up, a larger array is allocated and the contents are copied across, just as List<T> does when it grows, so insertion at both ends is amortized O(1).
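A minimal sketch of that approach, assuming an array used as a circular buffer (the type and its members are illustrative, not from the answer):

using System;

// Ring-buffer deque: PushFront/PushBack are amortized O(1) because the head
// index wraps around the array instead of shifting elements.
public class RingDeque<T>
{
    private T[] _buffer = new T[8];
    private int _head;    // index of the first element
    private int _count;

    public int Count => _count;

    public T this[int index] => _buffer[(_head + index) % _buffer.Length];

    public void PushFront(T item)
    {
        EnsureCapacity();
        _head = (_head - 1 + _buffer.Length) % _buffer.Length;
        _buffer[_head] = item;
        _count++;
    }

    public void PushBack(T item)
    {
        EnsureCapacity();
        _buffer[(_head + _count) % _buffer.Length] = item;
        _count++;
    }

    private void EnsureCapacity()
    {
        if (_count < _buffer.Length) return;
        var bigger = new T[_buffer.Length * 2];
        for (int i = 0; i < _count; i++)      // unwrap the old contents into the new array
            bigger[i] = _buffer[(_head + i) % _buffer.Length];
        _buffer = bigger;
        _head = 0;
    }
}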

+6

Another option is to implement IList<T> on top of two lists: one holding the front portion of the collection (stored in reverse order) and one holding the back portion (stored in normal order). Inserting at index 0 then becomes an Add on the front list, and adding at the end an Add on the back list, both amortized O(1); inserts in the middle remain O(n).

using System.Collections.Generic;
using System.Linq;

public class TwoEndedList<T> : IList<T>
{
    // "front" holds the elements before the pivot, stored in reverse order;
    // "back" holds the elements after the pivot, stored in normal order.
    private readonly List<T> front = new List<T>();
    private readonly List<T> back = new List<T>();

    public void Add(T item)
    {
        back.Add(item);
    }

    public void Insert(int index, T item)
    {
        if (index == 0)
            front.Add(item);                              // new logical first element
        else if (index < front.Count)
            front.Insert(front.Count - index, item);      // front is reversed, so map the index
        else
            back.Insert(index - front.Count, item);
    }

    public IEnumerator<T> GetEnumerator()
    {
        return front.Reverse<T>().Concat(back).GetEnumerator();
    }

    // rest of implementation
}
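The indexer is one of the omitted members; here is a sketch of how it might map a logical index onto the two lists (an assumption, since the answer does not show it; this would live inside TwoEndedList<T>):

    // Logical order is reverse(front) followed by back, so logical index i in the
    // front half maps to front[front.Count - 1 - i].
    public T this[int index]
    {
        get
        {
            if (index < front.Count)
                return front[front.Count - 1 - index];
            return back[index - front.Count];
        }
        set
        {
            if (index < front.Count)
                front[front.Count - 1 - index] = value;
            else
                back[index - front.Count] = value;
        }
    }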
+3

Another thing to consider (depending on how you actually consume those buffers) is whether you need to insert at index 0 physically at all.

Often the problem can be restated: keep an ordinary List<T> and, when an item logically belongs at the front, Add it at the end instead, treating the list as stored in reverse order.

For example, to "insert" some block X at index 0 you simply append it; when the data is finally read, you enumerate the list backwards (or reverse it once), a single O(n) pass instead of one shift per insert.

"Reversing the usage" this way makes every front insert as cheap as an append, since no elements are ever moved.

+1

Finally, if you do not need full IList<T> semantics but only enqueueing/dequeuing/peeking at the ends, you could write (or find) a small double-ended queue of your own, with types along the lines of:

IDoubleEndedQueue

DoubleEndedQueue

ConcurrentDoubleEndedQueue (a thread-safe variant)

Like any deque, such a type supports adding and removing at both ends, which covers both appending and "inserting at index 0".
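A sketch of what such an interface might look like (the member names are illustrative assumptions, not taken from the answer):

public interface IDoubleEndedQueue<T>
{
    int Count { get; }

    void EnqueueFront(T item);   // corresponds to "insert at index 0"
    void EnqueueBack(T item);    // corresponds to "add at the end"

    T DequeueFront();
    T DequeueBack();

    T PeekFront();
    T PeekBack();
}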

0

Source: https://habr.com/ru/post/1626375/

