.NET: Scalability of the Generic Dictionary

I use a Dictionary<TKey, TValue> to store a bazillion items. Is it safe to assume that, as long as the server has enough memory to hold those bazillion elements, I will get roughly O(1) retrieval of elements from it? What should I know about using a generic Dictionary as a huge cache when performance matters?

EDIT: Should I not rely on the default implementations? What makes a good hashing function?

+3
4 answers

It all depends on the quality of the hash function your "bazillion items" provide: if their hash function is not excellent (that is, it produces many collisions), your performance will degrade as the dictionary grows.
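To make the collision point concrete, here is a small sketch (the key types are made up for illustration): a key whose GetHashCode returns a constant forces every entry into the same bucket, so each lookup degenerates into a linear scan with Equals, while the same key with a well-distributed hash stays fast.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

// Hypothetical key whose hash is constant: every entry lands in one bucket,
// so each lookup scans all colliding entries with Equals.
sealed class BadKey
{
    public int Id { get; }
    public BadKey(int id) { Id = id; }
    public override int GetHashCode() => 42;   // worst case: all keys collide
    public override bool Equals(object obj) => obj is BadKey k && k.Id == Id;
}

// The same key with a well-distributed hash: lookups stay O(1).
sealed class GoodKey
{
    public int Id { get; }
    public GoodKey(int id) { Id = id; }
    public override int GetHashCode() => Id;   // distinct ints distribute well here
    public override bool Equals(object obj) => obj is GoodKey k && k.Id == Id;
}

static class CollisionDemo
{
    // Fills a dictionary with `count` entries, then times `count` lookups.
    public static long TimeLookups<TKey>(Func<int, TKey> makeKey, int count)
    {
        var dict = new Dictionary<TKey, int>();
        for (int i = 0; i < count; i++) dict[makeKey(i)] = i;

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < count; i++) _ = dict[makeKey(i)];
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }

    public static void Main()
    {
        const int n = 10_000;
        Console.WriteLine($"GoodKey: {TimeLookups(i => new GoodKey(i), n)} ms");
        Console.WriteLine($"BadKey:  {TimeLookups(i => new BadKey(i), n)} ms");
    }
}
```

With only ten thousand entries, the BadKey version is already dramatically slower, because each of its lookups walks the entire collision chain.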

+12

You should measure it and find out. You are the one who knows the exact usage pattern of your dictionary, so you can measure it to verify that it meets your needs.

A word of caution: in the typical case, Dictionary lookups really are close to constant time, and the class is heavily optimized. The behavior you actually get, however, depends on the keys, i.e. on how well their hash codes are distributed; when many keys share a bucket, a lookup degenerates into a linear scan through that bucket. With sensible keys this is rare, but it is exactly the kind of thing that only a measurement with your real data will reveal.

Beyond lookup speed, a truly bazillion-item dictionary raises other practical questions: the memory overhead per entry, the cost of the internal resizes as it grows (which you can avoid by passing an initial capacity), and the pressure all those objects put on the garbage collector, all of which are worth measuring as well.
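In that spirit, here is a minimal measurement sketch (sizes and names are illustrative, and Stopwatch timings are not a rigorous benchmark): it fills dictionaries of increasing size and times a fixed number of lookups in each. If lookups are really O(1), the cost per million lookups should stay roughly flat as the dictionary grows.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

static class LookupBenchmark
{
    // Times `lookups` pseudo-random retrievals from a dictionary of `size` int keys.
    public static double MillisecondsFor(int size, int lookups)
    {
        var dict = new Dictionary<int, int>(size);   // pre-size to avoid resize noise
        for (int i = 0; i < size; i++) dict[i] = i;

        var rng = new Random(12345);                 // fixed seed for repeatability
        long sum = 0;
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < lookups; i++) sum += dict[rng.Next(size)];
        sw.Stop();
        GC.KeepAlive(sum);                           // keep the loop from being optimized away
        return sw.Elapsed.TotalMilliseconds;
    }

    public static void Main()
    {
        foreach (int size in new[] { 10_000, 100_000, 1_000_000 })
            Console.WriteLine($"{size,9} entries: {MillisecondsFor(size, 1_000_000):F1} ms per 1M lookups");
    }
}
```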

+8

A Dictionary lookup is amortized O(1), not guaranteed O(1). To actually get that behavior, your keys need a GetHashCode that distributes values well, together with an Equals that is consistent with it.

Put briefly: a good hash function spreads the keys uniformly over the hash space, so that few keys end up in the same bucket.
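As an illustration of such a key (a hypothetical composite cache key; HashCode.Combine is the standard .NET helper for mixing fields), GetHashCode and Equals are both derived from the same immutable fields, so equal keys always share a hash code:

```csharp
using System;

// Hypothetical composite cache key. Equals and GetHashCode use the same
// immutable fields, so equal keys are guaranteed to have equal hash codes.
sealed class CacheKey : IEquatable<CacheKey>
{
    public int TenantId { get; }
    public string Name { get; }

    public CacheKey(int tenantId, string name)
    {
        TenantId = tenantId;
        Name = name ?? throw new ArgumentNullException(nameof(name));
    }

    public bool Equals(CacheKey other) =>
        other != null && TenantId == other.TenantId && Name == other.Name;

    public override bool Equals(object obj) => Equals(obj as CacheKey);

    // HashCode.Combine mixes the fields into a well-distributed hash.
    public override int GetHashCode() => HashCode.Combine(TenantId, Name);
}
```

Implementing IEquatable<CacheKey> as well lets Dictionary compare keys through the generic interface without boxing.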

+3

Yes, you will get roughly O(1) lookups no matter how many objects you put into the dictionary. But for the dictionary to be fast, your key objects must provide a good implementation of GetHashCode, because Dictionary uses a hash table internally.

+1

Source: https://habr.com/ru/post/1720979/

