Strange performance results using Dictionary&lt;TKey, TValue&gt;

Dictionary&lt;TKey, TValue&gt;.Add seems to be affected by the size of the stored element, which seems strange.

Here is my simple class:

    public class MyObject
    {
        public Guid Key { get; set; }
    }

And two simple tests:

    private long _Iterations = 1000000;

    [TestMethod]
    public void ShouldTestDefaultConstructorPerformance()
    {
        for (var i = 0; i < _Iterations; i++)
        {
            var obj = new MyObject() { Key = Guid.NewGuid() };
        }
    }

    [TestMethod]
    public void ShouldTestDefaultGuidDictionaryPerformance()
    {
        var dict = new Dictionary<Guid, MyObject>();
        for (var i = 0; i < _Iterations; i++)
        {
            var obj = new MyObject() { Key = Guid.NewGuid() };
            dict.Add(obj.Key, obj);
        }
    }

Initially, I get the following timings:

    ShouldTestDefaultConstructorPerformance    : 00:00:00.580
    ShouldTestDefaultGuidDictionaryPerformance : 00:00:01.238

Now I will change the MyObject class:

    public class MyObject
    {
        public Guid Key { get; set; }

        private Dictionary<string, string> _Property0 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property1 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property2 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property3 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property4 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property5 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property6 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property7 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property8 = new Dictionary<string, string>();
        private Dictionary<string, string> _Property9 = new Dictionary<string, string>();
    }

And run the tests again:

    ShouldTestDefaultConstructorPerformance    : 00:00:01.333
    ShouldTestDefaultGuidDictionaryPerformance : 00:00:07.556

In the second run, constructing the object takes about 2.3x longer (1.333 vs 0.580), but the dictionary test takes about 6.1x longer (7.556 vs 1.238). I expected both tests to take longer, but why does the Dictionary test slow down so much more when the stored objects are larger?

+4
4 answers

I think my answer would be: use a profiler and work out which bit actually takes the longest.

It will probably highlight the instantiation. Maybe :)
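As a sketch of the kind of thing a profiler might highlight (this is a hypothesis of mine, not something the question established): the second version of MyObject allocates eleven objects per iteration (the object plus ten inner dictionaries), and the outer dictionary keeps them all alive, so the GC has to promote them instead of discarding them cheaply. Counting collections before and after the loop is a quick way to check:

```csharp
using System;
using System.Collections.Generic;

public class MyObject { public Guid Key { get; set; } }

class GcPressureCheck
{
    static void Main()
    {
        const int iterations = 1000000;
        int gen0Before = GC.CollectionCount(0);
        int gen2Before = GC.CollectionCount(2);

        var dict = new Dictionary<Guid, MyObject>();
        for (var i = 0; i < iterations; i++)
        {
            var obj = new MyObject { Key = Guid.NewGuid() };
            dict.Add(obj.Key, obj);
        }

        // Surviving objects get promoted out of gen0, so with large objects
        // you should see noticeably more (and more expensive) collections.
        Console.WriteLine($"Gen0 collections: {GC.CollectionCount(0) - gen0Before}");
        Console.WriteLine($"Gen2 collections: {GC.CollectionCount(2) - gen2Before}");
    }
}
```

If the collection counts jump sharply between the two versions of MyObject, the extra time is going to garbage collection rather than to Dictionary.Add itself.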

0
source

I think people need to read the question more carefully instead of rushing to post an answer. If you look carefully at his sample code (BOTH tests), the difference between MyObject with just a Guid and MyObject with a Guid plus 10 dictionaries is under a second for the construction loop. The dictionary Add, however, accounts for at least 5 extra seconds.

+1

I think the line var obj = new MyObject() { Key = Guid.NewGuid() }; actually takes far more time than the dictionary's Add(). Did you measure the two parts separately?
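One rough way to measure the two parts separately is with System.Diagnostics.Stopwatch (a sketch only; note that starting and stopping the watch inside the loop adds some overhead of its own, so treat the numbers as indicative):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

public class MyObject { public Guid Key { get; set; } }

class SplitTiming
{
    static void Main()
    {
        const int iterations = 1000000;
        var dict = new Dictionary<Guid, MyObject>(iterations); // pre-sized to exclude resize cost
        var ctorTime = new Stopwatch();
        var addTime = new Stopwatch();

        for (var i = 0; i < iterations; i++)
        {
            ctorTime.Start();
            var obj = new MyObject { Key = Guid.NewGuid() };
            ctorTime.Stop();

            addTime.Start();
            dict.Add(obj.Key, obj);
            addTime.Stop();
        }

        Console.WriteLine($"Construction: {ctorTime.Elapsed}");
        Console.WriteLine($"Add:          {addTime.Elapsed}");
    }
}
```

Running this against both versions of MyObject would show directly whether construction or Add is the part that grows with the object size.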

0

Each object that you add to the dictionary is given a special identifier to speed up its lookup and retrieval in memory. This identifier (called a hash) is calculated by analyzing the contents of the object. The larger the object, the slower the hash calculation.

If you are interested in the subtle details of how this works, check out this example from a university course: http://www.ccs.neu.edu/home/sbratus/com1101/hash-dict.html
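For what it's worth, in .NET specifically, Dictionary&lt;TKey, TValue&gt; computes the hash from the key alone (via TKey.GetHashCode()), never from the stored value, so the size of the value object does not change the hashing cost. A small sketch illustrating this:

```csharp
using System;
using System.Collections.Generic;

class HashCheck
{
    static void Main()
    {
        var key = Guid.NewGuid();

        // The dictionary hashes only the key. For Guid, GetHashCode() is
        // derived from the 16 bytes of the Guid itself.
        Console.WriteLine(key.GetHashCode());

        var dict = new Dictionary<Guid, object>();
        dict.Add(key, new byte[1024 * 1024]); // a large value...
        Console.WriteLine(dict.ContainsKey(key)); // ...is still found via the key's hash: True
    }
}
```

So whatever is making the second test slower, it is not the hash calculation growing with the object.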

-2

Source: https://habr.com/ru/post/1368971/

