Design options for references into a thread-safe cache when evicting old records

I am trying to create a simple cache that follows these rules:

  • Records have a unique key.
  • When the number of entries in the cache exceeds a certain limit, older entries are evicted (so that the cache does not grow too large).
  • Each record is immutable for as long as it lives in the cache.
  • A reader can take a reference to a cache entry, and that entry must remain valid for the reader's lifetime.
  • Each reader may run on its own thread, and all readers refer to the same cache instance.

Thread safety in this cache is important because we do not want a reader to hold a reference to a record only to have another thread evict it out from under the reader.

Therefore, my current implementation simply copies the entire record when reading from the cache. This is fine for small objects, but once the objects get large, the copying becomes too expensive. It also scales poorly when a large number of readers access the same cache entry.
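
Roughly, that copy-on-read approach looks like this (a minimal C++ sketch; the class and all names are illustrative, not from the original post):

```cpp
#include <map>
#include <mutex>
#include <optional>
#include <string>

struct Record {
    std::string payload;  // immutable once inserted
};

class CopyingCache {
public:
    explicit CopyingCache(std::size_t limit) : limit_(limit) {}

    void put(int key, Record rec) {
        std::lock_guard<std::mutex> lock(mutex_);
        map_[key] = std::move(rec);
        // Naive eviction: drop the entry with the smallest key
        // (stands in for "oldest" when keys are assigned in order).
        if (map_.size() > limit_) map_.erase(map_.begin());
    }

    // Readers get a full copy, so eviction on another thread is
    // harmless, but large records make this expensive.
    std::optional<Record> get(int key) {
        std::lock_guard<std::mutex> lock(mutex_);
        auto it = map_.find(key);
        if (it == map_.end()) return std::nullopt;
        return it->second;  // copy
    }

private:
    std::size_t limit_;
    std::mutex mutex_;
    std::map<int, Record> map_;
};
```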

Since the data is immutable, it would be great if each reader could simply hold a reference instead of a copy, but in some thread-safe form (so that nothing breaks when an entry is evicted).

A previous implementation used reference counting to achieve this ... but it is very hard to get right across threads, and I went with this simpler approach.

Are there any other patterns / ideas that I could use to improve this design?

+3
6

[Most of this answer was lost in translation. The surviving fragments discuss lock-free bookkeeping for readers: they mention CAS, and note that one variant needs a 128-bit CAS on 64-bit platforms (a double-width CAS, typically used to swap a pointer and a counter in one step), which not every target provides. The fragments then describe a pair of counters, "enter" and a matching counter for leaving: a reader bumps "enter" before touching the cache and the other counter when it is done, and once the two are equal no reader is still inside, so records evicted in the meantime can safely be freed.]
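
A minimal sketch of such an enter/exit scheme, reconstructed from the fragments above (one plausible reading of the lost answer, not the answerer's actual code; all names are illustrative):

```cpp
#include <atomic>
#include <cstdio>
#include <map>
#include <mutex>

struct Record { int value; };

std::atomic<unsigned long> g_enter{0};  // incremented when a reader starts
std::atomic<unsigned long> g_exit{0};   // incremented when a reader finishes

std::mutex g_map_mutex;                 // guards only the map itself
std::map<int, Record*> g_map;

// Reader side: bracket every access with the enter/exit counters.
void reader_use(int key) {
    g_enter.fetch_add(1, std::memory_order_acq_rel);
    Record* rec = nullptr;
    {
        std::lock_guard<std::mutex> lock(g_map_mutex);
        auto it = g_map.find(key);
        if (it != g_map.end()) rec = it->second;
    }
    if (rec) std::printf("%d\n", rec->value);  // safe: not freed while "inside"
    g_exit.fetch_add(1, std::memory_order_acq_rel);
}

// Evictor side: unlink the record first, then wait until every reader that
// could possibly have seen it has left, and only then free it.
void evict(int key) {
    Record* victim = nullptr;
    {
        std::lock_guard<std::mutex> lock(g_map_mutex);
        auto it = g_map.find(key);
        if (it == g_map.end()) return;
        victim = it->second;
        g_map.erase(it);
    }
    unsigned long entered = g_enter.load(std::memory_order_acquire);
    while (g_exit.load(std::memory_order_acquire) < entered) {
        // spin or yield; reader critical sections are short
    }
    delete victim;  // no remaining reader can reach the record
}
```

Readers that enter after the record has been unlinked can no longer find it in the map, so the evictor only has to wait out the readers that were already inside at unlink time.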

+2

[This answer did not survive translation; only a passing reference to concurrency remains.]

+2

[Most of this answer was lost in translation; it suggests something built around std::map.]
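
Since only the mention of std::map survives, the following is my own sketch of a common shape for such a cache, not necessarily what this answer proposed: a mutex-guarded std::map whose values are std::shared_ptr<const Record>, so a reader's copy of the pointer keeps the record alive even after eviction:

```cpp
#include <list>
#include <map>
#include <memory>
#include <mutex>

struct Record { int value; };  // immutable payload

class SharedPtrCache {
public:
    explicit SharedPtrCache(std::size_t limit) : limit_(limit) {}

    // Sketch assumes each key is inserted at most once.
    void put(int key, Record rec) {
        std::lock_guard<std::mutex> lock(mutex_);
        order_.push_back(key);
        map_[key] = std::make_shared<const Record>(std::move(rec));
        while (map_.size() > limit_) {  // evict oldest insertions
            map_.erase(order_.front());
            order_.pop_front();
        }
    }

    // The returned shared_ptr keeps the record alive even if another
    // thread evicts it from the map while the reader still holds it.
    std::shared_ptr<const Record> get(int key) {
        std::lock_guard<std::mutex> lock(mutex_);
        auto it = map_.find(key);
        return it == map_.end() ? nullptr : it->second;
    }

private:
    std::size_t limit_;
    std::mutex mutex_;
    std::map<int, std::shared_ptr<const Record>> map_;
    std::list<int> order_;  // insertion order, for FIFO eviction
};
```

std::shared_ptr's reference count is itself updated atomically, so the pointer copy taken under the lock is all the synchronization a reader needs.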

0

[Most of this answer was lost in translation; the fragments mention atomic inc/dec, i.e. reference counting.]
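
If the inc/dec here refers to an intrusive atomic reference count, one way to make it safe is to do the increment while the record is still reachable under the cache lock; a minimal sketch under that assumption (illustrative names):

```cpp
#include <atomic>
#include <map>
#include <mutex>

struct Record {
    int value;
    std::atomic<int> refs{1};  // one reference owned by the cache itself
};

std::mutex g_mutex;
std::map<int, Record*> g_map;

void release(Record* rec) {
    // Last decrement frees the record; lock-free on the reader side.
    if (rec->refs.fetch_sub(1, std::memory_order_acq_rel) == 1) delete rec;
}

// The increment is safe only because it happens under the same lock that
// eviction takes: the record cannot be freed while we hold the mutex.
Record* acquire(int key) {
    std::lock_guard<std::mutex> lock(g_mutex);
    auto it = g_map.find(key);
    if (it == g_map.end()) return nullptr;
    it->second->refs.fetch_add(1, std::memory_order_relaxed);
    return it->second;
}

void put(int key, int value) {  // assumes the key is not already present
    std::lock_guard<std::mutex> lock(g_mutex);
    g_map[key] = new Record{value};
}

void evict(int key) {
    Record* victim = nullptr;
    {
        std::lock_guard<std::mutex> lock(g_mutex);
        auto it = g_map.find(key);
        if (it == g_map.end()) return;
        victim = it->second;
        g_map.erase(it);
    }
    release(victim);  // drop the cache's own reference
}

// Reader usage: Record* r = acquire(k); if (r) { /* use r->value */ release(r); }
```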

0

[The text of this answer did not survive translation.]

0

Use a circular queue, and do not allow multiple threads to write to it, or the cache will be useless. Each thread should have its own cache, possibly with read access to the other caches, but not write access.
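
A minimal sketch of a single-writer circular-queue cache along these lines (the fixed capacity and the thread_local wiring are my assumptions, not from the answer):

```cpp
#include <array>
#include <cstddef>
#include <optional>

struct Record { int key; int value; };

// Fixed-capacity circular queue: exactly one thread writes, so the owner
// needs no synchronization; other threads would need read-only access
// with appropriate fencing (omitted here).
template <std::size_t N>
class RingCache {
public:
    void put(Record rec) {
        slots_[head_ % N] = rec;  // overwrite the oldest slot
        ++head_;
    }

    std::optional<Record> get(int key) const {
        std::size_t n = head_ < N ? head_ : N;
        for (std::size_t i = 0; i < n; ++i) {
            const Record& r = slots_[(head_ - 1 - i) % N];  // newest first
            if (r.key == key) return r;
        }
        return std::nullopt;
    }

private:
    std::array<Record, N> slots_{};
    std::size_t head_ = 0;  // total number of puts so far
};

// Each thread owns one instance, e.g.:
// thread_local RingCache<64> tls_cache;
```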

0

Source: https://habr.com/ru/post/1729962/

