I would give each cache entry a member that stores its last access time. Whenever an element is accessed (a member function is called), that timestamp is updated. When the cache is full, the item with the oldest access time is evicted.
Another option is to keep the cache elements in an intrusive list. Whenever an element is accessed, it is moved to the front of the list. When the cache is full, the element at the back of the list is evicted. This is more work on each access, but the victim is faster to find.
The main point is not to use the typical node-based maps and lists for such tasks; they fragment your memory. My implementations always store the whole cache in one contiguous block.