Imagine a producer-consumer scenario: thread A produces records and hands them off to one or more consumer threads.
To do this, I transfer a batch of records to each consumer thread.
For this, I am asking myself which is cheaper (primarily in terms of CPU usage, secondarily in terms of memory):
- give each consumer thread its own separate HashMap instance: after the map has been handed off to one consumer, a new HashMap instance is created and used to transfer the next produced records to the next thread,
or
- use a single ConcurrentHashMap, create an Iterator for each consumer thread, and clear the map after handing the Iterator to that thread, so that each Iterator keeps its own view of the backing map.
What do you think? Is a more or less general answer possible, or does it depend heavily on variables such as the number of records, the number of threads, etc.?
EDIT: Or should I use some other data structure that solves this problem better?
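For context, here is a minimal sketch of the first option: the producer fills a plain HashMap, hands it to a consumer thread through a BlockingQueue, and then continues with a fresh instance. The class name `BatchHandoff`, the `transfer` method, and the use of `ArrayBlockingQueue` are my own illustrative choices, not something from the original question.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BatchHandoff {
    // Hands the filled HashMap to a consumer thread and lets the
    // producer continue with a fresh instance. Returns the entries
    // the consumer actually received, for demonstration purposes.
    static Map<Integer, String> transfer(Map<Integer, String> records)
            throws InterruptedException {
        BlockingQueue<Map<Integer, String>> queue = new ArrayBlockingQueue<>(1);
        Map<Integer, String> received = new HashMap<>();

        Thread consumer = new Thread(() -> {
            try {
                // The consumer now owns this map exclusively,
                // so no further synchronization is needed.
                received.putAll(queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        queue.put(records);          // safe handoff: the queue provides a happens-before edge
        records = new HashMap<>();   // producer never touches the old map again
        consumer.join();
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        Map<Integer, String> batch = new HashMap<>();
        batch.put(1, "a");
        batch.put(2, "b");
        System.out.println(transfer(batch)); // prints {1=a, 2=b}
    }
}
```

One caveat worth noting for the second option: ConcurrentHashMap iterators are only weakly consistent, so clearing the map after creating an Iterator does not guarantee that the Iterator still sees all entries that were present at creation time.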