Option 1: Don't care. If you look at the JDK's HashSet implementation, you will find that it simply uses a HashMap internally:
```java
public class HashSet<E>
    extends AbstractSet<E>
    implements Set<E>, Cloneable, java.io.Serializable
{
    static final long serialVersionUID = -5024744406713321676L;

    private transient HashMap<E,Object> map;
    ...
```
This was quick to implement, but it means each set entry carries a reference to a value that is never needed, and consequently wastes memory. My first option is "don't care", since I hope that someday someone will provide an improved HashSet in the JDK. Software engineers should always have hope and a positive attitude :)
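To make the overhead concrete, here is a minimal sketch of how a map-backed set works: every element becomes a map key, and all keys share one dummy sentinel value (named PRESENT in the JDK source). The class and method names below are illustrative, not the actual JDK code.

```java
import java.util.HashMap;

// Sketch of a HashMap-backed set, mirroring the JDK's HashSet approach.
// Every entry still stores a value reference (the shared sentinel), which
// is the per-entry overhead discussed above.
public class MapBackedSet<E> {
    // One shared dummy object, stored as the value for every key.
    private static final Object PRESENT = new Object();

    private final HashMap<E, Object> map = new HashMap<>();

    // Returns true if the element was not already in the set.
    public boolean add(E e) {
        return map.put(e, PRESENT) == null;
    }

    public boolean contains(E e) {
        return map.containsKey(e);
    }

    public boolean remove(E e) {
        return map.remove(e) == PRESENT;
    }

    public int size() {
        return map.size();
    }
}
```

The semantics match what you would expect from a set, but every entry in the underlying table still has a slot for the value pointer.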
In ordinary application code, I always stick to the common standard and use what is available. This avoids the situation where every programmer rolls his own "set implementation of choice" or, even worse, spends hours researching which HashSet implementation is actually the best one to use ;)
Does Oracle have an open bug for this wasteful HashSet design? I cannot find one ...
Option 2: Care. If the set is used not in business logic but in some technical middleware code, performance can make a real difference. Then there are various options. CompactHashMap in Google Guava is one. Another good library is High Performance Primitive Collections (HPPC). In HPPC you will also find sets for each primitive type. I think you will find other options that fit your specific purpose as well. Not every HashMap replacement has exactly the same semantics as the original HashMap.
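The core trick behind primitive-collection libraries like HPPC is open addressing over flat arrays: elements live directly in an int[], so there are no per-entry node objects, no boxed Integers, and no unused value references. The sketch below is illustrative only, under my own assumptions; it is not the actual HPPC implementation (real libraries also handle resizing, stronger hash mixing, and removal).

```java
// Minimal open-addressing set of ints with linear probing.
// Fixed capacity; purely a sketch of the technique, not production code.
public class IntOpenSet {
    private final int[] slots;
    private final boolean[] used;   // marks occupied slots (so 0 can be stored too)
    private int size;

    public IntOpenSet(int expectedElements) {
        // Power-of-two capacity with head room; this sketch never resizes.
        int cap = Integer.highestOneBit(Math.max(expectedElements, 8) * 4 - 1) << 1;
        slots = new int[cap];
        used = new boolean[cap];
    }

    // Finds the slot holding 'key', or the first free slot on its probe path.
    private int slotFor(int key) {
        int mask = slots.length - 1;
        int i = (key * 0x9E3779B9) >>> 16 & mask;  // simple hash mixing
        while (used[i] && slots[i] != key) {
            i = (i + 1) & mask;                    // linear probing
        }
        return i;
    }

    public boolean add(int key) {
        int i = slotFor(key);
        if (used[i]) {
            return false;  // already present
        }
        slots[i] = key;
        used[i] = true;
        size++;
        return true;
    }

    public boolean contains(int key) {
        return used[slotFor(key)];
    }

    public int size() {
        return size;
    }
}
```

Compared with HashSet<Integer>, this layout avoids one Node object plus one boxed Integer plus one value reference per entry, which is exactly where the memory savings of such libraries come from.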
So, personally, I have never replaced java.util.HashMap by default; I only do it when there is a concrete reason.