I am working on a project in which many classes need the usual, typical implementations of equals and hashCode: each class has a finite set of fields, initialized at construction time with "deeply" immutable objects (null is intended to be accepted in some cases), which are used for hashing and comparison.
To reduce the amount of boilerplate, I thought about writing an abstract class that provides a common implementation of this behavior.
import java.util.Objects;

public abstract class AbstractHashable {

    private final Object[] fields;
    private final int hash; // computed once and cached, since the fields are immutable

    protected AbstractHashable(Object... fields) {
        this.fields = fields;
        hash = 31 * getClass().hashCode() + Objects.hash(fields);
    }

    @Override
    public boolean equals(Object obj) {
        if (obj == this) {
            return true;
        }
        if (obj == null || !getClass().equals(obj.getClass())) {
            return false;
        }
        AbstractHashable other = (AbstractHashable) obj;
        if (fields.length != other.fields.length) {
            throw new UnsupportedOperationException(
                    "objects of same class must have the same number of fields");
        }
        for (int i = 0; i < fields.length; i++) {
            // Objects.equals compares null fields safely
            if (!Objects.equals(fields[i], other.fields[i])) {
                return false;
            }
        }
        return true;
    }

    @Override
    public int hashCode() {
        return hash;
    }
}
It is intended to be used as follows:
public class SomeObject extends AbstractHashable {
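    // Sketch of how the subclass body might continue; the concrete fields
    // below ("name" and "value") are illustrative and not from the original
    // post. The subclass keeps its own final fields and simply forwards them
    // to the superclass constructor, which then handles equals and hashCode.

    private final String name;
    private final Integer value;

    public SomeObject(String name, Integer value) {
        super(name, value);
        this.name = name;
        this.value = value;
    }

    public String getName() {
        return name;
    }

    public Integer getValue() {
        return value;
    }
}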
Essentially, a similar approach has already been proposed in an existing answer, with some differences.
This seems like a simple but effective way to reduce verbosity while still keeping efficient implementations of equals and hashCode. However, since I don't remember ever having seen something like this (apart from the answer mentioned above), I would specifically like to ask whether there is anything that speaks against this approach (or perhaps some improvement that could be applied) before using it throughout the project.