I have a matrix class with a generic type parameter bounded by Number.
For instance:
public class Matrix<T extends Number>
I am trying to compare two matrices that have the same values:
Matrix:
row=[0] 273 455
row=[1] 243 235
row=[2] 244 205
row=[3] 102 160
and
Matrix:
row=[0] 273 455
row=[1] 243 235
row=[2] 244 205
row=[3] 102 160
In the Matrix class, I have an equals method that looks like this:
public boolean equals(Object obj) {
    if (obj == null)
        return false;
    if (!(obj instanceof Matrix))
        return false;
    Matrix<T> m = (Matrix<T>) obj;
    if (this.rows != m.rows)
        return false;
    if (this.cols != m.cols)
        return false;
    for (int i = 0; i < matrix.length; i++) {
        T t1 = matrix[i];
        T t2 = m.matrix[i];
        if (!t1.equals(t2))
            return false;
    }
    return true;
}
This check fails even though both numbers are numerically equal (e.g. 273 and 273):
t1.equals(t2)
When I step through in the debugger, I see the call dispatch into Long.equals. This is from the JDK's Long class:
public boolean equals(Object obj) {
    if (obj instanceof Long) {
        return value == ((Long) obj).longValue();
    }
    return false;
}
Essentially, it fails because obj is not an instance of Long.
I can easily work around it by comparing the primitive values instead:
if (t1.longValue() != t2.longValue())
    return false;
But I'm wondering what the right way to verify equality is in this situation, and why the equals call on the generic T ends up dispatching to Long.equals at all.
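To make the failure concrete, here is a minimal standalone sketch (separate from my Matrix class) showing that boxed wrappers of different types never compare equal, even for the same numeric value, while comparing the unboxed values works:

```java
public class BoxedEqualsDemo {
    public static void main(String[] args) {
        Integer i = 273;
        Long l = 273L;

        // equals is type-sensitive: Long.equals first checks `obj instanceof Long`,
        // and Integer.equals checks `obj instanceof Integer`, so both directions fail
        System.out.println(i.equals(l)); // false
        System.out.println(l.equals(i)); // false

        // comparing the unboxed primitive values ignores the wrapper type
        System.out.println(i.longValue() == l.longValue()); // true
    }
}
```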
EDIT:
Both matrices are declared with the generic type Integer, so I would not expect any Longs to be involved. Here is the code:
Matrix<Integer> matrix1 = new Matrix<Integer>(4, 3);
matrix1.set(0, 0, 14);
matrix1.set(0, 1, 9);
matrix1.set(0, 2, 3);
matrix1.set(1, 0, 2);
matrix1.set(1, 1, 11);
matrix1.set(1, 2, 15);
matrix1.set(2, 0, 0);
matrix1.set(2, 1, 12);
matrix1.set(2, 2, 17);
matrix1.set(3, 0, 5);
matrix1.set(3, 1, 2);
matrix1.set(3, 2, 3);
Matrix<Integer> matrix2 = new Matrix<Integer>(3, 2);
matrix2.set(0, 0, 12);
matrix2.set(0, 1, 25);
matrix2.set(1, 0, 9);
matrix2.set(1, 1, 10);
matrix2.set(2, 0, 8);
matrix2.set(2, 1, 5);
Matrix<Integer> result1 = new Matrix<Integer>(4,2);
result1.set(0, 0, 273);
result1.set(0, 1, 455);
result1.set(1, 0, 243);
result1.set(1, 1, 235);
result1.set(2, 0, 244);
result1.set(2, 1, 205);
result1.set(3, 0, 102);
result1.set(3, 1, 160);
Matrix<Integer> matrix3 = matrix1.multiply(matrix2);
if (!matrix3.equals(result1)) {
    System.err.println("Matrix multiplication error. matrix3=" + matrix3 + " result1=" + result1);
    return false;
}
The call to equals() here is the one that fails.
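For reference, here is a minimal sketch of how a Long can end up inside a container declared with Integer despite the generic declaration: generics are erased at runtime, so a write through a raw reference is not checked. (The class and variable names below are hypothetical, not from my Matrix code; I suspect something similar happens if multiply accumulates into a long and boxes the result.)

```java
import java.util.ArrayList;
import java.util.List;

@SuppressWarnings({"unchecked", "rawtypes"})
public class ErasureDemo {
    public static void main(String[] args) {
        List<Integer> cells = new ArrayList<>();
        List raw = cells;            // raw view of the same list: no type checking
        raw.add(Long.valueOf(273));  // heap pollution: a Long inside a List<Integer>

        // read through a wildcard so the compiler does not insert a cast to Integer
        Object stored = ((List<?>) cells).get(0);
        System.out.println(stored.getClass().getSimpleName()); // Long
        System.out.println(Integer.valueOf(273).equals(stored)); // false
    }
}
```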