I am trying to write an image processing library in Clojure, but while writing tests I ran into a problem.
Image data is stored as a 2D array of signed integers (Java, and by extension Clojure, has no unsigned integer types). I have a function that returns the pixel at a given pair of coordinates. My test for this function looks something like this:
(is (= (get-pixel image 0 (dec width)) 0xFFFF0000))
That is, it checks that the pixel at (0, width-1) is red. The problem is that get-pixel returns a signed int, while Clojure reads the literal 0xFFFF0000 as a long. So get-pixel returns -65536 (the signed 32-bit interpretation of 0xFFFF0000), and Clojure compares it against 4294901760.
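For concreteness, here is roughly what the mismatch looks like at the REPL (the literals behave as shown; get-pixel is my own function):

0xFFFF0000
;=> 4294901760
(= -65536 0xFFFF0000)
;=> false
(Integer/toHexString -65536)
;=> "ffff0000"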
Now I have several options. I could rewrite the test using the signed decimal value (-65536) instead of the hex literal, but I think that makes the intent of the test less clear. I could write a function that converts a negative int into the corresponding non-negative long, but that is an extra layer of complexity. The easy way out is to just bitwise-AND the two numbers and check whether anything changed, but that still seems more complicated than it should be.
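To illustrate the conversion option, here is a minimal sketch (the helper name unsigned32 is mine, not from any library); masking with 0xFFFFFFFF reinterprets a signed 32-bit value as a non-negative long:

(defn unsigned32
  "Reinterpret a signed 32-bit value as a non-negative long."
  [n]
  (bit-and n 0xFFFFFFFF))

(unsigned32 -65536)
;=> 4294901760
(is (= (unsigned32 (get-pixel image 0 (dec width))) 0xFFFF0000))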
Is there a built-in way to force 0xFFFF0000 to be read as a signed int rather than a long, or to do a bitwise comparison of two arbitrary numbers? The int function does not work, since the value is too large to be represented as a signed int.
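Concretely, the cast fails like this:

(int 0xFFFF0000)
;=> IllegalArgumentException Value out of range for int: 4294901760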
Thanks!