This is what I use in my unit tests to compare images. Unlike other methods (e.g., comparing the output of UIImagePNGRepresentation directly), it works even if the images have different color spaces (e.g., RGB and grayscale).
@implementation UIImage (HPIsEqualToImage)

- (BOOL)hp_isEqualToImage:(UIImage *)image
{
    NSData *data = [image hp_normalizedData];
    NSData *originalData = [self hp_normalizedData];
    return [originalData isEqualToData:data];
}

- (NSData *)hp_normalizedData
{
    // Redraw the image into a fresh bitmap context so that both images
    // end up in the same color space before being compared as PNG data.
    const CGSize pixelSize = CGSizeMake(self.size.width * self.scale, self.size.height * self.scale);
    UIGraphicsBeginImageContext(pixelSize);
    [self drawInRect:CGRectMake(0, 0, pixelSize.width, pixelSize.height)];
    UIImage *drawnImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImagePNGRepresentation(drawnImage);
}

@end
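For context, here is a minimal sketch of how the category might be used in an XCTest case. The test class, header name, and image names are hypothetical, not part of the original answer:

// Hypothetical XCTest case exercising the category above.
// "UIImage+HPIsEqualToImage.h" and the image names are illustrative.
#import <XCTest/XCTest.h>
#import "UIImage+HPIsEqualToImage.h"

@interface HPImageComparisonTests : XCTestCase
@end

@implementation HPImageComparisonTests

- (void)testRenderedImageMatchesReference
{
    UIImage *expected = [UIImage imageNamed:@"expected"]; // reference image
    UIImage *actual = [UIImage imageNamed:@"actual"];     // image under test
    XCTAssertTrue([expected hp_isEqualToImage:actual]);
}

@end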
This is not very efficient, so I would recommend not using it in production code unless performance is not a concern.
hpique May 18 '14 at 17:53