Cocoa: View NSImage Pixel Color

I have an NSImage . I would like to read the NSColor for a pixel at some x and y. Xcode's autocomplete suggests that there is a colorAtX:y: method on NSImage , but calling it crashes with an unrecognized-selector error, because no such method exists on NSImage . I have seen examples where you create an NSBitmapImageRep and call that method on it, but I could not successfully convert my NSImage to an NSBitmapImageRep : the pixel values I get back from the NSBitmapImageRep differ from what I expect.

There should be an easy way to do this. It can't be that difficult.

+4
2 answers

Without seeing your code, it's hard to understand what is going wrong.

You can create an NSBitmapImageRep from the image by calling initWithData: and passing it the image's TIFFRepresentation .

Then you can get the pixel value using the colorAtX:y: method, which is an NSBitmapImageRep method, not an NSImage method:

 NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];
 NSSize imageSize = [yourImage size];
 // colorAtX:y: takes NSInteger pixel coordinates, not CGFloat
 NSInteger y = (NSInteger)imageSize.height - 100;
 NSColor *color = [imageRep colorAtX:100 y:y];
 [imageRep release]; // omit under ARC

Note that you must adjust the y value, because the colorAtX:y: method uses a coordinate system whose origin is the upper-left corner of the image, while the NSImage coordinate system has its origin in the lower-left corner.
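As a sketch of the steps above, a small helper (hypothetical, not part of the original answer) that takes a point in NSImage's bottom-left coordinate system, flips the y value, and samples the bitmap might look like:

 // Hypothetical helper: sample a pixel given a point in NSImage's
 // bottom-left coordinate system. Returns nil if no bitmap rep can be made.
 NSColor *ColorAtImagePoint(NSImage *image, NSPoint point)
 {
     NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
         initWithData:[image TIFFRepresentation]];
     if (rep == nil) {
         return nil;
     }
     // Flip y: the bitmap's origin is the upper-left corner.
     NSInteger x = (NSInteger)point.x;
     NSInteger y = (NSInteger)([image size].height - point.y - 1.0);
     NSColor *color = [rep colorAtX:x y:y];
     [rep release]; // omit under ARC
     return color;
 }

Note that this samples the TIFF representation at its stored pixel size; for multi-representation or scaled images the bitmap's pixel dimensions may not match the NSImage's point size.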

Alternatively, if the pixel is displayed on the screen, you can use the NSReadPixel() function to get the color of a pixel from the currently focused drawing destination.

+9

The colorAtX:y: method of NSBitmapImageRep does not seem to use the device's color space, which can lead to color values that are slightly different from the ones you actually see on screen. Use this code to get the correct color in the device's current color space:

 [yourImage lockFocus]; // yourImage is just your NSImage variable
 NSColor *pixelColor = NSReadPixel(NSMakePoint(1, 1)); // or another point
 [yourImage unlockFocus];
+1

Source: https://habr.com/ru/post/1401673/

