NSImage from a 1D array of pixels?

I have a large 1D dynamic array in my program that represents a FITS image on disk; that is, it contains all of the image's pixel values. The array's type is double. At the moment I'm only interested in monochrome images.

Since Cocoa does not support the FITS format directly, I read images using the CFITSIO library. This works - I can manipulate the array as I wish and save the result to disk using the library.

However, now I want to display the image. I assume this can be done with NSImage or NSView, but the class references don't seem to list a method that could take a C array and ultimately return an NSImage object. The closest I found was -initWithData:(NSData *), but I'm not 100% sure it's what I need.

Am I barking up the wrong tree here? Any pointers to a class or method that could handle this would be appreciated.

EDIT:

Here's the updated code. Please note that I set each pixel to 0xFFFF, which only results in a gray image. This, of course, is just a test; when loading the actual FITS file, I replace 0xFFFF with imageArray[i * width + j]. This works fine in 8 bits (naturally, I divide each pixel value by 256 to represent it in 8 bits).

  NSBitmapImageRep *greyRep = [[NSBitmapImageRep alloc]
      initWithBitmapDataPlanes:nil
                    pixelsWide:width
                    pixelsHigh:height
                 bitsPerSample:16
               samplesPerPixel:1
                      hasAlpha:NO
                      isPlanar:NO
                colorSpaceName:NSCalibratedWhiteColorSpace
                   bytesPerRow:0
                  bitsPerPixel:16];

  NSInteger rowBytes = [greyRep bytesPerRow];
  unsigned short *pix = (unsigned short *)[greyRep bitmapData];
  NSLog(@"Row Bytes: %d", rowBytes);

  if (temp.bitPix == 16)  // 16-bit image
  {
      for (i = 0; i < height; i++)
      {
          for (j = 0; j < width; j++)
          {
              pix[i * rowBytes + j] = 0xFFFF;
          }
      }
  }

I also tried using Quartz2D directly. This creates the correct image, even in 16 bits. But strangely, the data array treats 0xFF as white, not 0xFFFF. So I still need to divide everything by 0xFF, losing data in the process. Quartz2D code:

  short *grey = (short *)malloc(width * height * sizeof(short));
  for (int i = 0; i < width * height; i++)
  {
      grey[i] = imageArray[i];
  }

  CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
  CGContextRef bitmapContext = CGBitmapContextCreate(grey, width, height, 16, width * 2, colorSpace, kCGImageAlphaNone);
  CFRelease(colorSpace);
  CGImageRef cgImage = CGBitmapContextCreateImage(bitmapContext);
  NSImage *greyImage = [[NSImage alloc] initWithCGImage:cgImage size:NSMakeSize(width, height)];

Any suggestions?

3 answers

initWithData: only works for image types the system already knows about. For unknown types - and for raw pixel data - you need to create your own image representation. You can do this through Core Graphics, as suggested in the answer Kirby linked to, or you can stay in Cocoa by creating an NSBitmapImageRep and adding it to an NSImage.

The exact details will depend on the format of your pixel data, but here is an example of the process for a grayscale image, where the source data (the samples array) are doubles in the range [0,1]:

 /* generate a greyscale image representation */
 NSBitmapImageRep *greyRep = [[NSBitmapImageRep alloc]
     initWithBitmapDataPlanes: nil   // allocate the pixel buffer for us
                   pixelsWide: xDim
                   pixelsHigh: yDim
                bitsPerSample: 8
              samplesPerPixel: 1
                     hasAlpha: NO
                     isPlanar: NO
               colorSpaceName: NSCalibratedWhiteColorSpace // 0 = black, 1 = white in this color space
                  bytesPerRow: 0     // passing 0 means "you figure it out"
                 bitsPerPixel: 8];   // this must agree with bitsPerSample and samplesPerPixel

 NSInteger rowBytes = [greyRep bytesPerRow];
 unsigned char *pix = [greyRep bitmapData];

 for ( i = 0; i < yDim; ++i )
 {
     for ( j = 0; j < xDim; ++j )
     {
         pix[i * rowBytes + j] = (unsigned char)(255 * samples[i * xDim + j]);
     }
 }

 NSImage *greyscale = [[NSImage alloc] initWithSize:NSMakeSize(xDim, yDim)];
 [greyscale addRepresentation:greyRep];
 [greyRep release];

EDIT (in response to comment)

I did not know for sure whether 16-bit samples were supported, but you seem to have confirmed that they are.

What you're seeing is the result of still processing the pixels as unsigned char, which is 8 bits. That way you set only half of each row, and for each of those pixels you set only one byte of the two, giving 0xFF00 - not quite white, but very close. The other half of the image is never touched, but the buffer is initialized to 0, so it stays black.

You need to work in 16 bits instead, first casting the value returned by the rep:

 unsigned short * pix = (unsigned short*) [greyRep bitmapData]; 

And then assigning 16-bit values to the pixels:

 if ( j % 2 )
 {
     pix[i * rowBytes + j] = 0xFFFF;
 }
 else
 {
     pix[i * rowBytes + j] = 0;
 }

Scratch that: rowBytes is in bytes, so we need to stick with unsigned char for pix and cast when assigning, which is a bit uglier:

 if ( j % 2 )
 {
     *((unsigned short *)(pix + i * rowBytes + j * 2)) = 0xFFFF;
 }
 else
 {
     *((unsigned short *)(pix + i * rowBytes + j * 2)) = 0;
 }

(I switched the order of the branches because testing == 0 seemed redundant. Honestly, for something like this it would be a lot neater to use the ?: operator, but that's enough C futzing.)


Here is a solution based on your code, expanded to support all three channels, with 16 bits per channel. Note that I assign imageArray[i] to every channel. That's only because I haven't yet written code that can read color FITS files, so to verify it I simply assign the same image to each channel. The result on screen is, of course, a grayscale image, but it can easily be modified so that red is assigned to red, and so on.

 NSBitmapImageRep *colorRep = [[NSBitmapImageRep alloc]
     initWithBitmapDataPlanes:nil
                   pixelsWide:width
                   pixelsHigh:height
                bitsPerSample:16
              samplesPerPixel:3
                     hasAlpha:NO
                     isPlanar:NO
               colorSpaceName:NSCalibratedRGBColorSpace
                  bytesPerRow:(3 * 2 * width)
                 bitsPerPixel:48];

 rowBytes = [colorRep bytesPerRow];
 NSLog(@"Row Bytes: %d", rowBytes);
 pix = [colorRep bitmapData];

 for (i = 0; i < height * width; ++i)
 {
     *((unsigned short *)(pix + 6 * i))     = imageArray[i];
     *((unsigned short *)(pix + 6 * i + 2)) = imageArray[i];
     *((unsigned short *)(pix + 6 * i + 4)) = imageArray[i];
 }

 NSImage *theImage = [[NSImage alloc] initWithSize:NSMakeSize(width, height)];
 [theImage addRepresentation:colorRep];
 [myimageView setImage:theImage];

Adapt the answer from "Converting RGB data to a bitmap in Objective-C++ Cocoa" to your data.


Source: https://habr.com/ru/post/1394471/

