Over the past 4-5 hours I have been struggling with a very strange problem. I have an array of bytes containing pixel values from which I would like to make an image. The array stores 32 bits per component, and there is no alpha channel, so the image is 96 bits per pixel.
I pass all of this to the CGImageCreate function as follows:
    CGImageRef img = CGImageCreate(width, height, 32, 96, bytesPerRow, space,
                                   kCGImageAlphaNone, provider, NULL, NO,
                                   kCGRenderingIntentDefault);
bytesPerRow = 3*width*4. This is because there are 3 components per pixel, and each component takes 4 bytes (32 bits), so the total number of bytes per row is 3 * 4 * width. The data provider is defined as follows:
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, bitmapData,
                                                              3*4*width*height, NULL);
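(As an aside, passing NULL as the last argument means Core Graphics will never free bitmapData; the buffer has to outlive the provider. If I wanted the provider to own the buffer instead, I believe a release callback could be supplied. A minimal sketch, where releaseBitmap is just a name I made up for the callback:)

    /* Hypothetical release callback; the type is CGDataProviderReleaseDataCallback. */
    static void releaseBitmap(void *info, const void *data, size_t size)
    {
        free((void *)data);   /* buffer came from malloc */
    }

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, bitmapData,
                                                              3*4*width*height,
                                                              releaseBitmap);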
Here is where everything gets weird. In my array I set the values to 0x000000FF (for all 3 channels), and yet the image comes out completely white. If I set the value to 0xFFFFFF00, the image turns black. This tells me that for some reason the program is not reading all 4 bytes of each component and is instead reading only the least significant byte. I have tried all kinds of combinations, even including an alpha channel, but it made no difference.
The program is completely blind to 0xAAAAAA00; it just reads it as 0. Since I specify that there are 32 bits per component, shouldn't it take that into account and actually read 4 bytes per component from the array?
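If this is a byte-order problem, one experiment I can think of is to store each component byte-swapped, so that the most significant byte comes first in memory. A rough sketch, assuming Core Graphics reads each component big-endian (CFSwapInt32HostToBig comes from CFByteOrder.h in CoreFoundation):

    #include <CoreFoundation/CFByteOrder.h>

    /* Hypothetical test: write each 32-bit component in big-endian order,
       so byte 0 in memory is the most significant byte of the component. */
    unsigned int value = CFSwapInt32HostToBig(0x000000FFu);
    for (i = 0; i < width*height; i++) {
        *((unsigned int *)(bitmapData + 12*i + 0)) = value;  /* red   */
        *((unsigned int *)(bitmapData + 12*i + 4)) = value;  /* green */
        *((unsigned int *)(bitmapData + 12*i + 8)) = value;  /* blue  */
    }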
The byte array is allocated as bitmapData = (char*)malloc(bytesPerRow*height); and I assign values to it as follows:
    for (i = 0; i < width*height; i++) {
        *((unsigned int *)(bitmapData + 12*i + 0)) = 0xFFFFFF00;
        *((unsigned int *)(bitmapData + 12*i + 4)) = 0xFFFFFF00;
        *((unsigned int *)(bitmapData + 12*i + 8)) = 0xFFFFFF00;
    }
Note that I address the array through an unsigned int pointer, so each assignment writes 4 bytes of memory. I multiply by 12 because there are 12 bytes per pixel, and the offsets of 4 and 8 let the loop reach the green and blue channels. I checked the array's memory in the debugger and everything looks in order: the loop really does write 4 bytes per component. Any pointers would be helpful. My ultimate goal is to read 32-bit FITS files, for which I already have code written; I am only testing with the array above.
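For completeness, the same check can be done from code instead of the debugger. On a little-endian Mac, the value 0xFFFFFF00 should show up in memory with its least significant byte first, i.e. as 00 FF FF FF, which would explain why a reader that treats the first byte as the most significant sees 0:

    #include <stdio.h>

    /* Dump the 12 bytes of the first pixel to confirm the in-memory layout. */
    unsigned char *p = (unsigned char *)bitmapData;
    int b;
    for (b = 0; b < 12; b++)
        printf("%02X ", p[b]);
    printf("\n");   /* expected on little-endian: 00 FF FF FF 00 FF FF FF 00 FF FF FF */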
Here is the code as a whole, in case it matters. This is in drawRect:(NSRect)dirtyRect of my custom view:
    int width, height, bytesPerRow;
    int i;

    width = 256;
    height = 256;
    bytesPerRow = 3*width*4;   /* 3 components per pixel, 4 bytes per component */

    char *bitmapData = (char*)malloc(bytesPerRow*height);

    /* Fill every component of every pixel with the same 32-bit test value. */
    for (i = 0; i < width*height; i++) {
        *((unsigned int *)(bitmapData + 12*i + 0)) = 0xFFFFFF00;   /* red   */
        *((unsigned int *)(bitmapData + 12*i + 4)) = 0xFFFFFF00;   /* green */
        *((unsigned int *)(bitmapData + 12*i + 8)) = 0xFFFFFF00;   /* blue  */
    }

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, bitmapData,
                                                              3*4*width*height, NULL);
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGImageRef img = CGImageCreate(width, height, 32, 96, bytesPerRow, space,
                                   kCGImageAlphaNone, provider, NULL, NO,
                                   kCGRenderingIntentDefault);
    CGColorSpaceRelease(space);
    CGDataProviderRelease(provider);

    CGContextRef theContext = [[NSGraphicsContext currentContext] graphicsPort];
    CGContextDrawImage(theContext, CGRectMake(0, 0, width, height), img);
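One more thing I should probably do at the end of drawRect:, assuming the draw has completed by that point, is release the image and free the buffer:

    CGImageRelease(img);   /* balances CGImageCreate */
    free(bitmapData);      /* safe to free manually here: the provider was created with
                              a NULL release callback, so Core Graphics never frees it */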