Is there a way to capture CAEmitterCells (generated by a CAEmitterLayer) when capturing the screen of an iOS device?
UIGetScreenImage() works, but it is a private API, so I'm not allowed to use it.
UIGraphicsBeginImageContext does not seem to work; the particles are simply missing from the resulting image.
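For reference, this is roughly the kind of snapshot code I mean (a minimal sketch; 'captureView' is a placeholder for the view being recorded) — it renders the view hierarchy but not the emitter cells:

// Minimal sketch of a renderInContext:-based snapshot; 'captureView' is a placeholder.
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, NO, 0.0);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// 'snapshot' contains the view hierarchy, but the CAEmitterLayer particles are missing.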
EDIT: Here is the code I'm currently using to capture the view. I'm actually recording a 30-second video of the screen, using the code provided by aroth here. It works by capturing 25 images per second of itself (it's a subclass of UIView) and its subviews (in this case including the UIView whose layer is the CAEmitterLayer), and it uses AVAssetWriter to assemble the recording. There is quite a lot of code, so I'm only posting the relevant lines here. I converted the code to ARC with the ARC tool in Xcode, so the memory management may differ slightly from the original.
- (CGContextRef)createBitmapContextOfSize:(CGSize)size {
    CGContextRef context = NULL;
    CGColorSpaceRef colorSpace;
    int bitmapByteCount;
    int bitmapBytesPerRow;

    bitmapBytesPerRow = (size.width * 4);
    bitmapByteCount = (bitmapBytesPerRow * size.height);
    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        fprintf(stderr, "Memory not allocated!");
        return NULL;
    }
    context = CGBitmapContextCreate(bitmapData,
                                    size.width,
                                    size.height,
                                    8,                      // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorSpace);
    if (context == NULL) {
        fprintf(stderr, "Context not created!");
        return NULL;
    }
    return context;
}
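Each frame is then produced by rendering the layer into that bitmap context, roughly like this (a simplified sketch of aroth's approach; the surrounding recording loop is longer):

// Simplified sketch of producing one video frame from the bitmap context above.
CGContextRef context = [self createBitmapContextOfSize:self.bounds.size];
CGContextTranslateCTM(context, 0, self.bounds.size.height);
CGContextScaleCTM(context, 1.0, -1.0);   // flip into UIKit coordinates
[self.layer renderInContext:context];    // the CAEmitterLayer cells never show up here
CGImageRef cgImage = CGBitmapContextCreateImage(context);
UIImage *frame = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CGContextRelease(context);
// 'frame' is then handed to AVAssetWriter at 25 fps to build the video.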
I hope that clarifies the setup. Thanks for your help!