iOS: capturing CAEmitterLayer particles on screen

Is there a way to capture CAEmitterCells (generated by a CAEmitterLayer) when capturing the screen of an iOS device?
UIGetScreenImage() works, but it is a private API, so Apple doesn't allow it in App Store apps.
UIGraphicsBeginImageContext does not seem to work; the particles are simply missing from the resulting image.
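
For reference, the snapshot path I mean is the usual one sketched below (captureView: is just an illustrative helper name, not part of any API). It returns an image of the view, but the emitter particles are absent from it:

- (UIImage *)captureView:(UIView *)view {
    // Open an image context matching the view's size, at the screen's scale.
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);

    // renderInContext: walks the layer tree; the emitter's particles are
    // generated in the render tree, so they never make it into the output.
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}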

EDIT: Here is the code I'm currently using to capture the view. I'm actually recording a 30-second video of the screen, using the code provided by aroth here. It works by taking 25 snapshots per second of itself (a subclass of UIView) and its subviews (in our case, including the UIView whose layer is the CAEmitterLayer), and uses AVAssetWriter to compose the recording. There's quite a lot of code, so I'm only putting the relevant lines here. I converted the code to ARC with Xcode's conversion tool, so the memory management may differ slightly from the original.

- (CGContextRef)createBitmapContextOfSize:(CGSize)size {
    CGContextRef context = NULL;
    CGColorSpaceRef colorSpace;
    int bitmapByteCount;
    int bitmapBytesPerRow;

    bitmapBytesPerRow = (size.width * 4);
    bitmapByteCount = (bitmapBytesPerRow * size.height);

    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        fprintf(stderr, "Memory not allocated!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }

    context = CGBitmapContextCreate(bitmapData,
                                    size.width,
                                    size.height,
                                    8, // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
    if (context == NULL) {
        free(bitmapData);
        fprintf(stderr, "Context not created!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }
    CGContextSetAllowsAntialiasing(context, NO);
    CGColorSpaceRelease(colorSpace);
    return context;
}

//static int frameCount = 0; //debugging

- (void)drawRect:(CGRect)rect {
    NSDate *start = [NSDate date];
    CGContextRef context = [self createBitmapContextOfSize:self.frame.size];

    // CG bitmap contexts use a flipped coordinate system relative to UIKit,
    // so flip vertically; otherwise the image renders upside-down and mirrored.
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);

    [self.layer renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *background = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    self.currentScreen = background;

    //debugging
    //if (frameCount < 40) {
    //    NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    //    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    //    [UIImagePNGRepresentation(self.currentScreen) writeToFile:pngPath atomically:YES];
    //    frameCount++;
    //}

    // NOTE: to record a scroll view while it is scrolling you need to implement
    // your UIScrollViewDelegate such that it calls 'setNeedsDisplay' on the
    // ScreenCaptureView.
    if (_recording) {
        float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
        [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
    }

    float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
    float delayRemaining = (1.0 / self.frameRate) - processingSeconds;

    CGContextRelease(context);

    //redraw at the specified framerate
    [self performSelector:@selector(setNeedsDisplay)
               withObject:nil
               afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}
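
For completeness, writeVideoFrameAtTime: is one of the parts I left out; in aroth's recorder it pushes the captured frame into the asset writer. Roughly (a sketch only, assuming ivars named videoWriterInput and avAdaptor for the AVAssetWriterInput and its AVAssetWriterInputPixelBufferAdaptor), it does something like this:

- (void)writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        return; // drop the frame if the writer can't keep up
    }

    // Grab a pixel buffer from the adaptor's pool and draw the captured
    // UIImage into it.
    CGImageRef cgImage = [self.currentScreen CGImage];
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, avAdaptor.pixelBufferPool, &pixelBuffer);

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 CGImageGetWidth(cgImage),
                                                 CGImageGetHeight(cgImage),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(cgImage), CGImageGetHeight(cgImage)),
                       cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Append the frame at the elapsed-time timestamp computed in drawRect:.
    [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
    CVPixelBufferRelease(pixelBuffer);
}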

Hope this helps. Thanks for your support!

1 answer

Have you tried using -[CALayer renderInContext:]?
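
As the code above shows, the view is already rendered with renderInContext:, and the particles still don't appear. One thing to try (a sketch only, with no guarantee it picks up the particles, since Apple documents renderInContext: as drawing from the layer tree rather than the on-screen render tree) is rendering the layer's presentationLayer, which reflects the in-flight animation state:

// Render the presentation layer, which holds the current animation state,
// instead of the model layer. Emitter particles may still be missing,
// because they are generated in the render tree.
CALayer *presentation = [self.layer presentationLayer];
if (presentation != nil) {
    [presentation renderInContext:context];
} else {
    [self.layer renderInContext:context];
}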


Source: https://habr.com/ru/post/922056/

