Pre-rendered Core Graphics animation does not animate smoothly and burns through memory

I am posting this question as a follow-up to one of the answers to my previous question: Several CALayer masks causing performance issues

So now, when I try the pre-rendered animation approach, I still can't get a smooth animation. Worse, when run on an actual device, the app periodically crashes due to memory problems.

You can see the animation running here: http://cl.ly/e3Qu (it may not look that bad in the video, but watch the leading edge of the animation; it is worse on the device itself).

Here is my code:

static CGFloat const animationDuration = 1.5;
static CGFloat const calculationRate = (1.0/40.0); // 40fps max.
static CGFloat const calculationCycles = animationDuration/calculationRate;

@implementation splashView {
    CADisplayLink* l;
    CGImageRef backgroundImg;
    UIColor* color;
    NSMutableArray* animationImages;
    NSTimeInterval currentTime;
}

-(void) beginAnimating {
    static dispatch_once_t d;
    dispatch_once(&d, ^{
        CGFloat totalDistance = 0;
        CGFloat screenProgress = 0;
        CGFloat deltaScreenProgress = 0;
        totalDistance = screenHeight()+screenWidth();

        color = [[lzyColors colors] randomColor];
        backgroundImg = textBG(color, screenSize()).CGImage;
        animationImages = [NSMutableArray array];

        NSLog(@"start");

        UIGraphicsBeginImageContextWithOptions(screenSize(), YES, 0);
        CGContextRef c = UIGraphicsGetCurrentContext();

        // Pre-render every frame of the wipe into a full-screen image up front.
        for (int i = 0; i <= (calculationCycles+1); i++) {
            UIImage* img = lzyCGImageFromDrawing(^{
                CGFloat height = screenHeight();
                CGFloat width = screenWidth();

                // Build the clipping path for this frame's slice of the wipe.
                CGMutablePathRef p = CGPathCreateMutable();
                CGPoint startingPoint = [self pointBForProgress:screenProgress];
                CGPathMoveToPoint(p, nil, startingPoint.x, startingPoint.y);
                lzyCGPathAddLineToPath(p, [self pointAForProgress:screenProgress]);

                if ((width < screenProgress) && (screenProgress-deltaScreenProgress) < width) {
                    lzyCGPathAddLineToPath(p, (CGPoint){width, 0});
                }
                if (deltaScreenProgress != 0)
                    lzyCGPathAddLineToPath(p, [self pointAForProgress:screenProgress-deltaScreenProgress-1]);
                if (deltaScreenProgress != 0)
                    lzyCGPathAddLineToPath(p, [self pointBForProgress:screenProgress-deltaScreenProgress-1]);
                if ((height < screenProgress) && (screenProgress-deltaScreenProgress) < height) {
                    lzyCGPathAddLineToPath(p, (CGPoint){0, height});
                }
                CGPathCloseSubpath(p);

                CGContextAddPath(c, p);
                CGContextClip(c);
                CGPathRelease(p);

                CGContextSetFillColorWithColor(c, color.CGColor);
                CGContextFillRect(c, self.bounds);
                CGContextDrawImage(c, self.bounds, backgroundImg);
            });

            [animationImages addObject:img];

            deltaScreenProgress = screenProgress;
            screenProgress = (i*totalDistance)/calculationCycles;
            deltaScreenProgress = screenProgress-deltaScreenProgress;
        }

        NSLog(@"stop");

        currentTime = 0;
        l = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkDidFire)];
        [l addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
    });
}

-(void) displayLinkDidFire {
    NSTimeInterval deltaTime = l.duration;
    currentTime += deltaTime;

    if (currentTime <= animationDuration) {
        // Show the pre-rendered frame that corresponds to the elapsed time.
        CGFloat prg = (currentTime/animationDuration);
        NSInteger image = roundf(([animationImages count]-1)*prg);

        [CATransaction begin];
        [CATransaction setDisableActions:YES];
        self.layer.contents = (__bridge id _Nullable)(((UIImage*)[animationImages objectAtIndex:image]).CGImage);
        [CATransaction commit];
    } else {
        // Animation finished: show the last frame and release the cache.
        [CATransaction begin];
        [CATransaction setDisableActions:YES];
        self.layer.contents = (__bridge id _Nullable)(((UIImage*)[animationImages lastObject]).CGImage);
        [CATransaction commit];

        [l invalidate];
        animationImages = nil;
    }
}

-(CGPoint) pointAForProgress:(CGFloat)progressVar {
    CGFloat width = screenWidth();
    return (CGPoint){(progressVar<width)?progressVar:width+1, (progressVar>width)?progressVar-width:-1};
}

-(CGPoint) pointBForProgress:(CGFloat)progressVar {
    CGFloat height = screenHeight();
    return (CGPoint){(progressVar>height)?(progressVar-height):-1, (progressVar<height)?progressVar:height+1};
}

@end

The textBG() function just does some fairly simple Core Graphics drawing to produce a background image.
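For reference, textBG() is roughly a helper of this shape (a simplified placeholder, not the actual drawing code):

static UIImage *textBG(UIColor *tint, CGSize size) {
    // Placeholder: the real helper draws more than a flat fill, but it is
    // plain Core Graphics drawing into an image context like this.
    UIGraphicsBeginImageContextWithOptions(size, YES, 0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetFillColorWithColor(ctx, tint.CGColor);
    CGContextFillRect(ctx, (CGRect){CGPointZero, size});
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}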

I can only assume that I am doing something fundamentally wrong here, but I can't think of what it is.

Any suggestions for improving performance and reducing memory consumption (without impairing the quality of the animation)?

2 answers

Animating full-screen images through a layer's contents is always going to have performance and memory problems, especially on @3x devices. For the animation you show in your other question (this video), it doesn't look like you actually need any masking at all: create a series of rectangular, solid-colored layers (black, light purple, medium purple, dark purple), stack them from front to back (with the text layer between the light and medium ones), rotate them to the desired angle, and animate their positions as needed; a rough sketch of that setup follows.
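A rough sketch of that layer setup (the colors, angle, and timing are placeholders I made up, and buildSplashLayersInView: is just an illustrative name, not something from your project):

#import <QuartzCore/QuartzCore.h>

// Sketch of the stacked-layer approach. A text layer could be inserted
// between the light and medium stripes in the same way.
- (void)buildSplashLayersInView:(UIView *)container {
    NSArray<UIColor *> *colors = @[[UIColor blackColor],
                                   [UIColor colorWithRed:0.85 green:0.75 blue:0.95 alpha:1],
                                   [UIColor colorWithRed:0.60 green:0.45 blue:0.85 alpha:1],
                                   [UIColor colorWithRed:0.35 green:0.20 blue:0.60 alpha:1]];
    CGFloat angle = M_PI_4; // placeholder angle for the diagonal edge
    CGRect bounds = container.bounds;
    // Oversize each layer so it still covers the screen after rotation.
    CGFloat side = 2 * MAX(bounds.size.width, bounds.size.height);

    [colors enumerateObjectsUsingBlock:^(UIColor *c, NSUInteger i, BOOL *stop) {
        CALayer *stripe = [CALayer layer];
        stripe.backgroundColor = c.CGColor;
        stripe.bounds = CGRectMake(0, 0, side, side);
        stripe.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        stripe.transform = CATransform3DMakeRotation(angle, 0, 0, 1);
        [container.layer insertSublayer:stripe atIndex:0]; // later colors go behind

        // Slide each stripe off-screen along the diagonal, slightly staggered.
        CABasicAnimation *slide = [CABasicAnimation animationWithKeyPath:@"position"];
        slide.toValue = [NSValue valueWithCGPoint:
                         CGPointMake(stripe.position.x + side * cos(angle),
                                     stripe.position.y - side * sin(angle))];
        slide.duration = 1.5;
        slide.beginTime = CACurrentMediaTime() + 0.1 * i;
        slide.fillMode = kCAFillModeForwards;
        slide.removedOnCompletion = NO;
        [stripe addAnimation:slide forKey:@"slide"];
    }];
}

Because each stripe is a plain solid-colored CALayer, the compositor animates it on the GPU with essentially no memory cost, unlike a per-frame bitmap cache.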

If you need a more complex animation that this approach cannot handle, or more generally want to animate arbitrary full-screen content, you will need to either (1) pre-render it as a video (offline or with the AVFoundation APIs) and play that back, or (2) draw it with OpenGL or Metal. Playback of a pre-rendered clip is cheap; a sketch of option (1) follows.
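A minimal playback sketch, assuming the rendered clip ships in the app bundle as splash.mp4 (a made-up file name) and the code runs inside a view controller:

#import <AVFoundation/AVFoundation.h>

// Play a pre-rendered splash clip instead of compositing frames at runtime.
NSURL *clipURL = [[NSBundle mainBundle] URLForResource:@"splash" withExtension:@"mp4"];
AVPlayer *player = [AVPlayer playerWithURL:clipURL];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:playerLayer];
[player play];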


Your animation logic is poorly designed: it allocates a whole pile of full-screen images in memory at once, in a way that will sooner or later crash the device. You need to start with an approach that does not pull all of the image data into memory at the same time; don't just try to tweak what you already have, because the basic assumptions baked into the existing code are simply incorrect. See my earlier answer to a similar question for some useful links, such as "video-and-memory-use-on-ios-devices": fooobar.com/questions/1236037/... A sketch of the on-demand idea follows.
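A minimal sketch of that idea, assuming the frames have already been rendered to disk as individual image files (the class and file layout here are made up for illustration); only the frame currently on screen is decoded:

#import <UIKit/UIKit.h>

@interface FrameSequenceView : UIView
@property (nonatomic, copy) NSArray<NSString *> *framePaths; // paths to pre-rendered frames on disk
@end

@implementation FrameSequenceView {
    CADisplayLink *link;
    NSInteger frameIndex;
}

- (void)start {
    frameIndex = 0;
    link = [CADisplayLink displayLinkWithTarget:self selector:@selector(tick)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)tick {
    if (frameIndex >= (NSInteger)self.framePaths.count) {
        [link invalidate];
        return;
    }
    // imageWithContentsOfFile: bypasses the system image cache, so the
    // previous frame's bitmap can be released as soon as it is replaced.
    UIImage *frame = [UIImage imageWithContentsOfFile:self.framePaths[frameIndex]];
    self.layer.contents = (__bridge id)frame.CGImage;
    frameIndex++;
}

@end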


Source: https://habr.com/ru/post/1236016/

