iPhone video filtering is slow

I am trying to filter live video on the iPhone. Here is my program structure and source code:

AppDelegate.h, AppDelegate.m, ViewController.h, ViewController.m

The AppDelegate files are the same as the defaults. Here is my ViewController:

    //ViewController.h
    #import <UIKit/UIKit.h>
    #import <GLKit/GLKit.h>
    #import <AVFoundation/AVFoundation.h>
    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>
    #import <QuartzCore/QuartzCore.h>
    #import <CoreImage/CoreImage.h>
    #import <ImageIO/ImageIO.h>

    @interface ViewController : GLKViewController <AVCaptureVideoDataOutputSampleBufferDelegate> {
        AVCaptureSession *avCaptureSession;
        CIContext *coreImageContext;
        CIImage *maskImage;
        CGSize screenSize;
        CGContextRef cgContext;
        GLuint _renderBuffer;
        float scale;
    }

    @property (strong, nonatomic) EAGLContext *context;

    - (void)setupCGContext;

    @end

    // ViewController.m
    #import "ViewController.h"

    @implementation ViewController

    @synthesize context;

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        if (!self.context) {
            NSLog(@"Failed to create ES context");
        }

        GLKView *view = (GLKView *)self.view;
        view.context = self.context;
        view.drawableDepthFormat = GLKViewDrawableDepthFormat24;

        coreImageContext = [CIContext contextWithEAGLContext:self.context];

        glGenRenderbuffers(1, &_renderBuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);

        NSError *error;
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [dataOutput setAlwaysDiscardsLateVideoFrames:YES];
        [dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                  forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
        [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

        avCaptureSession = [[AVCaptureSession alloc] init];
        [avCaptureSession beginConfiguration];
        [avCaptureSession setSessionPreset:AVCaptureSessionPreset1280x720];
        [avCaptureSession addInput:input];
        [avCaptureSession addOutput:dataOutput];
        [avCaptureSession commitConfiguration];
        [avCaptureSession startRunning];

        [self setupCGContext];
        CGImageRef cgImg = CGBitmapContextCreateImage(cgContext);
        maskImage = [CIImage imageWithCGImage:cgImg];
        CGImageRelease(cgImg);
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        image = [CIFilter filterWithName:@"CISepiaTone"
                           keysAndValues:kCIInputImageKey, image,
                                         @"inputIntensity", [NSNumber numberWithFloat:0.8], nil].outputImage;

        [coreImageContext drawImage:image atPoint:CGPointZero fromRect:[image extent]];
        [self.context presentRenderbuffer:GL_RENDERBUFFER];
    }

    - (void)setupCGContext
    {
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        NSUInteger bytesPerPixel = 4;
        NSUInteger bytesPerRow = bytesPerPixel * screenSize.width;
        NSUInteger bitsPerComponent = 8;
        cgContext = CGBitmapContextCreate(NULL, screenSize.width, screenSize.height,
                                          bitsPerComponent, bytesPerRow, colorSpace,
                                          kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colorSpace);
    }

The sepia filter works, but the video is noticeably slower. When I do not apply the filter, the video runs at normal speed. Any idea how I can speed this up?

Thanks.

3 answers

As I describe here, the sepia filter in Core Image was not quite able to run in real time, but other filters might be. It depends on the hardware capabilities of the target device, as well as on the iOS version (Core Image performance has improved significantly over the last several iOS releases).
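
One rough way to see whether a given filter keeps up on a particular device is to time the draw call inside the delegate callback from the question. A minimal sketch (the timing code is added here for illustration only):

    // Rough per-frame timing dropped into the existing captureOutput: callback.
    // CACurrentMediaTime() comes from QuartzCore, which the question already imports.
    CFTimeInterval start = CACurrentMediaTime();

    [coreImageContext drawImage:image atPoint:CGPointZero fromRect:[image extent]];
    [self.context presentRenderbuffer:GL_RENDERBUFFER];

    CFTimeInterval elapsed = CACurrentMediaTime() - start;
    // Anything consistently above ~33 ms means this filter cannot hold 30 FPS on this device.
    NSLog(@"Filter draw + present took %.1f ms", elapsed * 1000.0);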

However, to plug my own open source framework once again, GPUImage lets you do this much, much faster. It can apply a sepia tone filter to a 640x480 frame of camera video in 2.5 ms on an iPhone 4, which is more than fast enough for the 30 FPS video from that camera.

The following code will filter video from the rear camera of an iOS device and display it in a portrait orientation:

    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                      cameraPosition:AVCaptureDevicePositionBack];
    sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRight];

    [videoCamera addTarget:rotationFilter];
    [rotationFilter addTarget:sepiaFilter];

    filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:filterView];
    [sepiaFilter addTarget:filterView];

    [videoCamera startCameraCapture];

I understand this is an old question, but ...

 [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()]; 

this line has your video callbacks delivered on the main (UI) thread.

If you change it to something like:

 [dataOutput setSampleBufferDelegate:self queue:dispatch_queue_create("cQ", DISPATCH_QUEUE_SERIAL)]; 

Then in your callback, if you need to update your interface, you should do:

    dispatch_async(dispatch_get_main_queue(), ^{
        [coreImageContext drawImage:image atPoint:CGPointZero fromRect:[image extent]];
        [self.context presentRenderbuffer:GL_RENDERBUFFER];
    });

This will help a lot, since the computationally expensive work runs on a background thread and drawing the image does not interfere with the capture.
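
Putting the two changes together, the callback from the question might end up looking roughly like this (a sketch, not tested, assuming the background serial queue from above has already been set):

    // Frames now arrive on the background serial queue; only the drawing and
    // presentation are bounced back to the main thread.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        // The filtering work happens off the main thread.
        image = [CIFilter filterWithName:@"CISepiaTone"
                           keysAndValues:kCIInputImageKey, image,
                                         @"inputIntensity", [NSNumber numberWithFloat:0.8], nil].outputImage;

        dispatch_async(dispatch_get_main_queue(), ^{
            [coreImageContext drawImage:image atPoint:CGPointZero fromRect:[image extent]];
            [self.context presentRenderbuffer:GL_RENDERBUFFER];
        });
    }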

Side note:

Blindly using sample code you find on the internet, without reading up on how the technology works, is not a good way to develop applications (and a lot of people are guilty of this).


The following:

 CIFilter filterWithName:@"CISepiaTone" 

is called every time you get a buffer/frame. You only need to create the filter ONCE, so move it outside the callback and you can still use it there.
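
A minimal sketch of that change (the sepiaFilter ivar is not in the original code; it is added here for illustration):

    // Create the filter once, e.g. at the end of viewDidLoad, and keep it in an ivar.
    sepiaFilter = [CIFilter filterWithName:@"CISepiaTone"];
    [sepiaFilter setValue:[NSNumber numberWithFloat:0.8] forKey:@"inputIntensity"];

    // Then, in the per-frame callback, only swap the input image and read the output.
    [sepiaFilter setValue:image forKey:kCIInputImageKey];
    CIImage *filtered = sepiaFilter.outputImage;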


Source: https://habr.com/ru/post/905450/

