Why doesn't my image refresh when I set it from the capture output delegate?

I am trying to do something very simple: display the video preview layer full screen, and once per second update a UIImage with the CMSampleBufferRef received at that moment. However, I ran into two different problems. The first is that setting:

[connection setVideoMaxFrameDuration:CMTimeMake(1, 1)];
[connection setVideoMinFrameDuration:CMTimeMake(1, 1)];

also changes the video preview layer. I thought it would only change the rate at which AVFoundation sends frames to the delegate, but it seems to affect the entire session (which, in hindsight, makes sense), so my video preview also updates only once per second. I think I could omit these lines and just add a timer in the delegate, so that once per second it passes a CMSampleBufferRef on to another method for processing. But I don't know if this is the right approach.

My second problem is that the UIImageView is NOT updated, or sometimes it updates just once and then never changes again. I use this method to update it:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    //NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    [imageView setImage:image];
    // Add your code here that uses the image.
    NSLog(@"update");
}

I took this from Apple's examples. The method is called correctly once per second; I verified that by watching for the "update" log message. But the image does not change at all. Also, is sampleBuffer destroyed automatically, or do I need to release it?

Here are the two other relevant methods. viewDidLoad:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    session = [[AVCaptureSession alloc] init];

    // Add inputs and outputs.
    if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        session.sessionPreset = AVCaptureSessionPreset640x480;
    } else {
        // Handle the failure.
        NSLog(@"Cannot set session preset to 640x480");
    }

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"Could not create input: %@", error);
    }
    if ([session canAddInput:input]) {
        [session addInput:input];
    } else {
        // Handle the failure.
        NSLog(@"Could not add input");
    }

    // DATA OUTPUT
    dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    if ([session canAddOutput:dataOutput]) {
        [session addOutput:dataOutput];

        dataOutput.videoSettings =
            [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                        forKey:(id)kCVPixelBufferPixelFormatTypeKey];

        //dataOutput.minFrameDuration = CMTimeMake(1, 15);
        //dataOutput.minFrameDuration = CMTimeMake(1, 1);

        AVCaptureConnection *connection = [dataOutput connectionWithMediaType:AVMediaTypeVideo];
        [connection setVideoMaxFrameDuration:CMTimeMake(1, 1)];
        [connection setVideoMinFrameDuration:CMTimeMake(1, 1)];
    } else {
        // Handle the failure.
        NSLog(@"Could not add output");
    }
    // DATA OUTPUT END

    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [dataOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
    [captureVideoPreviewLayer setBounds:videoLayer.layer.bounds];
    [captureVideoPreviewLayer setPosition:videoLayer.layer.position];
    [videoLayer.layer addSublayer:captureVideoPreviewLayer];

    [session startRunning];
}

Converting the CMSampleBufferRef to a UIImage:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

Thanks in advance for any help you can give me.

1 answer

From the documentation for the captureOutput:didOutputSampleBuffer:fromConnection: method:

This method is called on the dispatch queue specified by the output's sampleBufferCallbackQueue property.

This means that if you need to update the user interface from this method, you have to do it on the main queue, like this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        [imageView setImage:image];
    });
}
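As for releasing the buffer: you do not own the sampleBuffer passed to this delegate method, so you must not release it; it is only guaranteed to stay valid for the duration of the callback. Only if you want to use it after the method returns do you need to retain it yourself, roughly like this (a sketch; processingQueue is just an illustrative name for a queue of your own):

// Keep the buffer alive beyond the callback by retaining it, and
// balance that retain once the asynchronous work is done.
CFRetain(sampleBuffer);
dispatch_async(processingQueue, ^{
    // ... work with sampleBuffer here ...
    CFRelease(sampleBuffer);
});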

EDIT: Regarding your first question: I'm not sure I understand the problem, but if you want to update the image only once per second, you can also keep a "lastImageUpdateTime" value, compare it against the current time in the didOutputSampleBuffer method, and only update the image if enough time has passed, ignoring the sample buffer otherwise.
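A minimal sketch of that suggestion, assuming a CFAbsoluteTime instance variable named lastImageUpdateTime initialized to 0 (the name is just illustrative):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Drop the frame unless at least one second has passed since the
    // last image update; the session keeps running at full frame rate.
    CFAbsoluteTime now = CFAbsoluteTimeGetCurrent();
    if (now - lastImageUpdateTime < 1.0) {
        return;
    }
    lastImageUpdateTime = now;

    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        [imageView setImage:image];
    });
}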


Source: https://habr.com/ru/post/1399151/

