Can I use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?

I want to record video and, at the same time, process the captured frames in my own code.

I use AVCaptureVideoDataOutput to capture frames and AVCaptureMovieFileOutput to record video. Each works fine on its own, but when I use them simultaneously it doesn't work, and I get error code -12780.
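Roughly, the setup I mean looks like this (a simplified sketch in Swift; outputURL and the delegate conformances are assumed, and the camera input setup is omitted):

  let captureSession = AVCaptureSession()
  // ... the camera added as an AVCaptureDeviceInput ...

  // for handing frames to my own code
  let dataOutput = AVCaptureVideoDataOutput()
  dataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
  captureSession.addOutput(dataOutput)

  // for recording to a file
  let movieOutput = AVCaptureMovieFileOutput()
  captureSession.addOutput(movieOutput)

  captureSession.startRunning()
  // recording while the data output is attached is where -12780 shows up for me
  movieOutput.startRecording(toOutputFileURL: outputURL, recordingDelegate: self)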

I searched for this problem but found no answer. Has anyone had the same experience, or can anyone explain it? It has been bothering me for a while.

Thanks.

iphone image-processing video-capture avfoundation
Feb 09 '11 at 11:14
2 answers

I can't answer the question as asked, but I have successfully recorded video and captured frames at the same time, using:

  • AVCaptureSession and AVCaptureVideoDataOutput for routing frames to my own code
  • AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor for writing the frames out to an H.264-encoded file

That's without investigating audio. I end up getting CMSampleBuffers from the capture session and pushing them into the pixel buffer adaptor.

EDIT: so my code looks more or less like this, with the bits you're having no problem with skimmed over, and ignoring scope issues:

  /* to ensure I'm given incoming CMSampleBuffers */
  AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
  captureSession.sessionPreset = AVCaptureSessionPreset640x480; // set your preferred preset/etc

  AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
  AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
  [captureSession addInput:deviceInput];

  /* an output for the 32BGRA pixel format, with me as the delegate and a suitable dispatch queue affixed */
  AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
  output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                     forKey:(id)kCVPixelBufferPixelFormatTypeKey];
  [output setSampleBufferDelegate:self queue:dispatch_queue_create("frameQueue", NULL)];
  [captureSession addOutput:output];

  /* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
  NSDictionary *outputSettings =
      [NSDictionary dictionaryWithObjectsAndKeys:
          [NSNumber numberWithInt:640], AVVideoWidthKey,
          [NSNumber numberWithInt:480], AVVideoHeightKey,
          AVVideoCodecH264, AVVideoCodecKey,
          nil];

  AVAssetWriterInput *assetWriterInput =
      [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                         outputSettings:outputSettings];

  /* I'm going to push pixel buffers to it, so will need an
     AVAssetWriterInputPixelBufferAdaptor, expecting the same 32BGRA input as
     I've asked the AVCaptureVideoDataOutput to supply */
  AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
      [[AVAssetWriterInputPixelBufferAdaptor alloc]
          initWithAssetWriterInput:assetWriterInput
          sourcePixelBufferAttributes:
              [NSDictionary dictionaryWithObjectsAndKeys:
                  [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                  nil]];

  /* that's going to go somewhere; I imagine you've got the URL for that sorted,
     so create a suitable asset writer. We'll put our H.264 within the normal
     MPEG4 container */
  NSError *error = nil;
  AVAssetWriter *assetWriter =
      [[AVAssetWriter alloc] initWithURL:URLFromSomwhere
                                fileType:AVFileTypeMPEG4
                                   error:&error]; // check error in real code; this example is too lazy

  [assetWriter addInput:assetWriterInput];

  /* we need to warn the input to expect real-time data incoming, so that it
     tries to avoid being unavailable at inopportune moments */
  assetWriterInput.expectsMediaDataInRealTime = YES;

  ... eventually ...

  [assetWriter startWriting];
  [assetWriter startSessionAtSourceTime:kCMTimeZero];
  [captureSession startRunning];

  ... elsewhere ...

  - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
  {
      CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

      // a very dense way to keep track of the time at which this frame
      // occurs relative to the output stream, but it's just an example!
      static int64_t frameNumber = 0;
      if (assetWriterInput.readyForMoreMediaData)
          [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                           withPresentationTime:CMTimeMake(frameNumber, 25)];
      frameNumber++;
  }

  ... and, to stop, ensuring the output file is finished properly ...

  [captureSession stopRunning];
  [assetWriter finishWriting];
Feb 09 '11 at 12:03

This is a Swift version of Tommy's answer.

  // Set up the capture session
  // Add the inputs
  // Add the outputs

  let outputSettings: [String : Any] = [AVVideoCodecKey : AVVideoCodecH264,
                                        AVVideoWidthKey : 640,
                                        AVVideoHeightKey : 480]

  let assetWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
                                            outputSettings: outputSettings)

  let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
      assetWriterInput: assetWriterInput,
      sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_32BGRA)])

  let assetWriter = try AVAssetWriter(outputURL: URLFromSomwhere, fileType: AVFileTypeMPEG4)
  assetWriter.add(assetWriterInput)
  assetWriterInput.expectsMediaDataInRealTime = true

  assetWriter.startWriting()
  assetWriter.startSession(atSourceTime: kCMTimeZero)
  captureSession.startRunning()

  // a very dense way to keep track of the time at which this frame occurs
  // relative to the output stream, but it's just an example! (a property
  // rather than a local, so it isn't reset on every callback)
  var frameNumber: Int64 = 0

  func captureOutput(_ captureOutput: AVCaptureOutput,
                     didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                     from connection: AVCaptureConnection) {
      guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
      if assetWriterInput.isReadyForMoreMediaData {
          pixelBufferAdaptor.append(imageBuffer, withPresentationTime: CMTimeMake(frameNumber, 25))
      }
      frameNumber += 1
  }

  // ... and, to stop, ensuring the output file is finished properly ...
  captureSession.stopRunning()
  assetWriter.finishWriting { /* the file is complete once this fires */ }
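As a possible refinement (an untested sketch, reusing the assetWriter, assetWriterInput and pixelBufferAdaptor names from above): rather than a synthetic frame counter, you could reuse each buffer's own presentation timestamp and anchor the writer's session to the first frame you actually receive:

  var sessionStarted = false

  func captureOutput(_ captureOutput: AVCaptureOutput,
                     didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                     from connection: AVCaptureConnection) {
      // the capture pipeline stamps every buffer; reuse that instead of counting frames
      let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
      if !sessionStarted {
          // anchor the writer's timeline to the first frame we actually see,
          // instead of calling startSession(atSourceTime: kCMTimeZero) up front
          assetWriter.startSession(atSourceTime: presentationTime)
          sessionStarted = true
      }
      guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
            assetWriterInput.isReadyForMoreMediaData else { return }
      pixelBufferAdaptor.append(imageBuffer, withPresentationTime: presentationTime)
  }

That keeps the recorded frame timing honest even if the camera delivers frames unevenly.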

I do not guarantee 100 percent accuracy, because I am new to Swift.

Jan 05 '17 at 11:14


