I can't answer the question as asked, but I have successfully recorded video and captured frames at the same time using:

- `AVCaptureSession` and `AVCaptureVideoDataOutput`, to route frames into my own code
- `AVAssetWriter`, `AVAssetWriterInput` and `AVAssetWriterInputPixelBufferAdaptor`, to write the frames out to an H.264-encoded movie file
That's without having investigated sound. I end up getting `CMSampleBuffer`s from the capture session and pushing them into the pixel buffer adaptor.
EDIT: so my code ends up looking more or less like this, with the bits you presumably have no problem with elided, and ignoring issues of scope:
```objc
AVCaptureSession *captureSession = ...;   // alloc/init, set your preferred preset, etc.
AVCaptureDevice *captureDevice = ...;     // default device for video, probably
AVCaptureDeviceInput *deviceInput = ...;  // input with the device above; attach it to the session
AVCaptureVideoDataOutput *output = ...;   // output for the 32BGRA pixel format, with self as the
                                          // delegate and a suitable dispatch queue attached

NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:640], AVVideoWidthKey,
        [NSNumber numberWithInt:480], AVVideoHeightKey,
        AVVideoCodecH264, AVVideoCodecKey,
        nil];

AVAssetWriterInput *assetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];

AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
    [[AVAssetWriterInputPixelBufferAdaptor alloc]
        initWithAssetWriterInput:assetWriterInput
        sourcePixelBufferAttributes:
            [NSDictionary dictionaryWithObjectsAndKeys:
                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                kCVPixelBufferPixelFormatTypeKey,
                nil]];

NSError *error = nil;
AVAssetWriter *assetWriter =
    [[AVAssetWriter alloc] initWithURL:URLFromSomewhere
                              fileType:AVFileTypeMPEG4
                                 error:&error];
// you need to check the error conditions here; this example is too lazy to

[assetWriter addInput:assetWriterInput];
assetWriterInput.expectsMediaDataInRealTime = YES;

// ... eventually ...

[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];

// ... elsewhere ...

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // append the frame to the movie; a crude frame counter at an assumed
    // 25 fps stands in for a proper presentation timestamp here
    static int64_t frameNumber = 0;
    if (assetWriterInput.readyForMoreMediaData)
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
    frameNumber++;
}
```
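Once you have the `CVImageBufferRef` in the delegate callback, you can also grab a still frame for your own processing. As a sketch (not part of the original answer), assuming the 32BGRA pixel format configured above, you could wrap the pixel buffer's base address in a `CGBitmapContext` and pull out a `UIImage`:

```objc
// Hedged sketch: convert the 32BGRA CVImageBufferRef delivered to the
// delegate into a UIImage; assumes the pixel format configured above.
CVPixelBufferLockBaseAddress(imageBuffer, 0);

void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

// BGRA corresponds to a little-endian 32-bit layout with the
// (premultiplied) alpha in the first component
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
    bytesPerRow, colorSpace,
    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef cgImage = CGBitmapContextCreateImage(context);
UIImage *image = [UIImage imageWithCGImage:cgImage];

CGImageRelease(cgImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
```

Note that the delegate runs on the dispatch queue you attached to the output, so anything you do with `image` on the UI should be bounced back to the main queue.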
Tommy Feb 09 '11 at 12:03