I have seen this question asked many times, in different forms, both here and in other forums. Some of the questions have answers, some do not, and in several cases the author claims to have succeeded. I have implemented the examples from those claimed successes, but so far I cannot reproduce the same results.
I can successfully use AVAssetWriter (with an AVAssetWriterInputPixelBufferAdaptor) to record video and audio at the same time when the sample buffers come from an AVCaptureSession. However, if I have a CGImageRef that was generated some other way and create a CVPixelBufferRef from scratch, the AVAssetWriterInputPixelBufferAdaptor method appendPixelBuffer:withPresentationTime: succeeds for the first several frames and then fails for every subsequent frame. The resulting video file is, of course, invalid.
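For context, the general shape of what I am doing is below. This is a minimal sketch, not my actual pastebin code; `adaptor` and `writerInput` stand in for an already-configured AVAssetWriterInputPixelBufferAdaptor and AVAssetWriterInput, and I am assuming a kCVPixelFormatType_32ARGB pixel buffer:

    // Sketch: draw a CGImageRef into a CVPixelBufferRef and append it.
    // Assumes writerInput/adaptor are configured for kCVPixelFormatType_32ARGB.
    - (BOOL)appendImage:(CGImageRef)image atTime:(CMTime)time {
        // Take buffers from the adaptor's pool so their format matches the input.
        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                             adaptor.pixelBufferPool,
                                                             &pixelBuffer);
        if (status != kCVReturnSuccess || pixelBuffer == NULL) return NO;

        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        void *data = CVPixelBufferGetBaseAddress(pixelBuffer);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(data,
                                                     CGImageGetWidth(image),
                                                     CGImageGetHeight(image),
                                                     8,
                                                     CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                     colorSpace,
                                                     kCGImageAlphaNoneSkipFirst);
        CGContextDrawImage(context,
                           CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                           image);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        // appendPixelBuffer: fails if the input is not ready; wait (or drop the frame).
        while (!writerInput.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.01];
        }
        BOOL ok = [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
        CVPixelBufferRelease(pixelBuffer);
        return ok;
    }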
You can see my sample code: http://pastebin.com/FCJZJmMi
The images are valid; I verified them by displaying them in the debug window (see lines 50-53). I profiled the application, and memory usage stayed low the whole time it ran; it never received any memory warnings.
As far as I can tell, I have followed the available documentation. Why does my sample code not work, and what do I need to do to fix it?
If anyone has successfully gotten AVAssetWriterInputPixelBufferAdaptor to work with their own images, please chime in.