I am trying to take the output of an AVCaptureSession and encode it to MP4. It seems like this should be simple: I'm encoding a single 960x540 video stream, and I'm not worried about audio for this problem.
When I run the code below and grab out2.mp4 from the document container using Xcode, I get a black screen in QuickTime and the duration is 46 hours. At least the resolution looks right. Here is the output from `ffmpeg -i out2.mp4`:
```
Input
```
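That 46-hour duration makes me wonder about timestamps. To see what the writer is actually being fed, the presentation time of each buffer can be logged in the capture callback (shown in full below); a minimal sketch using the standard CoreMedia calls:

```swift
// Inside captureOutput(_:didOutputSampleBuffer:fromConnection:):
// log each buffer's presentation timestamp as it arrives.
let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
print("buffer pts: \(CMTimeGetSeconds(pts)) s")
```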
Why can't I append sample buffers to the AVAssetWriterInput in this code?
```swift
var videoInput: AVAssetWriterInput?
var assetWriter: AVAssetWriter?

override func viewDidLoad() {
    super.viewDidLoad()
    self.startStream()
    // Stop writing after 5 seconds.
    NSTimer.scheduledTimerWithTimeInterval(5, target: self, selector: "swapSegment", userInfo: nil, repeats: false)
}

func swapSegment() {
    assetWriter?.finishWritingWithCompletionHandler {
        print("File written")
    }
    videoInput = nil
}

func pathForOutput() -> String {
    let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    if let documentDirectory: NSURL = urls.first {
        let fileUrl = documentDirectory.URLByAppendingPathComponent("out1.mp4")
        return fileUrl.path!
    }
    return ""
}

func startStream() {
    assetWriter = try! AVAssetWriter(URL: NSURL(fileURLWithPath: self.pathForOutput()), fileType: AVFileTypeMPEG4)
    let videoSettings: [String: AnyObject] = [AVVideoCodecKey: AVVideoCodecH264,
                                              AVVideoWidthKey: 960,
                                              AVVideoHeightKey: 540]
    videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
    videoInput!.expectsMediaDataInRealTime = true
    assetWriter?.addInput(videoInput!)
    assetWriter!.startWriting()
    assetWriter!.startSessionAtSourceTime(kCMTimeZero)

    // VideoHelper wraps the AVCaptureSession setup; its delegate
    // receives the sample buffers in the callback below.
    let videoHelper = VideoHelper()
    videoHelper.delegate = self
    videoHelper.startSession()
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    if let videoOutput = captureOutput as? AVCaptureVideoDataOutput {
        videoInput?.appendSampleBuffer(sampleBuffer)
    }
}
```
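VideoHelper (not shown) just wraps the capture setup and forwards sample buffers to its delegate. For reference, here is a minimal sketch of that setup, assuming the iFrame 960x540 preset and a plain AVCaptureVideoDataOutput, which is what the rest of the code implies (the class shape and queue name are approximations):

```swift
import AVFoundation

class VideoHelper: NSObject {
    let session = AVCaptureSession()
    weak var delegate: AVCaptureVideoDataOutputSampleBufferDelegate?
    // Serial queue for sample-buffer delivery.
    let sampleQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL)

    func startSession() {
        // 960x540 matches the writer's output settings.
        session.sessionPreset = AVCaptureSessionPresetiFrame960x540

        let camera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        if let input = try? AVCaptureDeviceInput(device: camera) {
            session.addInput(input)
        }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(delegate, queue: sampleQueue)
        session.addOutput(output)

        session.startRunning()
    }
}
```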
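My best guess at the cause: the sample buffers coming off the capture session carry device-clock presentation timestamps, while I start the writer's session at kCMTimeZero, which could account for both the enormous duration and the black screen. Is that it? Here is a sketch of the variant I am considering, where startSessionAtSourceTime moves out of startStream and into the callback, keyed to the first buffer's timestamp, and appends are guarded by readyForMoreMediaData:

```swift
var sessionStarted = false

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // Start the writer session at the first buffer's timestamp
    // instead of kCMTimeZero, so the file doesn't begin with a
    // huge empty gap before the first frame.
    if !sessionStarted {
        assetWriter?.startSessionAtSourceTime(pts)
        sessionStarted = true
    }

    // Only append when the input can accept more data.
    if let input = videoInput where input.readyForMoreMediaData {
        input.appendSampleBuffer(sampleBuffer)
    }
}
```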