AVAssetWriterInput - black screen, 46 hours

I am trying to take an AVCaptureSession and encode it to mp4. It seems like this should be simple: I'm encoding a single 960x540 video stream, and I'm not worried about audio for this problem.

When I run the following code and grab out2.mp4 from the document container using Xcode, I get a black screen in QuickTime and the duration is 46 hours. At least the resolution looks right. Here is the output from ffmpeg -i out2.mp4:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'out2.mp4':
      Metadata:
        major_brand      : mp42
        minor_version    : 1
        compatible_brands: mp41mp42isom
        creation_time    : 2015-11-18 01:25:55
      Duration: 46:43:04.21, start: 168178.671667, bitrate: 0 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt709/bt709), 960x540, 1860 kb/s, 27.65 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
        Metadata:
          creation_time    : 2015-11-18 01:25:55
          handler_name     : Core Media Video

Why can't I add sample buffers to AVAssetWriterInput in this code?

    var videoInput: AVAssetWriterInput?
    var assetWriter: AVAssetWriter?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.startStream()
        NSTimer.scheduledTimerWithTimeInterval(5, target: self, selector: "swapSegment", userInfo: nil, repeats: false)
    }

    func swapSegment() {
        assetWriter?.finishWritingWithCompletionHandler() {
            print("File written")
        }
        videoInput = nil
    }

    func pathForOutput() -> String {
        let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
        if let documentDirectory: NSURL = urls.first {
            let fileUrl = documentDirectory.URLByAppendingPathComponent("out1.mp4")
            return fileUrl.path!
        }
        return ""
    }

    func startStream() {
        assetWriter = try! AVAssetWriter(URL: NSURL(fileURLWithPath: self.pathForOutput()), fileType: AVFileTypeMPEG4)
        let videoSettings: [String: AnyObject] = [AVVideoCodecKey: AVVideoCodecH264,
                                                  AVVideoWidthKey: 960,
                                                  AVVideoHeightKey: 540]
        videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
        videoInput!.expectsMediaDataInRealTime = true
        assetWriter?.addInput(videoInput!)
        assetWriter!.startWriting()
        assetWriter!.startSessionAtSourceTime(kCMTimeZero)

        let videoHelper = VideoHelper()
        videoHelper.delegate = self
        videoHelper.startSession()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
        if let videoOutput = captureOutput as? AVCaptureVideoDataOutput {
            videoInput?.appendSampleBuffer(sampleBuffer)
        }
    }
1 answer

The presentation timestamps of your sample buffers probably don't match your source time (kCMTimeZero). Capture buffers are timestamped against the device's host clock, so the first buffer arrives with a timestamp far past zero, and the writer treats everything from zero up to that timestamp as part of the movie. Use the presentation timestamp of the first buffer as the source time instead.

P.S. 46 hours is probably about how long your device has been running since boot: the start offset of 168178.67 seconds in the ffmpeg output is roughly 46 hours 43 minutes, which matches the reported duration.
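A minimal sketch of the fix, in the same Swift 2 era style as the question's code: remove the startSessionAtSourceTime(kCMTimeZero) call from startStream() and instead start the session lazily in the capture callback, using the first buffer's presentation timestamp. The sessionStarted flag and the readyForMoreMediaData guard are additions of mine, not part of the original code; the guard is a standard safeguard against appending while the input is busy.

```swift
import AVFoundation

var sessionStarted = false

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    // Timestamps from the capture session are based on the host clock,
    // so anchor the writer's timeline to the first buffer we actually see.
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if !sessionStarted {
        assetWriter?.startSessionAtSourceTime(pts)
        sessionStarted = true
    }
    // Only append when the input can accept data; dropping a frame is
    // preferable to blocking the capture queue in real-time mode.
    if videoInput?.readyForMoreMediaData == true {
        videoInput?.appendSampleBuffer(sampleBuffer)
    }
}
```

With this change the movie's timeline starts at the first frame instead of spanning the ~46 hours between zero and the device's current host time.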


Source: https://habr.com/ru/post/1236199/
