Create CMSampleBuffer from CVPixelBuffer

I am provided with a CVPixelBuffer that I need to pass to an rtmpStream object from the lf.swift library in order to stream it to YouTube. The method signature looks like this: rtmpStream.appendSampleBuffer(sampleBuffer: CMSampleBuffer, withType: CMSampleBufferType)

So, I need to somehow convert the CVPixelBuffer to a CMSampleBuffer to append it to rtmpStream. Here is my attempt:

    var sampleBuffer: CMSampleBuffer? = nil
    var sampleTimingInfo: CMSampleTimingInfo = kCMTimingInfoInvalid
    sampleTimingInfo.presentationTimeStamp = presentationTime

    var formatDesc: CMVideoFormatDescription? = nil
    _ = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc)

    if let formatDesc = formatDesc {
        CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                                 pixelBuffer,
                                                 formatDesc,
                                                 &sampleTimingInfo,
                                                 &sampleBuffer)
    }

    if let sampleBuffer = sampleBuffer {
        self.rtmpStream.appendSampleBuffer(sampleBuffer, withType: CMSampleBufferType.video)
    }

Unfortunately, this does not work. The streaming library itself is tested and works great when I stream camera input or a screen capture. I suspect the problem is sampleTimingInfo, because it requires a decodeTimeStamp and a duration, and I don't know how to obtain those for the provided CVPixelBuffer.
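For reference, the timing fields can be filled in explicitly rather than left at kCMTimingInfoInvalid. The sketch below is an assumption-laden variant of the code above (not a confirmed fix): it assumes a fixed frame rate (30 fps here, adjust to yours) for the duration, leaves decodeTimeStamp invalid (which is acceptable for frames with no B-frame reordering, the usual case for camera or pixel-buffer sources), and uses the Swift 4.2 style CoreMedia signatures. The function name makeSampleBuffer is a hypothetical helper, not part of lf.swift.

```swift
import CoreMedia

// Hypothetical helper: wraps a CVPixelBuffer in a CMSampleBuffer with
// explicit timing. frameDuration defaults to 1/30 s (assumed 30 fps).
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      presentationTime: CMTime,
                      frameDuration: CMTime = CMTime(value: 1, timescale: 30)) -> CMSampleBuffer? {
    // Describe the pixel buffer's format for CoreMedia.
    var formatDesc: CMVideoFormatDescription?
    guard CMVideoFormatDescriptionCreateForImageBuffer(
            allocator: kCFAllocatorDefault,
            imageBuffer: pixelBuffer,
            formatDescriptionOut: &formatDesc) == noErr,
          let formatDescription = formatDesc else {
        return nil
    }

    // Explicit timing: duration = 1/fps; decodeTimeStamp stays invalid,
    // which is fine when frames are presented in decode order.
    var timing = CMSampleTimingInfo(duration: frameDuration,
                                    presentationTimeStamp: presentationTime,
                                    decodeTimeStamp: .invalid)

    var sampleBuffer: CMSampleBuffer?
    guard CMSampleBufferCreateReadyWithImageBuffer(
            allocator: kCFAllocatorDefault,
            imageBuffer: pixelBuffer,
            formatDescription: formatDescription,
            sampleTiming: &timing,
            sampleBufferOut: &sampleBuffer) == noErr else {
        return nil
    }
    return sampleBuffer
}
```

If the sample buffer is created successfully, it could then be appended as before: rtmpStream.appendSampleBuffer(buffer, withType: .video).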


Source: https://habr.com/ru/post/1269234/
