After a detailed review of WWDC 2014 Session 513, I am trying to write an iOS 8.0 application that decodes and displays a single real-time H.264 stream. First, I successfully create an H.264 parameter set (format description). When I receive an I-frame with a 4-byte start code, like "0x00 0x00 0x00 0x01 0x65 ...", I put it into a CMBlockBuffer. Then I create a CMSampleBuffer from that previously created CMBlockBuffer. Finally, I enqueue the CMSampleBuffer on an AVSampleBufferDisplayLayer. Every call succeeds (I checked the return values), yet the AVSampleBufferDisplayLayer never shows any video image. Since these APIs are quite new, I could not find anyone who has run into and solved this problem.
The key code is given below; I would really appreciate any help in figuring out why the video image cannot be displayed. Many thanks.
(1) The AVSampleBufferDisplayLayer is initialized. dspLayer is a property of my main view controller.
    @property (nonatomic, strong) AVSampleBufferDisplayLayer *dspLayer;

    if (!_dspLayer) {
        _dspLayer = [[AVSampleBufferDisplayLayer alloc] init];
        [_dspLayer setFrame:CGRectMake(90, 551, 557, 389)];
        _dspLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        _dspLayer.backgroundColor = [UIColor grayColor].CGColor;

        // drive the layer from a control timebase based on the host clock
        CMTimebaseRef tmBase = nil;
        CMTimebaseCreateWithMasterClock(NULL, CMClockGetHostTimeClock(), &tmBase);
        _dspLayer.controlTimebase = tmBase;
        CMTimebaseSetTime(_dspLayer.controlTimebase, kCMTimeZero);
        CMTimebaseSetRate(_dspLayer.controlTimebase, 1.0);

        [self.view.layer addSublayer:_dspLayer];
    }
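In case it helps with diagnosis, one way to catch silent decode failures is to observe the layer's failure notification (available on iOS 8.0); a minimal sketch, where the handler name is illustrative:

    // Sketch: observe decode failures reported by the display layer.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(layerFailedToDecode:)
                                                 name:AVSampleBufferDisplayLayerFailedToDecodeNotification
                                               object:_dspLayer];

    - (void)layerFailedToDecode:(NSNotification *)note
    {
        // The userInfo dictionary carries the underlying NSError.
        NSError *err = note.userInfo[AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey];
        NSLog(@"display layer failed to decode: %@", err);
    }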
(2) In another thread, I receive one H.264 I-frame.

    // create the H.264 parameter set - ok
    CMVideoFormatDescriptionRef formatDesc;
    OSStatus formatCreateResult =
        CMVideoFormatDescriptionCreateFromH264ParameterSets(NULL,
                                                            ppsNum + 1,  // parameter-set count (SPS + PPSs)
                                                            props,
                                                            sizes,
                                                            4,           // NAL unit header length
                                                            &formatDesc);
    NSLog(@"construct h264 param set: %d", (int)formatCreateResult);
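For context, props, sizes, and ppsNum above are filled from the stream; a sketch of their layout (spsData/ppsData and spsLen/ppsLen are hypothetical names, and my understanding is that the payloads must not include the Annex B start codes):

    // Illustrative only: the inputs to CMVideoFormatDescriptionCreateFromH264ParameterSets.
    // spsData/ppsData point to the raw SPS/PPS payloads with start codes stripped.
    const uint8_t *props[2] = { spsData, ppsData };
    const size_t   sizes[2] = { spsLen,  ppsLen  };
    size_t ppsNum = 1;   // one PPS, so the parameter-set count is ppsNum + 1 == 2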
    // build the CMBlockBuffer
    // dataBuf points to the H.264 data, which starts with "0x00 0x00 0x00 0x01 0x65 ..."
    CMBlockBufferRef blockBufferOut = nil;
    CMBlockBufferCreateEmpty(0,
                             0,
                             kCMBlockBufferAlwaysCopyDataFlag,
                             &blockBufferOut);
    CMBlockBufferAppendMemoryBlock(blockBufferOut,
                                   dataBuf,
                                   dataLen,
                                   NULL,   // blockAllocator
                                   NULL,   // customBlockSource
                                   0,      // offsetToData
                                   dataLen,
                                   kCMBlockBufferAlwaysCopyDataFlag);
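As mentioned above, I do check the return values; those checks were trimmed from the snippet and look roughly like this (both calls report success):

    // Sketch of the trimmed return-value checks.
    OSStatus blockStatus = CMBlockBufferCreateEmpty(0, 0,
                                                    kCMBlockBufferAlwaysCopyDataFlag,
                                                    &blockBufferOut);
    NSAssert(blockStatus == kCMBlockBufferNoErr,
             @"CMBlockBufferCreateEmpty failed: %d", (int)blockStatus);

    blockStatus = CMBlockBufferAppendMemoryBlock(blockBufferOut, dataBuf, dataLen,
                                                 NULL, NULL, 0, dataLen,
                                                 kCMBlockBufferAlwaysCopyDataFlag);
    NSAssert(blockStatus == kCMBlockBufferNoErr,
             @"CMBlockBufferAppendMemoryBlock failed: %d", (int)blockStatus);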
    // build the CMSampleBuffer - ok
    size_t sampleSizeArray[1] = {0};
    sampleSizeArray[0] = CMBlockBufferGetDataLength(blockBufferOut);

    // {duration, presentationTimeStamp, decodeTimeStamp}
    CMSampleTimingInfo tmInfos[1] = {
        { CMTimeMake(5, 1), CMTimeMake(5, 1), CMTimeMake(5, 1) }
    };

    CMSampleBufferRef sampBuf = nil;
    formatCreateResult = CMSampleBufferCreate(kCFAllocatorDefault,
                                              blockBufferOut,
                                              YES,    // dataReady
                                              NULL,   // makeDataReadyCallback
                                              NULL,   // makeDataReadyRefcon
                                              formatDesc,
                                              1,      // numSamples
                                              1,      // numSampleTimingEntries
                                              tmInfos,
                                              1,      // numSampleSizeEntries
                                              sampleSizeArray,
                                              &sampBuf);
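Since I enqueue only a single frame and the timestamps above are more or less arbitrary, I have also been wondering whether I should instead mark the sample to be displayed immediately; a sketch using the Core Media attachment key:

    // Sketch: flag the sample buffer for immediate display, bypassing timestamps.
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampBuf, YES);
    CFMutableDictionaryRef dict =
        (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);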
    // put this single frame into the AVSampleBufferDisplayLayer,
    // but I never see any video frame in my view
    if ([self.dspLayer isReadyForMoreMediaData]) {
        [self.dspLayer enqueueSampleBuffer:sampBuf];
    }
    [self.dspLayer setNeedsDisplay];
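After enqueueing, the layer's status and error properties (also new in iOS 8.0) might reveal why nothing renders; a small diagnostic sketch:

    // Diagnostic sketch: query the layer's rendering status after the enqueue.
    if (self.dspLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
        NSLog(@"display layer error: %@", self.dspLayer.error);
    }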