I capture video on iOS and process the resulting YUV frames. The result is shown in the image above: the video usually displays correctly on the phone screen, but my peer receives it like this, with each element repeated and shifted by some amount both horizontally and vertically.
My captured video is 352x288, so yPixelCount = 352 * 288 = 101376 and uvPixelCount = yPixelCount / 4 = 25344 per chroma plane.
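For my own bookkeeping, the 4:2:0 plane sizes work out as follows (each chroma plane is subsampled by 2 in both dimensions):

    // 4:2:0 plane sizes for a 352x288 frame
    // Y  plane: 352 * 288           = 101376 bytes (one byte per pixel)
    // Cb plane: (352/2) * (288/2)   =  25344 bytes
    // Cr plane: (352/2) * (288/2)   =  25344 bytes
    // Total per frame: 101376 * 3/2 = 152064 bytes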
Any hints on how to solve this, or a starting point for understanding how to handle YUV video frames on iOS?
This is how I set the pixel format value and the session preset:

    NSNumber *recorderValue = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
    [videoRecorderSession setSessionPreset:AVCaptureSessionPreset352x288];
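The snippet above only creates the value; a minimal sketch of how I apply it, assuming an AVCaptureVideoDataOutput named videoDataOutput (that name is mine, it is not shown in the code above):

    // Sketch: request the bi-planar 4:2:0 format from the data output.
    // videoDataOutput is assumed to be the session's AVCaptureVideoDataOutput.
    videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject:recorderValue
                                                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];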
And this is the captureOutput delegate method:
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (CMSampleBufferIsValid(sampleBuffer) &&
            CMSampleBufferDataIsReady(sampleBuffer) &&
            ([self isQueueStopped] == FALSE)) {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            CVPixelBufferLockBaseAddress(imageBuffer, 0);

            UInt8 *baseAddress[3] = {NULL, NULL, NULL};

            // Plane 0: luma (Y)
            uint8_t *yPlaneAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
            UInt32 yPixelCount = CVPixelBufferGetWidthOfPlane(imageBuffer, 0) *
                                 CVPixelBufferGetHeightOfPlane(imageBuffer, 0);

            // Plane 1: chroma (CbCr in a bi-planar buffer)
            uint8_t *uvPlaneAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1);
            UInt32 uvPixelCount = CVPixelBufferGetWidthOfPlane(imageBuffer, 1) *
                                  CVPixelBufferGetHeightOfPlane(imageBuffer, 1);

            // Copy the planes into my own buffers, treating the chroma data
            // as two separate contiguous U and V planes.
            memcpy(uPointer, uvPlaneAddress, uvPixelCount);
            memcpy(vPointer, uvPlaneAddress + uvPixelCount, uvPixelCount);
            memcpy(yPointer, yPlaneAddress, yPixelCount);

            baseAddress[0] = (UInt8 *)yPointer;
            baseAddress[1] = (UInt8 *)uPointer;
            baseAddress[2] = (UInt8 *)vPointer;

            CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        }
    }
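For comparison, here is a sketch of what I suspect a correct copy might look like. It respects each plane's bytes-per-row (the stride can be wider than the visible width, which could explain the horizontal shift), and it de-interleaves the bi-planar CbCr plane rather than treating it as two separate contiguous U and V planes. yPointer, uPointer, and vPointer are my preallocated buffers, as above:

    // Sketch: stride-aware copy that also de-interleaves the CbCr plane.
    // Assumes yPointer holds yPixelCount bytes and uPointer/vPointer each
    // hold uvPixelCount bytes, as in my code above.
    size_t yWidth  = CVPixelBufferGetWidthOfPlane(imageBuffer, 0);
    size_t yHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);
    size_t yStride = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
    for (size_t row = 0; row < yHeight; row++) {
        // Copy row by row because the stride may exceed the width.
        memcpy(yPointer + row * yWidth, yPlaneAddress + row * yStride, yWidth);
    }

    size_t uvWidth  = CVPixelBufferGetWidthOfPlane(imageBuffer, 1);  // CbCr pairs per row
    size_t uvHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, 1);
    size_t uvStride = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 1);
    for (size_t row = 0; row < uvHeight; row++) {
        uint8_t *rowStart = uvPlaneAddress + row * uvStride;
        for (size_t col = 0; col < uvWidth; col++) {
            // In a bi-planar buffer the second plane interleaves chroma:
            // Cb0 Cr0 Cb1 Cr1 ... so split the pairs into separate planes.
            uPointer[row * uvWidth + col] = rowStart[2 * col];      // Cb (U)
            vPointer[row * uvWidth + col] = rowStart[2 * col + 1];  // Cr (V)
        }
    }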
Is there something wrong with the above code?