What is the best / fastest way to convert CMSampleBufferRef to OpenCV IplImage?

I am writing an iPhone application that does some real-time image processing with OpenCV. What is the best way to convert the CMSampleBufferRef camera frames (which I receive through AVCaptureVideoDataOutputSampleBufferDelegate in AVFoundation) into the IplImage format that OpenCV understands? The conversion has to be fast enough to run in real time.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
    fromConnection:(AVCaptureConnection *)connection
{
  NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

  // Convert CMSampleBufferRef into IplImage
  IplImage *openCVImage = ???(sampleBuffer);

  // Do OpenCV computations realtime
  // ...

  [pool release];
} 

Thanks in advance.

+3
2 answers

This sample code is based on Apple's example of how to manage the CMSampleBuffer's pixel data pointer:

- (IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    IplImage *iplimage = 0;
    if (sampleBuffer) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get information about the image in the buffer
        uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer);
        size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer);

        // create IplImage
        if (bufferBaseAddress) {
            iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4);
            iplimage->imageData = (char*)bufferBaseAddress;
        }

        // Unlock the pixel buffer base address
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
    else
        DLog(@"No sampleBuffer!!");

    return iplimage;
}

You need to create a 4-channel IplImage because the camera frames arrive in BGRA, which assumes the video data output has been configured for the kCVPixelFormatType_32BGRA pixel format.
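For completeness, here is a minimal sketch (not part of the original answer) of how the AVCaptureVideoDataOutput can be configured so that the delegate actually receives BGRA frames; session is assumed to be your already-configured AVCaptureSession and self the sample buffer delegate:

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// Ask AVFoundation for BGRA frames so the 4-channel IplImage above matches the data
videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Drop late frames instead of queueing them, to keep the processing real-time
videoOutput.alwaysDiscardsLateVideoFrames = YES;

// Deliver sample buffers to the delegate on a serial background queue
dispatch_queue_t queue = dispatch_queue_create("camera_queue", NULL);
[videoOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}
[videoOutput release];

If the output is left at its default pixel format, the frames may arrive in a planar YUV format instead, so setting the format explicitly is the safe option.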

+12

"iplimage- > imageData = (char *) bufferBaseAddress;" .

"memcpy (iplimage- > imageData, (char *) bufferBaseAddress, iplimage- > imageSize);

:

-(IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
  IplImage *iplimage = 0;

  if (sampleBuffer) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get information about the image in the buffer
    uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer);
    size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer);

    // create IplImage
    if (bufferBaseAddress) {
        iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4);

        //iplimage->imageData = (char*)bufferBaseAddress; 
        memcpy(iplimage->imageData, (char*)bufferBaseAddress, iplimage->imageSize);
    }

    // Unlock the pixel buffer base address
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
  }
  else
    DLog(@"No sampleBuffer!!");

  return iplimage;
}
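One caveat neither answer mentions: memcpy with iplimage->imageSize assumes the pixel buffer has no padding at the end of each row. CVPixelBufferGetBytesPerRow(imageBuffer) can be larger than bufferWidth * 4, and in that case a row-by-row copy is safer. A minimal sketch that could replace the single memcpy call above, using only variables already defined in the method:

size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

// Copy one row at a time so any padding bytes at the end of each buffer row are skipped
for (size_t row = 0; row < bufferHeight; row++) {
    memcpy(iplimage->imageData + row * iplimage->widthStep,
           bufferBaseAddress + row * bytesPerRow,
           bufferWidth * 4);
}

Also, because the image is allocated with cvCreateImage, the caller owns it and should release it with cvReleaseImage(&openCVImage) once the per-frame OpenCV work is done; otherwise the app leaks one frame-sized image per callback.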

+2

Source: https://habr.com/ru/post/1795925/

