2011-03-03 64 views

I'm writing an iPhone app that uses OpenCV for a kind of real-time image detection. What is the best/fastest way to convert the CMSampleBufferRef image from the camera (I'm using AVFoundation's AVCaptureVideoDataOutputSampleBufferDelegate) into an IplImage that OpenCV understands? The conversion needs to be fast enough to run in real time.

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
    fromConnection:(AVCaptureConnection *)connection 
{ 
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init]; 

    // Convert CMSampleBufferRef into IplImage 
    IplImage *openCVImage = ???(sampleBuffer); 

    // Do OpenCV computations realtime 
    // ... 

    [pool release]; 
} 

在此先感謝。

Answers


This sample code is based on Apple's sample for managing the CMSampleBuffer's pointer:

- (IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer { 
    IplImage *iplimage = 0; 
    if (sampleBuffer) { 
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
        CVPixelBufferLockBaseAddress(imageBuffer, 0); 

        // get information about the image in the buffer 
        uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0); 
        size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer); 
        size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer); 

        // create IplImage 
        if (bufferBaseAddress) { 
            iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4); 
            iplimage->imageData = (char *)bufferBaseAddress; 
        } 

        // unlock the pixel buffer 
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0); 
    } 
    else 
        DLog(@"No sampleBuffer!!"); 

    return iplimage; 
} 

You need to create a 4-channel IplImage because the phone's camera delivers its buffers in BGRA.

In my experience, this conversion is fast enough to do in a real-time application, but of course anything you add on top of it will cost time, especially with OpenCV.
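One caveat not covered in the answers above: CVPixelBufferGetBytesPerRow can report more bytes per row than width * 4, because Core Video may pad rows for alignment, and cvCreateImage likewise aligns its own rows via widthStep. When the two strides differ, one bulk memcpy of width * height * 4 bytes shears the image, and a row-by-row copy is needed instead. A plain-C sketch of such a copy, with synthetic buffers standing in for the pixel buffer and IplImage:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Copy a width x height image of 4-byte pixels between two buffers whose
 * row strides (bytes per row) may differ because of alignment padding. */
void copy_rows(uint8_t *dst, size_t dst_stride,
               const uint8_t *src, size_t src_stride,
               size_t width, size_t height) {
    size_t row_bytes = width * 4;  /* meaningful bytes in each row */
    for (size_t y = 0; y < height; y++) {
        memcpy(dst + y * dst_stride, src + y * src_stride, row_bytes);
    }
}
```

On the iPhone formats people report here (e.g. 640x480 BGRA) the strides often match and the bulk memcpy works, but checking CVPixelBufferGetBytesPerRow against iplimage->widthStep is the safe route.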


This works well; the conversion takes 0.00020 s ± 0.00004 for a 640x480 image on my iPhone 4 – cduck 2011-03-04 04:06:55


@cduck: Yes, I achieved 30 fps with this solution even on a 3GS. – 2011-03-04 08:12:12


"iplimage->imageData = (char*)bufferBaseAddress;" causes a memory leak: the pixel buffer that cvCreateImage allocated is orphaned, and imageData is left pointing at memory that is no longer valid once the pixel buffer is unlocked. It should be "memcpy(iplimage->imageData, (char*)bufferBaseAddress, iplimage->imageSize);"

So the complete code is:

- (IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer { 
    IplImage *iplimage = 0; 

    if (sampleBuffer) { 
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
        CVPixelBufferLockBaseAddress(imageBuffer, 0); 

        // get information about the image in the buffer 
        uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0); 
        size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer); 
        size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer); 

        // create IplImage and copy the pixel data into it 
        if (bufferBaseAddress) { 
            iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4); 

            //iplimage->imageData = (char*)bufferBaseAddress; 
            memcpy(iplimage->imageData, (char *)bufferBaseAddress, iplimage->imageSize); 
        } 

        // unlock the pixel buffer 
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0); 
    } 
    else 
        DLog(@"No sampleBuffer!!"); 

    return iplimage; 
}
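With the memcpy version the IplImage owns its pixel data, so the caller must release every frame's image with cvReleaseImage(&image) once done with it, or the app now leaks a full image per frame instead. A small plain-C sketch of that ownership pattern, with hypothetical malloc/free stand-ins (OpenCV itself is not assumed here):

```c
#include <stddef.h>
#include <stdint.h>
#include <stdlib.h>

/* Stand-ins for the cvCreateImage/cvReleaseImage ownership pattern:
 * "create" allocates pixel storage that the caller must later release. */
typedef struct {
    size_t image_size;   /* total bytes of pixel data */
    uint8_t *image_data; /* caller-owned pixel storage */
} FakeImage;

FakeImage *fake_create_image(size_t width, size_t height) {
    FakeImage *img = malloc(sizeof *img);
    img->image_size = width * height * 4;  /* 4 channels, 8 bits each */
    img->image_data = malloc(img->image_size);
    return img;
}

/* Like cvReleaseImage, frees the storage and nulls the caller's pointer. */
void fake_release_image(FakeImage **img) {
    if (img && *img) {
        free((*img)->image_data);
        free(*img);
        *img = NULL;
    }
}
```

In the delegate above, that means pairing each createIplImageFromSampleBuffer: call with a cvReleaseImage before the autorelease pool is drained.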
