
I am working with AVCaptureVideoDataOutput and want to convert a CMSampleBufferRef to a UIImage. Many answers are essentially the same, e.g. UIImage created from CMSampleBufferRef not displayed in UIImageView? and AVCaptureSession with multiple previews.

It works fine if I set the VideoDataOutput color space to BGRA (credit to this answer: CGBitmapContextCreateImage error):

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; 
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
[dataOutput setVideoSettings:videoSettings]; 

Without the videoSettings above, I receive the following errors:

CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst. 
<Error>: CGBitmapContextCreateImage: invalid context 0x0 

Working with BGRA is not a good option, because there is a conversion overhead from YUV (the default AVCaptureSession color space) to BGRA, as stated by Brad and Codo in How to get the Y component from CMSampleBuffer resulted from the AVCaptureSession?

So, is there a way to convert a CMSampleBufferRef to a UIImage while staying in the YUV color space?


Any solution for this, converting a CMSampleBufferRef to a UIImage? –


@AdarshVC Why don't you upvote the question to make it more visible? – onmyway133


Check this question: http://stackoverflow.com/questions/3305862/uiimage-created-from-cmsamplebufferref-not-displayed-in-uiimageview –

Answer


After a lot of research, and after reading Apple's documentation and Wikipedia, I figured out the answer, and it works perfectly for me. So for future readers, I am sharing the code that converts the sample buffer to a UIImage when the video pixel type is set to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange:
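For reference, requesting that pixel format on the data output looks just like the BGRA snippet from the question. This is only a minimal sketch and assumes dataOutput is your existing AVCaptureVideoDataOutput:

    // Ask the data output for bi-planar full-range YUV frames, 
    // which is the format the method below expects 
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]; 
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
    [dataOutput setVideoSettings:videoSettings]; 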

// Create a UIImage from sample buffer data 
// Works only if pixel format is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange 
-(UIImage *) imageFromSamplePlanerPixelBuffer:(CMSampleBufferRef) sampleBuffer{ 

    @autoreleasepool { 
     // Get a CMSampleBuffer's Core Video image buffer for the media data 
     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
     // Lock the base address of the pixel buffer 
     CVPixelBufferLockBaseAddress(imageBuffer, 0); 

     // Get the base address of the Y (luma) plane 
     void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0); 

     // Get the number of bytes per row for the Y plane 
     size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0); 
     // Get the pixel buffer width and height 
     size_t width = CVPixelBufferGetWidth(imageBuffer); 
     size_t height = CVPixelBufferGetHeight(imageBuffer); 

     // Create a device-dependent gray color space 
     CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray(); 

     // Create a bitmap graphics context with the sample buffer data 
     CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
                bytesPerRow, colorSpace, kCGImageAlphaNone); 
     // Create a Quartz image from the pixel data in the bitmap graphics context 
     CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
     // Unlock the pixel buffer 
     CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

     // Free up the context and color space 
     CGContextRelease(context); 
     CGColorSpaceRelease(colorSpace); 

     // Create an image object from the Quartz image 
     UIImage *image = [UIImage imageWithCGImage:quartzImage]; 

     // Release the Quartz image 
     CGImageRelease(quartzImage); 

     return (image); 
    } 
} 
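
For completeness, here is a rough sketch of how the method can be called from the sample buffer delegate. The dispatch to the main queue and the self.imageView outlet are assumptions for illustration, not part of the original answer:

    // Sketch: AVCaptureVideoDataOutputSampleBufferDelegate callback using the helper above. 
    // self.imageView is a hypothetical UIImageView used only to display the result. 
    - (void)captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
           fromConnection:(AVCaptureConnection *)connection 
    { 
        UIImage *image = [self imageFromSamplePlanerPixelBuffer:sampleBuffer]; 
        // UIKit must only be touched on the main thread 
        dispatch_async(dispatch_get_main_queue(), ^{ 
            self.imageView.image = image; 
        }); 
    } 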

Note that the output image will be grayscale. – Bluewings


What about other colors? – JULIIncognito


@JULIIncognito Change the color space to RGB. – Bluewings