I want to convert a YUV 420SP image (captured directly from the camera, in YCbCr format) to JPG in iOS. What I found is the CGImageCreate() function https://developer.apple.com/library/mac/documentation/graphicsimaging/reference/CGImage/Reference/reference.html#//apple_ref/doc/uid/TP30000956-CH1g-F17167, which takes a number of parameters, including a byte array containing the image data, and should return a CGImage which, when fed through a UIImage into UIImageJPEGRepresentation(), returns the JPEG data. But that is not what actually happens: the output image data is far from what is required. At least the output is not nil.

As input to CGImageCreate(), I set bits per component to 4 and bits per pixel to 12, with default values for the rest.
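
For concreteness, the call as described would look roughly like this (yuvBytes, yuvLength and the 640x480 dimensions are placeholders for the actual data; this reproduces the failure, since none of Quartz's supported pixel formats is planar YUV):

size_t width = 640, height = 480;  // placeholder dimensions
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, yuvBytes, yuvLength, NULL);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRef image = CGImageCreate(width, height,
                                 4,               // bits per component, as described above
                                 12,              // bits per pixel, as described above
                                 width * 12 / 8,  // bytes per row
                                 colorSpace, kCGBitmapByteOrderDefault,
                                 provider, NULL, NO, kCGRenderingIntentDefault);
NSData *jpeg = UIImageJPEGRepresentation([UIImage imageWithCGImage:image], 0.9);
CGImageRelease(image);
CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);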

Can it really convert YUV YCbCr images, and not only RGB? If so, then I guess I am doing something wrong in the input values to the CGImageCreate function.


Looking for the same thing. Any luck with this? –


No luck. What I found is this: the default encoder, i.e. 'CGImageCreate()', can only convert images in RGBA format (interleaved; Quartz does not support planar) to JPEG. I read it somewhere in the documentation, which has a table of all the possible values of bits per pixel, bits per component, and so on. All of them (probably, since I may not remember correctly) correspond to RGBA. None corresponds to YUV, I am sure of that. – neeraj

Answers


From what I can see here, the CGColorSpaceRef colorspace parameter can only refer to RGB, CMYK or grayscale.

So I think you first need to convert your YCbCr420 image to RGB, e.g. using the IPP function YCbCr420toRGB (doc). Or you can write your own conversion routine; it is not that hard (a sketch follows below).
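
If you roll your own, here is a minimal sketch of such a routine, assuming the bi-planar layout that kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange delivers. The plane pointers and strides would come from CVPixelBufferGetBaseAddressOfPlane() and CVPixelBufferGetBytesPerRowOfPlane(), the fixed-point coefficients are the standard BT.601 video-range ones, and the function and parameter names are illustrative:

#include <stddef.h>
#include <stdint.h>

static inline uint8_t clamp8(int v) { return (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v)); }

// Convert one bi-planar YCbCr 4:2:0 frame (full-size Y plane plus a
// half-height plane of interleaved Cb,Cr pairs) to interleaved BGRA.
static void YCbCr420ToBGRA(const uint8_t *yPlane, size_t yStride,
                           const uint8_t *cbcrPlane, size_t cbcrStride,
                           uint8_t *bgra, size_t bgraStride,
                           size_t width, size_t height)
{
    for (size_t row = 0; row < height; row++)
    {
        const uint8_t *yRow = yPlane + row * yStride;
        const uint8_t *cRow = cbcrPlane + (row / 2) * cbcrStride; // chroma is subsampled 2x vertically
        uint8_t *out = bgra + row * bgraStride;
        for (size_t col = 0; col < width; col++)
        {
            int y  = yRow[col] - 16;             // video range: luma is offset by 16
            int cb = cRow[col & ~1] - 128;       // one Cb,Cr pair serves a 2x2 pixel block
            int cr = cRow[(col & ~1) + 1] - 128;
            int r = (298 * y            + 409 * cr + 128) >> 8; // BT.601, 8-bit fixed point
            int g = (298 * y - 100 * cb - 208 * cr + 128) >> 8;
            int b = (298 * y + 516 * cb            + 128) >> 8;
            out[col * 4 + 0] = clamp8(b);
            out[col * 4 + 1] = clamp8(g);
            out[col * 4 + 2] = clamp8(r);
            out[col * 4 + 3] = 255;              // opaque alpha
        }
    }
}

The resulting BGRA buffer can then go straight into CGBitmapContextCreate(), as in the answer below.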


Below is code for converting a sample buffer returned via the captureOutput:didOutputSampleBuffer:fromConnection: method of an AVCaptureVideoDataOutput:

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Requires #import <ImageIO/ImageIO.h> and
    // #import <MobileCoreServices/MobileCoreServices.h> (for kUTTypeJPEG).
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    uint8_t *rawImageBytes = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer); // 2560 == (640 * 4)
    size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer); // 480
    size_t dataSize = CVPixelBufferGetDataSize(pixelBuffer); // 1_228_808 = (2560 * 480) + 8
    CGColorSpaceRef defaultRGBColorSpace = CGColorSpaceCreateDeviceRGB();

    // Wrap the raw BGRA bytes in a bitmap context and snapshot it as a CGImage.
    CGContextRef context = CGBitmapContextCreate(rawImageBytes, bufferWidth, bufferHeight, 8, bytesPerRow, defaultRGBColorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef image = CGBitmapContextCreateImage(context);

    // Encode the CGImage as JPEG through an ImageIO destination.
    CFMutableDataRef imageData = CFDataCreateMutable(NULL, 0);
    CGImageDestinationRef destination = CGImageDestinationCreateWithData(imageData, kUTTypeJPEG, 1, NULL);
    NSDictionary *properties = @{(__bridge id)kCGImageDestinationLossyCompressionQuality: @(0.25),
                                 (__bridge id)kCGImageDestinationBackgroundColor: (__bridge id)CLEAR_COLOR, // CLEAR_COLOR is a CGColorRef defined elsewhere in my project
                                 (__bridge id)kCGImageDestinationOptimizeColorForSharing: @(YES)
                                 };
    CGImageDestinationAddImage(destination, image, (__bridge CFDictionaryRef)properties);

    if (!CGImageDestinationFinalize(destination))
    {
        CFRelease(imageData);
        imageData = NULL;
    }
    CFRelease(destination);

    // imageData now holds the JPEG bytes; hand it off as needed, then
    // release it so one buffer is not leaked per frame.
    if (imageData) CFRelease(imageData);

    UIImage *frame = [[UIImage alloc] initWithCGImage:image];
    CGContextRelease(context);
    CGImageRelease(image);
    CGColorSpaceRelease(defaultRGBColorSpace);

    renderFrame([[self.childViewControllers.lastObject view] viewWithTag:1].layer, frame);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

There are three options here for the pixel format type:

kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange 
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange 
kCVPixelFormatType_32BGRA 

If _captureOutput is the pointer reference to my AVCaptureVideoDataOutput instance, this is how you set the pixel format type:

[_captureOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];
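
For completeness, a minimal sketch of the surrounding capture setup under which the delegate method above gets called (the session, device, and queue names are illustrative, and error handling is elided):

AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) [session addInput:input];

_captureOutput = [[AVCaptureVideoDataOutput alloc] init];
[_captureOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];
// Deliver sample buffers to captureOutput:didOutputSampleBuffer:fromConnection: on a serial queue.
[_captureOutput setSampleBufferDelegate:self queue:dispatch_queue_create("video.capture", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:_captureOutput]) [session addOutput:_captureOutput];

[session startRunning];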