I am working with AVCaptureVideoDataOutput and want to convert the CMSampleBufferRef it delivers into a UIImage. Many answers take the same approach, for example UIImage created from CMSampleBufferRef not displayed in UIImageView? and AVCaptureSession with multiple previews.
It works fine if I set the VideoDataOutput color space to BGRA (credit to this answer: CGBitmapContextCreateImage error):
// Ask the video data output for BGRA frames instead of the default YUV format.
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[dataOutput setVideoSettings:videoSettings];
Without the videoSettings above, I receive the following errors:
CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
<Error>: CGBitmapContextCreateImage: invalid context 0x0
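For completeness, the conversion itself follows the answers linked above; roughly like this (a sketch, the method name and wiring are my own):

// Minimal sketch of the BGRA conversion, adapted from the linked answers.
// Assumes the sample buffer arrives in kCVPixelFormatType_32BGRA.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // BGRA maps onto a little-endian, premultiplied-first RGB bitmap context.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}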
Working in BGRA is not a good choice, though, because of the conversion overhead from YUV (the default AVCaptureSession color space) to BGRA, as stated by Brad and Codo in How to get the Y component from CMSampleBuffer resulted from the AVCaptureSession?
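For context, the answers referenced there read the luma (Y) plane of the bi-planar YUV buffer directly. As an illustration only, a rough sketch of mine (not taken from those answers) of wrapping that plane into a grayscale UIImage, assuming kCVPixelFormatType_420YpCbCr8BiPlanarFullRange output:

// Sketch: build a grayscale UIImage from the Y plane of a bi-planar (420f/420v)
// pixel buffer. This avoids the YUV -> BGRA conversion, but only yields luminance.
- (UIImage *)grayscaleImageFromYPlaneOfSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Plane 0 is the Y (luma) plane in the bi-planar formats.
    void *yPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

    CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(yPlane, width, height, 8,
                                                 bytesPerRow, graySpace,
                                                 kCGImageAlphaNone);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(graySpace);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}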
So, is there a way to convert a CMSampleBufferRef to a UIImage while staying in the YUV color space?
Any solution for converting a CMSampleBufferRef to a UIImage? –
@AdarshVC Why don't you upvote the question to make it more visible? – onmyway133
Check this question: http://stackoverflow.com/questions/3305862/uiimage-created-from-cmsamplebufferref-not-displayed-in-uiimageview –