I'm having some trouble getting a UIImage from a CVPixelBuffer. This is what I'm trying: how do I convert a CVPixelBuffer to a UIImage?
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);
CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:(NSDictionary *)attachments];
if (attachments)
    CFRelease(attachments);

size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
if (width && height) { // test to make sure we have valid dimensions
    UIImage *image = [[UIImage alloc] initWithCIImage:ciImage];

    UIImageView *lv = [[UIImageView alloc] initWithFrame:self.view.frame];
    lv.contentMode = UIViewContentModeScaleAspectFill;
    self.lockedView = lv;
    [lv release];

    self.lockedView.image = image;
    [image release];
}
[ciImage release];
height and width are both correctly set to the camera's resolution. image gets created, but it appears to be black (or maybe transparent?). I can't quite figure out where the problem is. Any ideas would be appreciated.
Do you definitely need the CIImage in between, e.g. because you're going to apply some intermediate CIFilter, or could you just go CGBitmapContextCreate -> UIImage? – Tommy
For now, I just want to display it in a view and see what I'm working with. Down the road, I'd like to play with the pixels. – mahboudz
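A commonly suggested workaround for the black/transparent result (not verified against the asker's exact setup) is to render the CIImage through a CIContext into a CGImage before wrapping it in a UIImage, since a UIImage built directly from a CIImage has no bitmap backing until something actually draws it. A minimal MRC-style sketch, with a hypothetical helper name:

```objectivec
// Hypothetical helper illustrating the CIContext route: force an actual
// render into a CGImage, then wrap that in a UIImage.
- (UIImage *)imageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer
{
    CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:nil];
    CIContext *context = [CIContext contextWithOptions:nil];

    CGRect extent = CGRectMake(0, 0,
                               CVPixelBufferGetWidth(pixelBuffer),
                               CVPixelBufferGetHeight(pixelBuffer));

    // createCGImage:fromRect: performs the render and returns a +1 CGImage.
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:extent];
    [ciImage release];

    UIImage *image = [[[UIImage alloc] initWithCGImage:cgImage] autorelease];
    CGImageRelease(cgImage);
    return image;
}
```

The resulting UIImage is backed by a real bitmap, so assigning it to a UIImageView displays it normally; this also matches the CGImage-based route suggested in the comment above.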