I have a program that watches the camera input in real time and reads the color value of the middle pixel. I use a captureOutput: method to grab the CMSampleBuffer from an AVCaptureSession output (which happens to be readable as a CVPixelBuffer), and then I grab the pixel's RGB values with the following code:
// Get a CMSampleBuffer's Core Video image buffer for the media data
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// Lock the base address of the pixel buffer
CVPixelBufferLockBaseAddress(imageBuffer, 0);
// Get the number of bytes per row for the pixel buffer
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
// Get the pixel buffer width and height
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
unsigned char* pixel = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);
NSLog(@"Middle pixel: %hhu", pixel[((width*height)*4)/2]);
int red = pixel[(((width*height)*4)/2)+2];
int green = pixel[(((width*height)*4)/2)+1];
int blue = pixel[((width*height)*4)/2];
int alpha = 1;
UIColor *color = [UIColor colorWithRed:(red/255.0f) green:(green/255.0f) blue:(blue/255.0f) alpha:(alpha)];
// Unlock the base address once we are done reading pixels
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
I thought the formula ((width*height)*4)/2 would get me the middle pixel, but it gives me the top middle pixel of the image. I'd like to know what formula I need to use to access the pixel in the middle of the screen. I'm a bit stuck because I don't really know the internal structure of these pixel buffers.
In the future, I want to grab the 4 middle pixels and average them to get a more accurate color reading, but for now I just want to understand how this stuff works.
Yes, the default orientation of both cameras on iOS devices is landscape, so if you're doing any portrait-mode work, you'll need to deal with a rotated frame. – 2012-04-15 21:07:43
@Codo Do you have any suggestions? http://stackoverflow.com/questions/37611593/how-to-create-cvpixelbufferref-with-yuv420i420-data – 2016-06-07 03:47:23
Curious: why is the width part multiplied by 4? – OutOnAWeekend 2017-02-15 15:12:41