I am trying to get the pixel color from the image displayed by the webcam, so that I can watch how that color changes over time. My current solution works and gives the correct answer, but it uses a lot of CPU, and I am not 100% sure whether I am doing this correctly or whether I can cut out some of the steps. Getting the pixel color from the webcam:
- (IBAction)addFrame:(id)sender
{
    // Get the most recent frame.
    // This must be done in a @synchronized block because the delegate
    // method that sets the most recent frame is not called on the main thread.
    CVImageBufferRef imageBuffer;
    @synchronized (self) {
        imageBuffer = CVBufferRetain(mCurrentImageBuffer);
    }
    if (imageBuffer) {
        // Create an NSImage and add it to the movie.
        // I think I can remove some steps here, but not sure where.
        NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
        NSSize n = {320, 160};
        //NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]] autorelease];
        NSImage *image = [[[NSImage alloc] initWithSize:n] autorelease];
        [image addRepresentation:imageRep];
        CVBufferRelease(imageBuffer);

        NSBitmapImageRep *raw_img = [NSBitmapImageRep imageRepWithData:[image TIFFRepresentation]];
        NSLog(@"image width is %f", [image size].width);
        NSColor *color = [raw_img colorAtX:1279 y:120];
        float colourValue = [color greenComponent] + [color redComponent] + [color blueComponent];
        [graphView setXY:10 andY:200 * colourValue / 3];
        NSLog(@"%0.3f", colourValue);
    }
}
Any help is appreciated, and I'm happy to try other ideas. Thank you.
Thanks phil, I have used that guide and tried to get the data from the buffer. – 2011-02-13 22:45:19