2012-02-14 75 views

How to read RGB pixel data on the iPhone

I'd like to know how I can scan an image on the iPhone and read the RGB value of each pixel, so that I can ultimately compute the average RGB of the whole image. If anyone could point me in the right direction, it would be much appreciated. I'm new to image analysis and not sure where to start, or whether anything like this is included in the iOS 5 API.

Answers


Getting the CGImage from a UIImage gives you access to this data:

CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
const UInt8 *data = CFDataGetBytePtr(pixelData);

// Assumes 4 bytes per pixel in RGBA order with no row padding
// (typical for a PNG); check CGImageGetBitmapInfo and
// CGImageGetBytesPerRow if your image may differ.
int pixelInfo = (((int)image.size.width * y) + x) * 4;

UInt8 red   = data[pixelInfo];
UInt8 green = data[pixelInfo + 1];
UInt8 blue  = data[pixelInfo + 2];
UInt8 alpha = data[pixelInfo + 3];
CFRelease(pixelData);

More here: Getting pixel data from UIImageView -- works on simulator, not device

And here: Get Pixel color of UIImage

+1

You again? lol – 2012-02-14 03:31:22

+0

Yep, does that make me your SO nemesis? :) – danielbeard 2012-02-14 03:35:35

+0

Heh... I'm going to bed now. – 2012-02-14 03:36:35


Just pasting what I use — I'm detecting the color at a touch point.

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.view.hidden == YES) {
        // Color wheel is hidden, so don't handle this as a color wheel event.
        [[self nextResponder] touchesEnded:touches withEvent:event];
        return;
    }

    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view]; // where the image was tapped
    UIColor *lastColor = [self getPixelColorAtLocation:point];
    NSLog(@"color %@", lastColor);

    // Show the picked color in a small round swatch.
    // (imageView and stillImageFilter are ivars of this class.)
    UIImageView *lbl = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
    lbl.layer.cornerRadius = 50;
    [imageView addSubview:lbl];
    lbl.backgroundColor = lastColor;
    lbl.center = CGPointMake(stillImageFilter.center.x * 320, (stillImageFilter.center.y * 320) - 125);
    NSLog(@"stillImageCenter = %f,%f", stillImageFilter.center.x, stillImageFilter.center.y);
}

- (UIColor *)getPixelColorAtLocation:(CGPoint)point {
    UIColor *color = nil;
    CGImageRef inImage = imageView.image.CGImage;

    // Draw the image into a bitmap context we control, so the byte
    // layout is known: ARGB, 8 bits per component, premultiplied alpha.
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0, 0}, {w, h}};

    CGContextDrawImage(cgctx, rect, inImage);

    unsigned char *data = CGBitmapContextGetData(cgctx);
    if (data != NULL) {
        // 4 bytes per pixel, rows of w pixels, ARGB order.
        int offset = 4 * ((w * round(point.y)) + round(point.x));
        int alpha = data[offset];
        int red   = data[offset + 1];
        int green = data[offset + 2];
        int blue  = data[offset + 3];
        NSLog(@"offset: %i colors: RGBA %i %i %i %i", offset, red, green, blue, alpha);
        color = [UIColor colorWithRed:(red / 255.0f) green:(green / 255.0f) blue:(blue / 255.0f) alpha:(alpha / 255.0f)];
    }

    CGContextRelease(cgctx);
    // The bitmap buffer was malloc'd in createARGBBitmapContextFromImage.
    if (data) { free(data); }

    return color;
}

- (CGContextRef)createARGBBitmapContextFromImage:(CGImageRef)inImage {
    CGContextRef context = NULL;
    CGColorSpaceRef colorSpace;
    void *bitmapData;
    int bitmapByteCount;
    int bitmapBytesPerRow;

    size_t pixelsWide = CGImageGetWidth(inImage);
    size_t pixelsHigh = CGImageGetHeight(inImage);

    // 4 bytes per pixel: alpha + RGB, no row padding.
    bitmapBytesPerRow = (pixelsWide * 4);
    bitmapByteCount = (bitmapBytesPerRow * pixelsHigh);

    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
    {
        fprintf(stderr, "Error allocating color space\n");
        return NULL;
    }

    // The caller is responsible for free()ing this buffer
    // after releasing the context.
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL)
    {
        fprintf(stderr, "Memory not allocated!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }

    context = CGBitmapContextCreate(bitmapData,
                                    pixelsWide,
                                    pixelsHigh,
                                    8,  // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaPremultipliedFirst);
    if (context == NULL)
    {
        free(bitmapData);
        fprintf(stderr, "Context not created!");
    }

    CGColorSpaceRelease(colorSpace);

    return context;
}

+0

Since the code uses createARGBBitmapContextFromImage, you may want to look at http://stackoverflow.com/questions/28759902/potential-leak-of-object-stored-in-context/28761796#28761796 – GlennRay 2015-11-21 02:08:24