I am an iOS beginner and I am following How can I manipulate the pixel values in a CGImageRef in Xcode to learn how to modify CGImages. I changed some of the code so that the middle pixel of the image gets painted red, instead of swapping the blue and red channels of every pixel.
But now I get the error "Bad receiver type 'CGImageRef' (aka 'CGImage *')" on this call:
manipulated = [imageRef colorMiddle];
The message is sent from this method:
- (void)renderColorFrame:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    size_t cols = CVPixelBufferGetWidth(pixelBuffer);
    size_t rows = CVPixelBufferGetHeight(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    unsigned char *ptr = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    NSData *data = [[NSData alloc] initWithBytes:ptr length:rows * cols * 4];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    CGBitmapInfo bitmapInfo;
    bitmapInfo = (CGBitmapInfo)kCGImageAlphaNoneSkipFirst;
    bitmapInfo |= kCGBitmapByteOrder32Little;

    CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data);

    CGImageRef imageRef = CGImageCreate(cols,
                                        rows,
                                        8,
                                        8 * 4,
                                        cols * 4,
                                        colorSpace,
                                        bitmapInfo,
                                        provider,
                                        NULL,
                                        false,
                                        kCGRenderingIntentDefault);

    // process the image ref here!!
    manipulated = [imageRef colorMiddle]; // here

    leftImage = CGImageCreateWithImageInRect(imageRef, CGRectMake(0, 0, self.view.frame.size.width * 6/7, self.view.frame.size.height));
    rightImage = CGImageCreateWithImageInRect(imageRef, CGRectMake(self.view.frame.size.width/7, 0, self.view.frame.size.width * 6/7, self.view.frame.size.height));

    // left image
    _colorImageViewL.image = [[UIImage alloc] initWithCGImage:leftImage /*imageRef*/];
    // right image
    _colorImageViewR.image = [[UIImage alloc] initWithCGImage:rightImage /*imageRef*/];
    // full
    // _colorImageViewFull.image = [[UIImage alloc] initWithCGImage:imageRef];

    CGImageRelease(imageRef);
    CGImageRelease(leftImage);
    CGImageRelease(rightImage);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
}
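For reference, this is roughly the kind of implementation I have in mind for colorMiddle: (a simplified sketch based on the linked question, assuming the same BGRA byte layout as the bitmap info above; it is not my exact code):

- (CGImageRef)colorMiddle:(CGImageRef)image
{
    size_t width  = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);

    // Draw the image into a bitmap context so the pixel bytes become accessible.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

    // With this bitmap info the bytes of each pixel are laid out as B, G, R, X in memory.
    unsigned char *pixels = (unsigned char *)CGBitmapContextGetData(context);
    size_t bytesPerRow = CGBitmapContextGetBytesPerRow(context);
    size_t offset = (height / 2) * bytesPerRow + (width / 2) * 4;
    pixels[offset]     = 0;    // blue
    pixels[offset + 1] = 0;    // green
    pixels[offset + 2] = 255;  // red

    CGImageRef result = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return result; // caller is responsible for CGImageRelease()
}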
I don't understand why I get this error here, because the message should be sent with matching parameters:
@interface ViewController () <AVCaptureVideoDataOutputSampleBufferDelegate> {
    STSensorController *_sensorController;
    AVCaptureSession *_avCaptureSession;
    AVCaptureDevice *_videoDevice;

    UIImageView *_depthImageView;
    //UIImageView *_depthImageView2;
    //UIImageView *_normalsImageView;

    //Left
    UIImageView *_colorImageViewL;
    //right
    UIImageView *_colorImageViewR;
    //Full
    //UIImageView *_colorImageViewFull;

    uint16_t *_linearizeBuffer;
    uint8_t *_coloredDepthBuffer;
    uint8_t *_normalsBuffer;

    STNormalEstimator *_normalsEstimator;
    UILabel *_statusLabel;
    GLKMatrix4 _projection;

    CGImageRef leftImage;
    CGImageRef rightImage;
    CGImageRef manipulated;

    AppStatus _appStatus;
}

- (BOOL)connectAndStartStreaming;
- (void)renderDepthFrame:(STDepthFrame *)depthFrame;
- (void)renderNormalsFrame:(STDepthFrame *)normalsFrame;
- (void)renderColorFrame:(CMSampleBufferRef)sampleBuffer;
- (void)setupColorCamera;
- (void)startColorCamera;
- (void)stopColorCamera;
- (CGImageRef)colorMiddle:(CGImageRef)image; //here

@end
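To spell out how I read that declaration: since colorMiddle: takes the image as a parameter, a call matching the signature would presumably look like the sketch below rather than what I wrote above, but I am not sure whether that is what the compiler is complaining about:

    // Sketch of a call that matches the declared signature (untested):
    manipulated = [self colorMiddle:imageRef];
    // ... use `manipulated` (the ivar declared above) ...
    CGImageRelease(manipulated); // assuming colorMiddle: returns a created (+1) image, as in the sketch above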
Does anyone know what is causing the error and how to fix it? I just can't figure it out, because every other place where I use CGImageRefs works, and as far as I can tell this should be the right way to do it.
Please excuse it if the question doesn't look great; I still need to learn how to format everything properly.
Thanks in advance!