2010-11-22 88 views

Answers

4

The best way to do this is with an AVCaptureSession object. It's what I'm doing in my free app "Live Effects Cam".

There are several code examples online that can help you implement this. Here is a sample that may help:

- (void) activateCameraFeed 
    { 
    videoSettings = nil; 

#if USE_32BGRA 
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA]; 
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey]; 
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil]; 
#endif 

    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL); 

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init]; 
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES]; 
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue]; 
    [captureVideoOutput setVideoSettings:videoSettings]; 
    [captureVideoOutput setMinFrameDuration:kCMTimeZero]; 

    dispatch_release(videoDataOutputQueue); // AVCaptureVideoDataOutput uses dispatch_retain() & dispatch_release() so we can dispatch_release() our reference now 

    if (useFrontCamera) 
     { 
     currentCameraDeviceIndex = frontCameraDeviceIndex; 
     cameraImageOrientation = UIImageOrientationLeftMirrored; 
     } 
    else 
     { 
     currentCameraDeviceIndex = backCameraDeviceIndex; 
     cameraImageOrientation = UIImageOrientationRight; 
     } 

    selectedCamera = [[AVCaptureDevice devices] objectAtIndex:(NSUInteger)currentCameraDeviceIndex]; 

    captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:selectedCamera error:nil]; 

    captureSession = [[AVCaptureSession alloc] init]; 

    [captureSession beginConfiguration]; 

    [self setCaptureConfiguration]; 

    [captureSession addInput:captureVideoInput]; 
    [captureSession addOutput:captureVideoOutput]; 
    [captureSession commitConfiguration]; 
    [captureSession startRunning]; 
    } 


// AVCaptureVideoDataOutputSampleBufferDelegate 
// AVCaptureAudioDataOutputSampleBufferDelegate 
// 
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
    { 
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init]; 

    if (captureOutput==captureVideoOutput) 
     { 
     [self performImageCaptureFrom:sampleBuffer fromConnection:connection]; 
     } 

    [pool drain]; 
    } 



- (void) performImageCaptureFrom:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
    { 
    CVImageBufferRef imageBuffer; 

    if (CMSampleBufferGetNumSamples(sampleBuffer) != 1) 
     return; 
    if (!CMSampleBufferIsValid(sampleBuffer)) 
     return; 
    if (!CMSampleBufferDataIsReady(sampleBuffer)) 
     return; 

    imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    if (CVPixelBufferGetPixelFormatType(imageBuffer) != kCVPixelFormatType_32BGRA) 
     return; 

    CVPixelBufferLockBaseAddress(imageBuffer,0); 

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 
    size_t bufferSize = bytesPerRow * height; 

    uint8_t *tempAddress = malloc(bufferSize); 
    memcpy(tempAddress, baseAddress, bufferSize); 

    baseAddress = tempAddress; 

    // 
    // Apply effects to the pixels stored in (uint32_t *)baseAddress 
    // 
    // 
    // example: grayScale((uint32_t *)baseAddress, width, height); 
    // example: sepia((uint32_t *)baseAddress, width, height); 
    // 

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = nil; 

    if (cameraDeviceSetting != CameraDeviceSetting640x480)  // not an iPhone4 or iTouch 5th gen 
     newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst); 
    else 
     newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 

    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 
    CGColorSpaceRelease(colorSpace); 
    CGContextRelease(newContext); 

    free(tempAddress); 

    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

    if (newImage == nil) 
     { 
     return; 
     } 

    // To be able to display the CGImageRef newImage in your UI you will need to do it like this 
    // because you are running on a different thread here… 
    // 
    [self performSelectorOnMainThread:@selector(newCameraImageNotification:) withObject:(id)newImage waitUntilDone:YES]; 
    } 
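The `grayScale()` and `sepia()` effects named in the comments above aren't shown in the answer. As a rough illustration only, hypothetical implementations could look like this, assuming the buffer is `kCVPixelFormatType_32BGRA` (bytes in memory order B, G, R, A) as configured in `activateCameraFeed`:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical grayScale(): replaces each pixel's R, G, B with its
   Rec. 601 luma (integer approximation); alpha is left untouched. */
static void grayScale(uint32_t *pixels, size_t width, size_t height)
{
    uint8_t *p = (uint8_t *)pixels;
    for (size_t i = 0; i < width * height; i++, p += 4)
    {
        uint8_t b = p[0], g = p[1], r = p[2];
        uint8_t y = (uint8_t)((299 * r + 587 * g + 114 * b) / 1000);
        p[0] = p[1] = p[2] = y;
    }
}

/* Hypothetical sepia(): classic sepia matrix with clamping. */
static void sepia(uint32_t *pixels, size_t width, size_t height)
{
    uint8_t *p = (uint8_t *)pixels;
    for (size_t i = 0; i < width * height; i++, p += 4)
    {
        int b = p[0], g = p[1], r = p[2];
        int tr = (393 * r + 769 * g + 189 * b) / 1000;
        int tg = (349 * r + 686 * g + 168 * b) / 1000;
        int tb = (272 * r + 534 * g + 131 * b) / 1000;
        p[2] = (uint8_t)(tr > 255 ? 255 : tr);
        p[1] = (uint8_t)(tg > 255 ? 255 : tg);
        p[0] = (uint8_t)(tb > 255 ? 255 : tb);
    }
}
```

Both helpers walk the buffer a pixel at a time; for full-resolution frames at 30 fps you would want to at least use the `bytesPerRow` stride rather than assume a packed buffer, which this sketch does for brevity.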

Tried your Live Effects Cam and it looks great; it has more of the features I was trying to implement. Well done! Just surprised it's free. – BlueDolphin 2010-12-30 03:36:41


Thanks. At 99 cents I was getting 50 downloads a day; as a free app I average over 1,500 downloads a day. I'm releasing an update that offers the most requested new features as in-app purchases. I'd suggest that anyone developing a new app today go with a free app plus in-app purchases. – 2011-01-02 02:26:39

1

You can overlay a view on top of the image and change its blend mode to get the black-and-white effect.

Check out Apple's QuartzDemo, in particular the Blending Modes example in that demo.
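For intuition about why a blend-mode overlay desaturates an image: compositing a solid neutral-gray layer with a color-replacing blend mode keeps each backdrop pixel's luma but takes the overlay's (zero) saturation. A minimal sketch of that per-pixel result in C; this is an illustration of the math, not the Quartz API, and `desaturateRGBA` is a hypothetical helper:

```c
#include <stdint.h>
#include <stddef.h>

/* Per-pixel result of blending a neutral gray layer over an RGBA
   image with a luminosity-preserving, color-replacing blend mode:
   every channel collapses to the pixel's Rec. 601 luma. */
static void desaturateRGBA(uint8_t *rgba, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; i++, rgba += 4)
    {
        uint8_t r = rgba[0], g = rgba[1], b = rgba[2];
        uint8_t y = (uint8_t)((299 * r + 587 * g + 114 * b) / 1000);
        rgba[0] = rgba[1] = rgba[2] = y;  /* alpha untouched */
    }
}
```

In practice you would not do this by hand: in QuartzDemo the equivalent work is done by setting a blend mode on the context and filling, with the compositing handled by Quartz.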

1

Another way to do this would be to use AVFoundation to process each frame. I don't have much experience with it, but WWDC 2010's "Session 409 - Using the Camera with AVFoundation" video and its sample project should help with your problem.

That is, if you're okay with using iOS 4 classes.