2012-02-16 · 42 views · score 9

AVCaptureSession: specify resolution and quality of captured images (Obj-C, iPhone app)

Hello. I want to set up an AV capture session that uses the iPhone camera to capture images at a specific resolution (and, if possible, at a specific quality). Here is the code that sets up the AV session:

// Create and configure a capture session and start it running 
- (void)setupCaptureSession 
{ 
    NSError *error = nil; 

    // Create the session 
    self.captureSession = [[AVCaptureSession alloc] init]; 

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the 
    // chosen device. 
    captureSession.sessionPreset = AVCaptureSessionPresetMedium; 

    // Find a suitable AVCaptureDevice. Select by position rather than by
    // array index, since the order of the devices array is not guaranteed.
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *device = nil;
    AVCaptureDevicePosition wantedPosition =
        ([UserDefaults camera] == UIImagePickerControllerCameraDeviceFront)
            ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
    for (AVCaptureDevice *camera in cameras)
    {
        if (camera.position == wantedPosition)
        {
            device = camera;
            break;
        }
    }
    if (!device)
    {
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    // Create a device input with the device and add it to the session. 
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; 
    if (!input)
    {
        NSLog(@"PANIC: no media input: %@", error);
        return;
    }
    [captureSession addInput:input];

    // Create a VideoDataOutput and add it to the session 
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init]; 
    [captureSession addOutput:output]; 
    NSLog(@"connections: %@", output.connections); 

    // Configure your output. 
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL); 
    [output setSampleBufferDelegate:self queue:queue]; 
    dispatch_release(queue); 

    // Specify the pixel format 
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]; 


    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration (deprecated in iOS 5 in favour of the connection's
    // videoMinFrameDuration), e.g.:
    // output.minFrameDuration = CMTimeMake(1, 15);

    // Assign session to an ivar. 
    [self setSession:captureSession]; 
    [self.captureSession startRunning]; 
} 

And the setSession: method:

-(void)setSession:(AVCaptureSession *)session 
{ 
    NSLog(@"setting session..."); 
    self.captureSession=session; 
    NSLog(@"setting camera view"); 
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session]; 
    CGRect videoRect = CGRectMake(20.0, 20.0, 280.0, 255.0);
    previewLayer.frame = videoRect; // Position the preview layer within the view.
    [previewLayer setBackgroundColor:[[UIColor grayColor] CGColor]];
    [self.view.layer addSublayer:previewLayer];
} 

And the output method:

// Delegate routine that is called when a sample buffer was written 
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
    fromConnection:(AVCaptureConnection *)connection 
{ 
    //NSLog(@"captureOutput: didOutputSampleBufferFromConnection"); 

    // Create a UIImage from the sample buffer data 
    self.currentImage = [self imageFromSampleBuffer:sampleBuffer]; 

    //< Add your code here that uses the image > 
} 
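One caveat worth noting: this delegate is called on the dispatch queue passed to setSampleBufferDelegate:queue:, not on the main thread. If currentImage feeds a UI element, hop back to the main queue first (a sketch; imageView is a hypothetical outlet, not part of the original code):

```objc
// Inside captureOutput:didOutputSampleBuffer:fromConnection:
UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
dispatch_async(dispatch_get_main_queue(), ^{
    self.currentImage = image;       // touch UIKit-facing state on the main thread
    // self.imageView.image = image; // hypothetical image-view outlet
});
```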

// Create a UIImage from sample buffer data 
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{ 
    //NSLog(@"imageFromSampleBuffer: called"); 
    // Get a CMSampleBuffer's Core Video image buffer for the media data 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer 
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data 
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context 
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 


    // Free up the context and color space 
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace); 

    // Create an image object from the Quartz image 
    UIImage *image = [UIImage imageWithCGImage:quartzImage]; 

    // Release the Quartz image 
    CGImageRelease(quartzImage); 

    return (image); 
} 
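If "quality" in the question also means the compression quality of the stored image (as opposed to the capture resolution), that is applied after capture, for example with UIImageJPEGRepresentation (a sketch; the 0.8 factor is an arbitrary assumption):

```objc
// compressionQuality ranges from 0.0 (smallest file, worst quality)
// to 1.0 (least compression, best quality).
NSData *jpegData = UIImageJPEGRepresentation(self.currentImage, 0.8f);
```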

Everything here is quite standard. But where and what should I change in order to specify the resolution and quality of the captured images? Please help me.

Check out [this similar question](http://stackoverflow.com/questions/24758407/ios-capture-high-resolution-photo-while-using-a-low-avcapturesessionpreset-for-v/40609268#40609268). It might help. – 2016-11-15 12:00:44

Answers

11

Refer to the relevant part of Apple's guide on capturing still images, which lists the sizes you will get if you set one preset or another.

The parameter you should change is captureSession.sessionPreset.
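As a minimal sketch (the chosen presets are assumptions; only the AVFoundation API names come from the framework), you can guard the assignment with canSetSessionPreset: so you fall back gracefully on devices that don't support the requested resolution:

```objc
// Prefer an exact-resolution preset; fall back to a quality-based preset.
// -canSetSessionPreset: reports what the current device/inputs support.
NSString *preferred = AVCaptureSessionPreset640x480; // assumed target resolution
if ([self.captureSession canSetSessionPreset:preferred]) {
    self.captureSession.sessionPreset = preferred;
} else {
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
}
```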

I'm using a 'UISlider' in my app; the user will set a higher or lower value with it. But which preset values are higher? – Oleg 2012-02-16 14:47:56

'NSString *const AVCaptureSessionPresetPhoto; NSString *const AVCaptureSessionPresetHigh; NSString *const AVCaptureSessionPresetMedium; NSString *const AVCaptureSessionPresetLow; NSString *const AVCaptureSessionPreset320x240; NSString *const AVCaptureSessionPreset352x288; NSString *const AVCaptureSessionPreset640x480; NSString *const AVCaptureSessionPreset960x540; NSString *const AVCaptureSessionPreset1280x720;' – Oleg 2012-02-16 14:48:19
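One way to wire the UISlider mentioned above to these presets (a sketch; the ordering, the slider configuration, and the fallback behaviour are assumptions) is to keep them in an array ordered roughly from lowest to highest resolution and index it with the slider's value:

```objc
// Presets ordered roughly from lowest to highest resolution/quality.
NSArray *presets = [NSArray arrayWithObjects:
                    AVCaptureSessionPresetLow,
                    AVCaptureSessionPreset352x288,
                    AVCaptureSessionPresetMedium,
                    AVCaptureSessionPreset640x480,
                    AVCaptureSessionPreset1280x720,
                    AVCaptureSessionPresetPhoto,
                    nil];

// Assumes slider.minimumValue = 0 and slider.maximumValue = presets.count - 1.
NSUInteger index = (NSUInteger)roundf(slider.value);
NSString *preset = [presets objectAtIndex:index];
if ([self.captureSession canSetSessionPreset:preset]) {
    self.captureSession.sessionPreset = preset;
}
```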

Well, obviously the ones with better quality are the higher values. – Eugene 2012-02-16 20:02:27

0

Try something like this, where cx and cy are your custom resolution:

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:cx], AVVideoWidthKey,
                               [NSNumber numberWithInt:cy], AVVideoHeightKey,
                               nil];
_videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings]; 
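Note that an AVAssetWriterInput configured this way applies when writing a movie file through AVAssetWriter, not to the live session's preset; a minimal sketch of attaching it (the output URL is an assumption):

```objc
NSError *error = nil;
NSURL *outputURL = [NSURL fileURLWithPath:@"/tmp/out.mov"]; // assumed path
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
if ([writer canAddInput:_videoInput]) {
    [writer addInput:_videoInput];
}
```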