2013-08-27

I am developing an iPhone application that needs to pause and resume camera recording, so I am using AVFoundation instead of UIImagePickerController in order to switch the capture device while recording video.

My code is:

- (void)startup:(BOOL)isFrontCamera
{
    if (_session == nil)
    {
        NSLog(@"Starting up server");

        self.isCapturing = NO;
        self.isPaused = NO;
        _currentFile = 0;
        _discont = NO;

        // create capture session; default (back) camera for video input
        _session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        if (isFrontCamera)
        {
            // look up the front-facing camera instead
            for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo])
            {
                if (device.position == AVCaptureDevicePositionFront)
                {
                    cameraDevice = device;
                    break;
                }
            }
        }

        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:nil];
        [_session addInput:input];

        // audio input from default mic
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
        [_session addInput:micinput];

        // create an output for YUV output with self as delegate
        _captureQueue = dispatch_queue_create("uk.co.gdcl.cameraengine.capture", DISPATCH_QUEUE_SERIAL);
        AVCaptureVideoDataOutput *videoout = [[AVCaptureVideoDataOutput alloc] init];
        [videoout setSampleBufferDelegate:self queue:_captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
                                        kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        videoout.videoSettings = setcapSettings;
        [_session addOutput:videoout];
        _videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];

        // read back the dimensions the session actually chose
        NSDictionary *actual = videoout.videoSettings;
        _cy = [[actual objectForKey:@"Width"] integerValue];
        _cx = [[actual objectForKey:@"Height"] integerValue];

        AVCaptureAudioDataOutput *audioout = [[AVCaptureAudioDataOutput alloc] init];
        [audioout setSampleBufferDelegate:self queue:_captureQueue];
        [_session addOutput:audioout];
        _audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];

        [_session startRunning];

        _preview = [AVCaptureVideoPreviewLayer layerWithSession:_session];
        _preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    }
}

The problem I am facing is when I switch to the front camera. When I call the method above after changing the camera to the front one, the preview layer gets stuck and there is no preview. My doubt is: "Can we change the capture device in the middle of a capture session?" Please guide me on where I went wrong, or suggest a solution for how to switch between the front and back cameras while recording.

Thanks in advance.

Answers

1

You cannot change the captureDevice mid-session. Only one capture session can run at a time. You can end the current session and create a new one, but there will be a slight lag (maybe a second or two, depending on your CPU load).

I wish Apple would allow multiple sessions, or at least multiple devices per session... but they don't.

0

Have you considered running multiple sessions and then processing the video files afterwards to join them into one?
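If you go that route, the recorded clips can be concatenated afterwards with AVMutableComposition. A minimal sketch (`clipURLs` and `outputURL` are assumptions, not part of the code above):

```objc
// Sketch: stitch previously recorded clips into a single movie.
// clipURLs (NSArray of NSURL) and outputURL are assumed to exist.
AVMutableComposition *composition = [AVMutableComposition composition];
CMTime insertPoint = kCMTimeZero;
for (NSURL *url in clipURLs) {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    // append the whole clip at the current end of the composition
    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                         ofAsset:asset
                          atTime:insertPoint
                           error:nil];
    insertPoint = CMTimeAdd(insertPoint, asset.duration);
}

AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    // check exporter.status / exporter.error here
}];
```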

4

Yes, you can. There are a few things you need to take care of:

  1. You need to record using AVCaptureVideoDataOutput and its delegate.
  2. Make sure you remove the previous deviceInput before adding the new deviceInput.
  3. You must remove and re-create the AVCaptureVideoDataOutput as well.

I am using these two functions right now, and they work while the session is running:

- (void)configureVideoWithDevice:(AVCaptureDevice *)camera {
    [_session beginConfiguration];

    // remove the previous video input before adding the new one
    [_session removeInput:_videoInputDevice];
    _videoInputDevice = nil;

    _videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([_session canAddInput:_videoInputDevice]) {
        [_session addInput:_videoInputDevice];
    }

    // the video data output must be removed and re-created as well
    [_session removeOutput:_videoDataOutput];
    _videoDataOutput = nil;

    _videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [_videoDataOutput setSampleBufferDelegate:self queue:_outputQueueVideo];
    NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
                                    kCVPixelBufferPixelFormatTypeKey,
                                    nil];
    _videoDataOutput.videoSettings = setcapSettings;
    [_session addOutput:_videoDataOutput];
    _videoConnection = [_videoDataOutput connectionWithMediaType:AVMediaTypeVideo];

    if ([_videoConnection isVideoOrientationSupported]) {
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    }

    [_session commitConfiguration];
}

- (void)configureAudioWithDevice:(AVCaptureDevice *)microphone {
    [_session beginConfiguration];

    // remove the previous audio input before adding the new one
    [_session removeInput:_audioInputDevice];
    _audioInputDevice = nil;

    _audioInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:microphone error:nil];
    if ([_session canAddInput:_audioInputDevice]) {
        [_session addInput:_audioInputDevice];
    }

    [_session removeOutput:_audioDataOutput];
    _audioDataOutput = nil;

    _audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
    [_audioDataOutput setSampleBufferDelegate:self queue:_outputQueueAudio];
    [_session addOutput:_audioDataOutput];
    _audioConnection = [_audioDataOutput connectionWithMediaType:AVMediaTypeAudio];

    [_session commitConfiguration];
}
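For illustration, switching cameras mid-recording then reduces to finding the desired device and handing it to configureVideoWithDevice:. The device lookup below is a sketch (dispatch it onto your session queue if you manage one):

```objc
// Sketch: toggle between the front and back camera while the session runs.
- (void)switchCameraToFront:(BOOL)useFront {
    AVCaptureDevicePosition wanted = useFront ? AVCaptureDevicePositionFront
                                              : AVCaptureDevicePositionBack;
    AVCaptureDevice *target = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == wanted) {
            target = device;
            break;
        }
    }
    if (target) {
        // reconfigures the running session in place
        [self configureVideoWithDevice:target];
    }
}
```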
+0

This approach works brilliantly; my only problem is that the video and audio end up out of sync. Do the audio and video need to be configured at the same time? – HighFlyingFantasy

+0

@HighFlyingFantasy There is definitely a way, but it's a bit complicated. You have to manually manipulate the timing information of the audio sample buffers to match the video. Because we re-create the audioOutput, its timing information starts from zero each time. You must track how long audio has been recorded and adjust the audio sample buffer's timestamps before writing it to the file. Geraint Davies implemented this technique in his example of pausing and resuming video; a mix of the two should work for you. http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html – hatebyte
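As a sketch of the retiming described above (closely following the adjustTime:by: idea in Geraint Davies' pause/resume example; `_timeOffset` is an assumed CMTime ivar that you accumulate across reconfigurations), a sample buffer can be copied with shifted timestamps before being written:

```objc
// Sketch: return a copy of `sample` with its timing shifted back by `offset`,
// so audio written after a reconfiguration lines up with the video track.
- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset {
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);
    CMSampleTimingInfo *timingInfo = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, timingInfo, &count);
    for (CMItemCount i = 0; i < count; i++) {
        timingInfo[i].presentationTimeStamp =
            CMTimeSubtract(timingInfo[i].presentationTimeStamp, offset);
        timingInfo[i].decodeTimeStamp =
            CMTimeSubtract(timingInfo[i].decodeTimeStamp, offset);
    }
    CMSampleBufferRef adjusted;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample,
                                          count, timingInfo, &adjusted);
    free(timingInfo);
    return adjusted; // caller releases
}
```

In the audio delegate callback you would pass each buffer through `[self adjustTime:sampleBuffer by:_timeOffset]` before appending it to the writer input.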