
I am using AVCaptureSession, AVCaptureDeviceInput, and AVCaptureVideoDataOutput to capture images from the iPhone camera, and I need to stop the iOS camera capture session.

Image capture is implemented as:

    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];
    // Use a separate serial dispatch queue for frame processing so the main queue is not blocked.
    dispatch_queue_t im_processingQueue = dispatch_queue_create("im_processing queue", DISPATCH_QUEUE_SERIAL);
    [self setIm_processingQueue:im_processingQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        NSError *error = nil;

        AVCaptureDevice *videoDevice = [RecordViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];

            dispatch_async(dispatch_get_main_queue(), ^{
                // Dispatch to the main queue: AVCaptureVideoPreviewLayer is the backing layer
                // for the preview view, and UIView may only be manipulated on the main thread.
                // Exception: video orientation changes on the preview layer's connection do not
                // need to be serialized with other session manipulation.
                [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[[UIApplication sharedApplication] statusBarOrientation]];
            });
        }

        AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }

        AVCaptureVideoDataOutput *vid_Output = [[AVCaptureVideoDataOutput alloc] init];
        [vid_Output setSampleBufferDelegate:self queue:im_processingQueue];
        vid_Output.alwaysDiscardsLateVideoFrames = YES;
        // Store frames in BGRA (it is supposed to be faster to process).
        NSDictionary *videoSettings = @{(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]};
        [vid_Output setVideoSettings:videoSettings];

        if ([session canAddOutput:vid_Output])
        {
            [session addOutput:vid_Output];
            AVCaptureConnection *connection = [vid_Output connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
            {
                //[connection setEnablesVideoStabilizationWhenAvailable:YES]; // older API, replaced below
                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
            }
            [self setVid_Output:vid_Output];
        }
    });
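
Note that the snippet above references a session object whose creation is not shown in the post. Presumably it is set up earlier, AVCam-style; a minimal sketch of the assumed setup (the property names here are assumptions, not the poster's code):

    // Assumed setup, not shown in the post: the session and preview layer are
    // presumably wired up before the sessionQueue block runs, as in Apple's AVCam sample.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [self setSession:session];
    [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] setSession:session];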

Inside viewWillAppear, the capture session is started as:

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"vid_Output.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak RecordViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            RecordViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restart the session, since it must have stopped due to an error.
                [[strongSelf session] startRunning];
            });
        }]];
        [[self session] startRunning];
    });
}
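
For completeness, the two KVO contexts referenced above are presumably declared as static pointers at file scope, as in Apple's AVCam sample (assumed, since the declarations are not in the post):

    // Assumed declarations (AVCam-style), not shown in the post:
    static void *SessionRunningAndDeviceAuthorizedContext = &SessionRunningAndDeviceAuthorizedContext;
    static void *RecordingContext = &RecordingContext;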

It is then stopped with:

- (void)stopCapturingCameraImages
{
    dispatch_async([self sessionQueue], ^{
        [[self session] stopRunning];

        [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
        [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];

        [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
        [self removeObserver:self forKeyPath:@"vid_Output.recording" context:RecordingContext];
    });
}

The problem is in removing the observers:

[self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext]; 

[self removeObserver:self forKeyPath:@"vid_Output.recording" context:RecordingContext]; 

The program crashes after these two removeObserver calls run. What could be wrong?

Edit:

stopCapturingCameraImages is called from the capture delegate callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        // img is presumably derived from sampleBuffer (the conversion is elided in the post)
        [_processing Searchobject_using_CPU_CascadeClassifier_for:img with_return_Rect:_rects];

        dispatch_async(dispatch_get_main_queue(), ^{
            if (_rects.count > 0) {
                [self stopCapturingCameraImages];
            }
        });
    }
}

Edit 1:

Following @SwiftArchitect's suggestion, I added an if ([[self session] isRunning]) check, and it now works. I implemented it as:

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self stopCapturingCameraImages];
}

- (void)stopCapturingCameraImages
{
    dispatch_async([self sessionQueue], ^{
        if ([[self session] isRunning]) {
            [[self session] stopRunning];

            [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
            [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];

            [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
            [self removeObserver:self forKeyPath:@"vid_Output.recording" context:RecordingContext];
        }
    });
}

@SwiftArchitect Yes, when I call stopCapturingCameraImages from viewWillDisappear, the program does not crash. But I also call stopCapturingCameraImages from another place: once an object is detected in an image, I need to stop the running session to start processing that particular image. That call is made in another function, and it is the one that crashes the program, not the call from viewWillDisappear. – batuman


@SwiftArchitect It is updated in the edit. Thanks – batuman

Answer


By the time:

dispatch_async([self sessionQueue], ^{ 
    // ... 

    [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext]; 

    [self removeObserver:self forKeyPath:@"vid_Output.recording" context:RecordingContext]; 
}); 

is executed, self (the UIViewController) may already have executed its viewWillDisappear and removed the observers.

The order in which blocks dispatched on dispatch_get_main_queue and sessionQueue execute is not necessarily what you expect, or even predictable. In particular, if the tear-down block has already run once, a second pass calls removeObserver:forKeyPath: on an object that is no longer registered as an observer, which raises an exception.


The fix may be as simple as adding a check like if ([[self session] isRunning]) before executing the removeObserver calls, short of adding a semaphore.
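
Since stopCapturingCameraImages is reachable from two places (viewWillDisappear and the capture delegate), another option is a plain BOOL flag that makes the tear-down idempotent. A sketch, using a hypothetical _observersRegistered ivar; because every access to the flag happens on the serial sessionQueue, no semaphore or lock is needed:

    // Hypothetical guard: set _observersRegistered = YES at the end of the
    // addObserver block in viewWillAppear, and clear it here. All reads and
    // writes happen on the serial sessionQueue, so no extra locking is needed.
    - (void)stopCapturingCameraImages
    {
        dispatch_async([self sessionQueue], ^{
            if (!_observersRegistered) {
                return; // observers already removed; nothing left to tear down
            }
            _observersRegistered = NO;

            [[self session] stopRunning];

            [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
            [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];

            [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
            [self removeObserver:self forKeyPath:@"vid_Output.recording" context:RecordingContext];
        });
    }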