2012-01-06

I want to make a camera app using AVCaptureSession. Right now I just want to see whether the video input works, but it looks like there is no input at all, and I can't figure out why. No AVCapture session is found, and the console prints "Couldn't add video input".

- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 

    session = [[AVCaptureSession alloc] init]; 

    [self addVideoPreviewLayer]; 

    CGRect layerRect = [[[self view] layer] bounds]; 

    [[self previewLayer] setBounds:layerRect]; 
    [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect), 
                    CGRectGetMidY(layerRect))]; 
    [[[self view] layer] addSublayer:[self previewLayer]]; 

    UIButton *myButton = [UIButton buttonWithType:UIButtonTypeRoundedRect]; 
    myButton.frame = CGRectMake(80, 320, 200, 44); 
    [myButton setTitle:@"Click Me!" forState:UIControlStateNormal]; 
    [myButton addTarget:self action:@selector(scanButtonPressed) forControlEvents:UIControlEventTouchDown]; 
    [self.view addSubview:myButton]; 
} 

-(void)addVideoPreviewLayer 
{ 
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self session]] autorelease]]; 
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill]; 
} 

-(void) addVideoInput 
{ 
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 
    if (videoDevice) 
    { 
     NSError *error; 
     AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error]; 
     if (!error) 
     { 
      if ([[self session] canAddInput:videoIn]) 
       [[self session] addInput:videoIn]; 
      else 
       NSLog(@"Couldn't add video input");  
     } 
     else 
      NSLog(@"Couldn't create video input"); 
    } 
    else 
     NSLog(@"Couldn't create video capture device"); 
} 

-(IBAction)scanButtonPressed 
{ 
    [self addVideoInput]; 
} 

What is the result (console output) of this code? – Till 2012-01-06 21:32:57


@Till "Couldn't add video input". – ilaunchpad 2012-01-06 21:43:40

Answer


Here's how I do it. This is condensed from several functions, so it may not be compilable as-is, and most of the error handling has been removed.

captureSession = [[AVCaptureSession alloc] init]; 
captureSession.sessionPreset = AVCaptureSessionPresetMedium; 

AVCaptureDevice *videoDevice; 
videoDevice = [self frontFacingCamera]; 
if (videoDevice == nil) { 
    videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];  
} 

if (videoDevice) { 
    NSError *error; 
    videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error]; 

    if (videoInput) 
        [captureSession addInput:videoInput]; 
} 

videoOutput = [[AVCaptureVideoDataOutput alloc] init]; 

[videoOutput setAlwaysDiscardsLateVideoFrames:NO]; 

AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo]; 

if (conn.supportsVideoMinFrameDuration) 
    conn.videoMinFrameDuration = CMTimeMake(1, frameRate); 
if (conn.supportsVideoMaxFrameDuration) 
    conn.videoMaxFrameDuration = CMTimeMake(1, frameRate); 

NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]; 
[videoOutput setVideoSettings:videoSettings]; 
[videoOutput setSampleBufferDelegate:self queue:capture_queue]; 

if ([captureSession canAddOutput:videoOutput]) 
    [captureSession addOutput:videoOutput]; 
else 
    NSLog(@"Couldn't add video output");  

[self.captureSession startRunning]; 

previewLayer.session = captureSession; 
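Applying the same idea back to the question's code, here is a minimal sketch of what `viewDidLoad` might look like, assuming the same `session` and `previewLayer` properties as in the question and a device with a camera (the simulator has none). The key differences from the original are that the video input is added once, up front, rather than on each button press, and that `-startRunning` is actually called:

```objc
- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 

    session = [[AVCaptureSession alloc] init]; 
    session.sessionPreset = AVCaptureSessionPresetMedium; 

    [self addVideoInput];        // add the camera input first... 
    [self addVideoPreviewLayer]; // ...then attach the preview layer 

    CGRect layerRect = [[[self view] layer] bounds]; 
    [[self previewLayer] setBounds:layerRect]; 
    [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect), 
                                                 CGRectGetMidY(layerRect))]; 
    [[[self view] layer] addSublayer:[self previewLayer]]; 

    [session startRunning]; 
} 
```

Note also that wiring `addVideoInput` to the button means a second tap calls it again on a session that already has the input, so `canAddInput:` returns NO and you get exactly the "Couldn't add video input" log from the question.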

Thanks. I changed a few things and it works. – ilaunchpad 2012-01-09 16:42:57