
AVFoundation - Retiming CMSampleBufferRef video output

First time asking a question here. I hope the post is clear and the sample code is formatted properly.

I'm experimenting with AVFoundation and time-lapse photography.

My goal is to grab every Nth frame from the video camera of an iOS device (my iPod touch, 4th generation) and write each of those frames out to a file to create a time-lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter, and AVAssetWriterInput.

The problem is that if I use the CMSampleBufferRef passed to

captureOutput:didOutputSampleBuffer:fromConnection:

then the playback of each frame lasts the length of time between the original input frames. The frame rate comes out at 1 fps. I'm looking to get 30 fps.
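
To illustrate what I'm after (a conceptual sketch with made-up numbers, not code from my project): if I keep one frame out of every 30 captured, the k-th written frame should carry a presentation time of k/30 of a second instead of the camera's native timestamp.

// Hypothetical sketch: retime the k-th kept frame to k/30 s, so frames
// captured one second apart play back 1/30 s apart (i.e. at 30 fps).
CMTime retimedPTS = CMTimeMake(writtenFrames, 30);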

I've tried using

CMSampleBufferCreateCopyWithNewTiming()

but then, after 13 frames have been written to the file,

captureOutput:didOutputSampleBuffer:fromConnection:

stops being called. The interface is still live and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back the way I want, at 30 fps, but it has only those 13 frames.

How can I accomplish my goal of 30 fps playback? How can I tell where the app is getting lost, and why?

I've put in a flag called useNativeTime so I can test both cases. When it's set to YES, I get all the frames I'm interested in because the callback doesn't "drop out." When I set the flag to NO, only 13 frames ever get processed and the method is never called again. As mentioned above, the video plays back in both cases.

Thanks in advance for any help.

Here is where I'm trying to do the retiming.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
    BOOL useNativeTime = NO; 
    BOOL appendSuccessFlag = NO; 

    //NSLog(@"in captureOutput sample buffer method"); 
    if(!CMSampleBufferDataIsReady(sampleBuffer)) 
    { 
     NSLog(@"sample buffer is not ready. Skipping sample"); 
     //CMSampleBufferInvalidate(sampleBuffer); 
     return; 
    } 

    if (! [inputWriterBuffer isReadyForMoreMediaData]) 
    { 
     NSLog(@"Not ready for data."); 
    } 
    else { 
     // Write every first frame of n frames (30 native from camera). 
     intervalFrames++; 
     if (intervalFrames > 30) { 
      intervalFrames = 1; 
     } 
     else if (intervalFrames != 1) { 
      //CMSampleBufferInvalidate(sampleBuffer); 
      return; 
     } 

     // Need to initialize start session time. 
     if (writtenFrames < 1) { 
      if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
      else imageSourceTime = CMTimeMake(0 * 20, 600); //CMTimeMake(1,30); 
      [outputWriter startSessionAtSourceTime: imageSourceTime]; 
      NSLog(@"Starting CMtime"); 
      CMTimeShow(imageSourceTime); 
     } 

     if (useNativeTime) { 
      imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
      CMTimeShow(imageSourceTime); 
      // CMTime myTiming = CMTimeMake(writtenFrames * 20,600); 
      // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried but has no effect. 
      appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer]; 
     } 
     else { 
      CMSampleBufferRef newSampleBuffer; 
      CMSampleTimingInfo sampleTimingInfo; 
      sampleTimingInfo.duration = CMTimeMake(20,600); 
      sampleTimingInfo.presentationTimeStamp = CMTimeMake((writtenFrames + 0) * 20,600); 
      sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid; 
      OSStatus myStatus; 

      //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer)); 
      myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, 
                  sampleBuffer, 
                  1, 
                  &sampleTimingInfo, // maybe a little confused on this param. 
                  &newSampleBuffer); 
      // These confirm the good health of our newSampleBuffer. 
      if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus); 
      if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!"); 

      // No effect. 
      //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer); // How is this different; CMSampleBufferSetDataReady ? 
      //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus); 

      imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer); 
      CMTimeShow(imageSourceTime); 
      appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer]; 
      //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe action. WTF does it do? Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe? 
      //CFRelease(sampleBuffer); // - Not surprisingly - "EXC_BAD_ACCESS" 
     } 

     if (!appendSuccessFlag) 
     { 
      NSLog(@"Failed to append pixel buffer"); 
     } 
     else { 
      writtenFrames++; 
      NSLog(@"writtenFrames: %i", writtenFrames); 
      } 
    } 

    //[self displayOuptutWritterStatus]; // Expect and see AVAssetWriterStatusWriting. 
} 

My setup routine.

- (IBAction) recordingStartStop: (id) sender 
{ 
    NSError * error; 

    if (self.isRecording) { 
     NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~"); 
     self.isRecording = NO; 
     [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal]; 

     //[self.captureSession stopRunning]; 
     [inputWriterBuffer markAsFinished]; 
     [outputWriter endSessionAtSourceTime:imageSourceTime]; 
     [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs. 
     NSLog(@"finished CMtime"); 
     CMTimeShow(imageSourceTime); 

     // Really, I should loop through the outputs and close all of them or target specific ones. 
     // Since I'm only recording video right now, I feel safe doing this. 
     [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]]; 

     [videoOutput release]; 
     [inputWriterBuffer release]; 
     [outputWriter release]; 
     videoOutput = nil; 
     inputWriterBuffer = nil; 
     outputWriter = nil; 
     NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~"); 
     NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum."); 
     NSLog(@"filePath: %@", [projectPaths movieFilePath]); 
     if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) { 
      NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum."); 
      UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError: contextInfo:), nil); 
     } 
     NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~"); 
    } 
    else { 
     NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~"); 
     projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"]; 
     intervalFrames = 30; 

     videoOutput = [[AVCaptureVideoDataOutput alloc] init]; 
     NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease]; 
     NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
     NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]; 
     [cameraVideoSettings setValue: value forKey: key]; 
     [videoOutput setVideoSettings: cameraVideoSettings]; 
     [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps 
     [videoOutput setAlwaysDiscardsLateVideoFrames: YES]; 

     queue = dispatch_queue_create("cameraQueue", NULL); 
     [videoOutput setSampleBufferDelegate: self queue: queue]; 
     dispatch_release(queue); 

     NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease]; 
     [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey]; 
     [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming 
     [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey]; 

     NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease]; 
     [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey]; 
     //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey]; 
     [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey]; 

     inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings]; 
     [inputWriterBuffer retain]; 
     inputWriterBuffer.expectsMediaDataInRealTime = YES; 

     outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error]; 
     [outputWriter retain]; 

     if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:"); 
     if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer]; 
     else NSLog(@"can not add input"); 

     if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported"); 

     if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput]; 
     else NSLog(@"could not addOutput: videoOutput to captureSession"); 

     //[self.captureSession startRunning]; 
     self.isRecording = YES; 
     [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal]; 

     writtenFrames = 0; 
     imageSourceTime = kCMTimeZero; 
     [outputWriter startWriting]; 
     //[outputWriter startSessionAtSourceTime: imageSourceTime]; 
     NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~"); 
     NSLog (@"recording to fileURL: %@", [projectPaths movieURLPath]); 
    } 

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO"); 

    [self displayOuptutWritterStatus]; 
} 

Answers


With a little more searching and reading, I have a working solution. Not sure it's the best way, but so far, so good.

In my setup area I now set up an AVAssetWriterInputPixelBufferAdaptor. The added code looks like this.

inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor 
      assetWriterInputPixelBufferAdaptorWithAssetWriterInput: inputWriterBuffer 
      sourcePixelBufferAttributes: nil]; 
[inputWriterBufferAdaptor retain]; 

To fully understand the code below, I also have these three lines in the setup method.

fpsOutput = 30; // Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97; 
cmTimeSecondsDenominatorTimescale = 600 * 100000; //To more precisely handle 29.97. 
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale/fpsOutput; 
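
As a quick sanity check of that arithmetic (my own worked numbers, not from any documentation): with a timescale of 600 * 100000 = 60,000,000, fpsOutput = 30 gives cmTimeNumeratorValue = 2,000,000 (exactly 1/30 s per frame), and fpsOutput = 29.97 gives cmTimeNumeratorValue = 2,002,002 (within a microsecond of 1/29.97 s).

// Sanity check of the timing math above.
// 30 fps    -> 2,000,000/60,000,000 = 0.0333333 s per frame
// 29.97 fps -> 2,002,002/60,000,000 = 0.0333667 s per frame
CMTimeShow(CMTimeMake(cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale));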

Then, instead of applying a retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Note the adaptor's withPresentationTime argument. By passing it my custom value, I get the correct timing I was after.

CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer(sampleBuffer); 
imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale); 
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime]; 

There may be some gain to be had from the AVAssetWriterInputPixelBufferAdaptor.pixelBufferPool property, but I haven't figured that out yet.


OK, I found the bug in my first post.

When you use

myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, 
               sampleBuffer, 
               1, 
               &sampleTimingInfo, 
               &newSampleBuffer); 

you need to balance that Create call with a CFRelease(newSampleBuffer);.
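
So the append path from my question, fixed up, looks roughly like this (a sketch of the pattern rather than a drop-in replacement):

CMSampleBufferRef newSampleBuffer = NULL;
OSStatus myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                          sampleBuffer,
                                                          1,
                                                          &sampleTimingInfo,
                                                          &newSampleBuffer);
if (myStatus == noErr && newSampleBuffer != NULL) {
    appendSuccessFlag = [inputWriterBuffer appendSampleBuffer: newSampleBuffer];
    // Balance the Create above. Without this release the capture pipeline
    // runs out of buffers, which is why my callback died after ~13 frames.
    CFRelease(newSampleBuffer);
}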

The same idea holds when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. After calling the appendPixelBuffer:withPresentationTime: method, you release it with CVPixelBufferRelease(yourCVPixelBufferRef);.
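
For example (a hypothetical sketch of the pool pattern; I haven't verified it's the best approach): a buffer you create from the adaptor's pool is one you own, so you release it after appending. Note that, as I read the docs, pixelBufferPool is NULL until the writer has started writing.

CVPixelBufferRef poolBuffer = NULL;
CVReturn poolStatus = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                         inputWriterBufferAdaptor.pixelBufferPool,
                         &poolBuffer);
if (poolStatus == kCVReturnSuccess) {
    // ... copy or render the frame's pixels into poolBuffer here ...
    [inputWriterBufferAdaptor appendPixelBuffer: poolBuffer
                           withPresentationTime: imageSourceTime];
    CVPixelBufferRelease(poolBuffer); // we created it, so we release it
}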

Hopefully this helps someone else.


Thanks, this saved me a lot of pain. – 2011-03-11 02:12:36


You're welcome. Glad to hear the post helped. – 2011-03-11 08:27:33


Thanks! This really saved my day. It would have taken a lot more tinkering if I hadn't found this, your first but very helpful question. – CipherCom 2013-05-20 15:31:10