
I am writing an MP4 video file with an AVAssetWriter, using an AVAssetWriterInputPixelBufferAdaptor. The AVAssetWriter sometimes fails with the status AVAssetWriterStatusFailed, seemingly at random.

The source is a video from a UIImagePickerController, either freshly captured with the camera or taken from the asset library. The quality is currently UIImagePickerControllerQualityTypeMedium.
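For context, the picker is configured roughly like this (a minimal sketch; the `picker` variable and the presentation/delegate wiring here are illustrative, not my exact code):

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera; // or ...SourceTypePhotoLibrary for the asset library
    picker.mediaTypes = @[(NSString *)kUTTypeMovie];             // video only
    picker.videoQuality = UIImagePickerControllerQualityTypeMedium;
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];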

The writer fails intermittently. When it does, its status is AVAssetWriterStatusFailed and the AVAssetWriter object's error property is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" 
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210), 
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)", 
NSLocalizedDescription=The operation could not be completed 

The error occurs in roughly 20% of the runs, and it seems to fail more often on the iPhone 4/4S than on the iPhone 5. (For what it's worth, the OSStatus -536870210 is 0xE00002BE, which appears to match IOKit's kIOReturnNoResources, i.e. some system resource being exhausted.)

It also shows up more often when the source video quality is higher. With UIImagePickerControllerQualityTypeLow the error rarely occurs; with UIImagePickerControllerQualityTypeHigh it occurs a bit more frequently.

I have also noticed something else: the failures seem to come in waves. When it fails, the following runs often fail too, even if I delete the app and reinstall it. That makes me wonder whether my program leaks some memory, and whether that memory could somehow persist after the app is killed (is that even possible?).

Here is the code I use to render my video:

- (void)writeVideo 
{ 
    offlineRenderingInProgress = YES; 

/* --- Writer Setup --- */ 

    [locationQueue cancelAllOperations]; 

    [self stopWithoutRewinding]; 

    NSError *writerError = nil; 

    BOOL success; 

    success = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil]; 

    // DLog(@"Url: %@, success: %i", self.outputURL, success); 

    writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError]; 
    //writer.shouldOptimizeForNetworkUse = NO; 

    if (writerError) { 
     DLog(@"Writer error: %@", writerError); 
     return; 
    } 

    float bitsPerPixel; 
    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0])); 
    int numPixels = dimensions.width * dimensions.height; 
    int bitsPerSecond; 

    // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate 
    if (numPixels < (640 * 480)) 
     bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low. 
    else 
     bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh. 

    bitsPerSecond = numPixels * bitsPerPixel; 

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
              AVVideoCodecH264, AVVideoCodecKey, 
              [NSNumber numberWithInteger:videoSize.width], AVVideoWidthKey, 
              [NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey, 
              [NSDictionary dictionaryWithObjectsAndKeys: 
              [NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey, // apply the bitrate computed above 
              [NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey, 
              nil], AVVideoCompressionPropertiesKey, 
              nil]; 

    writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings]; 
    writerVideoInput.transform = movie.preferredTransform; 
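    // NB: expectsMediaDataInRealTime = YES is intended for live capture sources; 
    // for an offline transcode like this one, Apple's docs recommend NO. 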
    writerVideoInput.expectsMediaDataInRealTime = YES; 
    [writer addInput:writerVideoInput]; 

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: 
                 [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil]; 

    writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput 
                         sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary]; 
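    // Note: the adaptor's 32ARGB pixel buffer pool is never actually used below; 
    // output buffers are created manually as 32BGRA instead. 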
    BOOL couldStart = [writer startWriting]; 

    if (!couldStart) { 
     DLog(@"Could not start AVAssetWriter!"); 
     abort = YES; 
     [locationQueue cancelAllOperations]; 
     return; 
    } 

    [self configureFilters]; 

    CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}]; 


    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    if (!self.canEdit) { 
     [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES]; 
    } else { 
     [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES]; 
    } 

    CMTime startOffset = reader.timeRange.start; 

    DLog(@"startOffset: %llu", startOffset.value); 

    [self.thumbnailEditView removeFromSuperview]; 
    // self.thumbnailEditView = nil; 

    [glLayer removeFromSuperlayer]; 
    glLayer = nil; 

    [playerView removeFromSuperview]; 
    playerView = nil; 

    glContext = nil; 



    [writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ 

     @try { 


     BOOL didWriteSomething = NO; 

     DLog(@"Preparing to write..."); 

     while ([writerVideoInput isReadyForMoreMediaData]) { 

      if (abort) { 
       NSLog(@"Abort == YES"); 
       [locationQueue cancelAllOperations]; 
       [writerVideoInput markAsFinished]; 
       videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
       return; 
      } 

      if (writer.status == AVAssetWriterStatusFailed) { 
       DLog(@"Writer.status: AVAssetWriterStatusFailed, error: %@", writer.error); 
       DLog(@"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]); 

       [[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:@"QualityOverride"]; 
       [[NSUserDefaults standardUserDefaults] synchronize]; 

       abort = YES; 
       [locationQueue cancelAllOperations]; 
       videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
       return; 
      } 

      DLog(@"Writing started..."); 

      CMSampleBufferRef buffer = nil; 

      if (reader.status != AVAssetReaderStatusUnknown) { 

       if (reader.status == AVAssetReaderStatusReading) { 
        buffer = [readerVideoOutput copyNextSampleBuffer]; 
        if (didWriteSomething == NO) { 
         DLog(@"Copying sample buffers..."); 
        } 
       } 

       if (!buffer) { 

        [writerVideoInput markAsFinished]; 

        DLog(@"Finished..."); 

        CGColorSpaceRelease(colorSpace); 

        [self offlineRenderingDidFinish]; 


        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{ 

         [writer finishWriting]; 
         if (writer.error != nil) { 
          DLog(@"Error: %@", writer.error); 
         } else { 
          DLog(@"Succes!"); 
         } 

         if (writer.status == AVAssetWriterStatusCompleted) { 

          videoConvertCompletionBlock(YES, nil); 
         } 

         else { 
          abort = YES; 
          videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
         } 

        }); 


        return; 
       } 

       didWriteSomething = YES; 
      } 
      else { 

       DLog(@"Still waiting..."); 
       //Reader just needs a moment to get ready... 
       continue; 
      } 

      CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer); 

      if (pixelBuffer == NULL) { 
       DLog(@"Pixelbuffer == NULL"); 
       CFRelease(buffer); // don't leak the sample buffer when skipping a frame 
       continue; 
      } 

      //DLog(@"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer)); 

      //NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace]; 

      CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil]; 

      CIImage *outputImage = [self filteredImageWithImage:ciimage]; 


      CVPixelBufferRef outPixelBuffer = NULL; 
      CVReturn status; 

      CFDictionaryRef empty; // empty value for attr value. 
      CFMutableDictionaryRef attrs; 
      empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary 
             NULL, 
             NULL, 
             0, 
             &kCFTypeDictionaryKeyCallBacks, 
             &kCFTypeDictionaryValueCallBacks); 

      attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 
               1, 
               &kCFTypeDictionaryKeyCallBacks, 
               &kCFTypeDictionaryValueCallBacks); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferIOSurfacePropertiesKey, 
           empty); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferCGImageCompatibilityKey, 
           (__bridge const void *)([NSNumber numberWithBool:YES])); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferCGBitmapContextCompatibilityKey, 
           (__bridge const void *)([NSNumber numberWithBool:YES])); 


      status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer); 

      // The attribute dictionaries are copied by CVPixelBufferCreate; release them 
      // here, or they leak on every frame. 
      CFRelease(attrs); 
      CFRelease(empty); 

      //DLog(@"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer)); 

      if (status != kCVReturnSuccess) { 
       DLog(@"Couldn't allocate output pixelBufferRef!"); 
       CFRelease(buffer); 
       continue; 
      } 

      [offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace]; 

      CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer); 
      CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset); 
      CMTime duration = reader.timeRange.duration; 
      if (CMTIME_IS_POSITIVE_INFINITY(duration)) { 
       duration = movie.duration; 
      } 
      CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default); 

      float durationFloat = (float)durationConverted.value; 
      float progress = ((float) currentTime.value)/durationFloat; 

      //DLog(@"duration : %f, progress: %f", durationFloat, progress); 

      [self updateOfflineRenderProgress:progress]; 

      if (pixelBuffer != NULL && writerVideoInput.readyForMoreMediaData) { 
       [writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime]; 
      } else { 
       CFRelease(buffer); 
       CVPixelBufferRelease(outPixelBuffer); 
       continue; // don't leak the buffers when the input isn't ready 
      } 

      if (writer.status == AVAssetWriterStatusWriting) { 
       DLog(@"Writer.status: AVAssetWriterStatusWriting"); 
      } 

      CFRelease(buffer); 
      CVPixelBufferRelease(outPixelBuffer); 
     } 

     } 

     @catch (NSException *exception) { 
      DLog(@"Catching exception: %@", exception); 
     } 

    }]; 

} 

Your CIContext options are backwards. I'm guessing you meant to write 'CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];'


Yes, of course. I've corrected it in the post.

Answer


OK, I think I solved it myself. The culprit was this line:

[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ .... 

The global queue I was passing in is a concurrent queue, which allows a new callback to be invoked before the previous one has finished. The asset writer is not designed to be written to from more than one thread at a time.

Creating and using a new serial queue seems to solve the problem:

assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL); 

[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{...
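Put together, the writing loop on the serial queue looks like this (a minimal sketch; the render-and-append step in the middle stands in for the filtering code from the question):

    // A serial queue runs the callback blocks one at a time, so the writer 
    // is never fed from two threads at once. 
    assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL); 

    [writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{ 
     while ([writerVideoInput isReadyForMoreMediaData]) { 
      CMSampleBufferRef buffer = [readerVideoOutput copyNextSampleBuffer]; 
      if (!buffer) { 
       // Source exhausted: close the input and finalize the file. 
       [writerVideoInput markAsFinished]; 
       [writer finishWriting]; 
       break; 
      } 
      // ... render the filtered frame and append it via the 
      // pixel buffer adaptor, exactly as in the question ... 
      CFRelease(buffer); 
     } 
    }]; 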