AVAssetWriter Woes
2011-05-31

I am trying to use AVAssetWriter to write CGImages to a file, to create a video from images.

I have gotten this working successfully in three different ways on the simulator, but every method fails on an iPhone 4 running iOS 4.3.

This all has to do with pixel buffers.

My first approach was to create pixel buffers as needed, without using a pool. That works, but it is too memory-intensive to work on the device.

My second approach was to use the recommended AVAssetWriterInputPixelBufferAdaptor and then pull pixel buffers from the adaptor's pixelBufferPool with CVPixelBufferPoolCreatePixelBuffer.

That also works on the simulator, but fails on the device because the adaptor's pixel buffer pool is never allocated. I get no error messages.

Finally, I tried creating my own pixel buffer pool with CVPixelBufferPoolCreate. That also works in the simulator, but on the device everything works fine until I try to append the pixel buffer with appendPixelBuffer, which fails every time.

I have found very little information on this on the web. I based my code on the examples I could find, but no luck for days now. If anyone has experience doing this successfully with AVAssetWriter, please take a look and let me know if you see anything out of place.

Note: you will see commented-out blocks of attempts.

First, the setup:

- (BOOL) openVideoFile: (NSString *) path withSize:(CGSize)imageSize { 
size = CGSizeMake (480.0, 320.0);//imageSize; 

NSError *error = nil; 
videoWriter = [[AVAssetWriter alloc] initWithURL: 
           [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie 
                  error:&error]; 
if (error != nil) 
    return NO; 

NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
              [NSNumber numberWithDouble:size.width], AVVideoCleanApertureWidthKey, 
              [NSNumber numberWithDouble:size.height], AVVideoCleanApertureHeightKey, 
              [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey, 
              [NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey, 
              nil]; 


NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
              [NSNumber numberWithInt:1], AVVideoPixelAspectRatioHorizontalSpacingKey, 
              [NSNumber numberWithInt:1],AVVideoPixelAspectRatioVerticalSpacingKey, 
              nil]; 



NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
           //[NSNumber numberWithInt:960000], AVVideoAverageBitRateKey, 
           // [NSNumber numberWithInt:1],AVVideoMaxKeyFrameIntervalKey, 
           videoCleanApertureSettings, AVVideoCleanApertureKey, 
           videoAspectRatioSettings, AVVideoPixelAspectRatioKey, 
           //AVVideoProfileLevelH264Main31, AVVideoProfileLevelKey, 
           nil]; 

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
           AVVideoCodecH264, AVVideoCodecKey, 
           codecSettings,AVVideoCompressionPropertiesKey, 
           [NSNumber numberWithDouble:size.width], AVVideoWidthKey, 
           [NSNumber numberWithDouble:size.height], AVVideoHeightKey, 
           nil]; 
writerInput = [[AVAssetWriterInput 
            assetWriterInputWithMediaType:AVMediaTypeVideo 
            outputSettings:videoSettings] retain]; 
NSMutableDictionary * bufferAttributes = [[NSMutableDictionary alloc] init]; 
[bufferAttributes setObject: [NSNumber numberWithInt: kCVPixelFormatType_32ARGB] 
        forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey]; 
[bufferAttributes setObject: [NSNumber numberWithInt: 480] 
        forKey: (NSString *) kCVPixelBufferWidthKey]; 
[bufferAttributes setObject: [NSNumber numberWithInt: 320] 
        forKey: (NSString *) kCVPixelBufferHeightKey]; 


//NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil]; 
//[bufferAttributes setObject: [NSNumber numberWithInt: 640] 
//     forKey: (NSString *) kCVPixelBufferWidthKey]; 
//[bufferAttributes setObject: [NSNumber numberWithInt: 480] 
//     forKey: (NSString *) kCVPixelBufferHeightKey]; 
adaptor = [[AVAssetWriterInputPixelBufferAdaptor 
      assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput 
      sourcePixelBufferAttributes:nil] retain]; 

//CVPixelBufferPoolCreate (kCFAllocatorSystemDefault,NULL,(CFDictionaryRef)bufferAttributes,&pixelBufferPool); 
//Create buffer pool 
NSMutableDictionary*  attributes; 
attributes = [NSMutableDictionary dictionary]; 

int width = 480; 
int height = 320; 

[attributes setObject:[NSNumber numberWithInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey]; 
[attributes setObject:[NSNumber numberWithInt:width] forKey: (NSString*)kCVPixelBufferWidthKey]; 
[attributes setObject:[NSNumber numberWithInt:height] forKey: (NSString*)kCVPixelBufferHeightKey]; 
CVReturn theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (CFDictionaryRef) attributes, &pixelBufferPool);           


NSParameterAssert(writerInput); 
NSParameterAssert([videoWriter canAddInput:writerInput]); 
[videoWriter addInput:writerInput]; 

writerInput.expectsMediaDataInRealTime = YES; 

//Start a session: 
[videoWriter startWriting]; 
[videoWriter startSessionAtSourceTime:kCMTimeZero]; 

buffer = NULL; 
lastTime = kCMTimeZero; 
presentTime = kCMTimeZero; 

return YES; 
} 

Next, the two methods for appending to the writer and creating the pixel buffer to append:

- (void) writeImageToMovie:(CGImageRef)image 
{ 
    if([writerInput isReadyForMoreMediaData]) 
    { 
//   CMTime frameTime = CMTimeMake(1, 20); 
//   CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 24 of the loop above 
//   CMTime presentTime=CMTimeAdd(lastTime, frameTime); 

     buffer = [self pixelBufferFromCGImage:image]; 
     BOOL success = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime]; 
     if (!success) NSLog(@"Failed to appendPixelBuffer"); 
     CVPixelBufferRelease(buffer); 

     presentTime = CMTimeAdd(lastTime, CMTimeMake(5, 1000)); 
     lastTime = presentTime; 
    } 
    else 
    { 
     NSLog(@"error - writerInput not ready"); 
    } 
} 

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image 
{ 
CVPixelBufferRef pxbuffer; 
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: 
         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, 
         nil]; 
if (pixelBufferPool == NULL) NSLog(@"pixelBufferPool is null!"); 
CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, pixelBufferPool, &pxbuffer); 
/*if (pxbuffer == NULL) { 
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, 
             size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, 
             &pxbuffer); 

}*/ 
//NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 


CVPixelBufferLockBaseAddress(pxbuffer, 0); 
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 
//NSParameterAssert(pxdata != NULL); 

CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, 
              size.height, 8, 4*size.width, rgbColorSpace, 
              kCGImageAlphaNoneSkipFirst); 
//NSParameterAssert(context); 
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0)); 
CGContextDrawImage(context, CGRectMake(90, 10, CGImageGetWidth(image), 
             CGImageGetHeight(image)), image); 
CGColorSpaceRelease(rgbColorSpace); 
CGContextRelease(context); 

CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 

return pxbuffer; 
} 

Wow, this is disappointing. No one has a clue? – 2011-06-02 22:25:47

Answers


I found the solution to this problem.

If you want AVAudioPlayer and AVAssetWriter to behave correctly together, you must have an audio session category that is "mixable".

You can use a category that is mixable, like AVAudioSessionCategoryAmbient.
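For illustration only, selecting a mixable category through the AVAudioSession Objective-C API might look like the sketch below (it assumes AVFoundation is imported; this cannot be verified off-device):

```objectivec
// #import <AVFoundation/AVFoundation.h>
// Sketch: pick a mixable category before starting the AVAssetWriter.
// AVAudioSessionCategoryAmbient is mixable by default.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryAmbient error:&sessionError]) {
    NSLog(@"Failed to set audio session category: %@", sessionError);
}
```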

However, I needed to use AVAudioSessionCategoryPlayAndRecord.

You can make whatever category you set mixable by implementing this:

OSStatus propertySetError = 0; 

UInt32 allowMixing = true; 

propertySetError = AudioSessionSetProperty (
         kAudioSessionProperty_OverrideCategoryMixWithOthers, // 1 
         sizeof (allowMixing),         // 2 
         &allowMixing           // 3 
        ); 

I don't quite understand how this solution relates to your original code. – ninjudd 2015-04-05 05:30:40


Oh, first you need to pass some bufferAttributes when creating the adaptor object:

NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil]; 

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor 
               assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_videoWriterInput 
               sourcePixelBufferAttributes:bufferAttributes]; 

Then remove that call to CVPixelBufferPoolCreate; a pixel buffer pool has already been created in the adaptor object, so just call this instead:

CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// ...fill the pixel buffer here

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

CMTime frameTime = CMTimeMake(frameCount, (int32_t) 30);

BOOL res = [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
CVPixelBufferRelease(pixelBuffer);

I think that should do it. I had a similar error at some point as well, and I solved it by creating the adaptor and pixel buffer as shown.


In addition to the suggestions above, you should also test that the pixel buffer pool was successfully created after the call to startSessionAtSourceTime in your code. Your code should check "if (adaptor.pixelBufferPool == nil)" and handle that failure case before entering the frame-encoding loop. – MoDJ 2012-07-21 22:17:34
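The check this comment describes might be sketched as follows, assuming the videoWriter and adaptor instance variables from the question (the adaptor's pool only becomes available once writing has started, and this can only be exercised on a device):

```objectivec
// Sketch: verify the adaptor's lazily created pool before encoding frames.
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

// If the pool is still nil here, appendPixelBuffer: will fail on the
// device even though the same code may appear to work in the simulator.
if (adaptor.pixelBufferPool == nil) {
    NSLog(@"pixelBufferPool was not created - cannot encode frames");
    return NO;
}
```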