2011-08-04

Problem creating a video file from a UIImage and a caf

I have read every article about this that I could find on the internet, and I have managed to create some video files, but I still have 3 problems that nobody seems to have mentioned.

My 3 problems:

  1. The video does not play properly in some players, e.g. QuickTime (Windows): only a single frame plays and then the screen turns white, and the video cannot be played on YouTube.

  2. Some of the images, for some reason, come out badly distorted:

    http://lh3.googleusercontent.com/-Jyz-L1k3MEk/TjpfSfKf8LI/AAAAAAAADBs/D1GYuEqI-Oo/h301/1.JPG (well, it says I am a new user and will not let me post images in the question.)

  3. Some images, for some reason, have the wrong orientation; even though I transform the context according to the orientation, it still does not work.

Can somebody please help me? Many thanks in advance!

Here is my code:

1. This function creates the video from a UIImage. I use only one image plus one audio file (caf), and I want the image to be shown while the audio plays.

- (void)writeImageAndAudioAsMovie:(UIImage*)image andAudio:(NSString *)audioFilePath duration:(int)duration { 
    NSLog(@"start make movie: length:%d",duration); 
    NSError *error = nil; 
    if ([[NSFileManager defaultManager] fileExistsAtPath:ImageVideoPath]) 
     [[NSFileManager defaultManager] removeItemAtPath:ImageVideoPath error:nil]; // remove any stale file first, or the writer cannot create the new one 

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:ImageVideoPath] fileType:AVFileTypeMPEG4 
                  error:&error]; 
    NSParameterAssert(videoWriter); 

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, 
           [NSNumber numberWithInt:image.size.width],AVVideoWidthKey,[NSNumber numberWithInt:image.size.height], AVVideoHeightKey,nil]; 
    AVAssetWriterInput* writerInput = [[AVAssetWriterInput 
            assetWriterInputWithMediaType:AVMediaTypeVideo 
            outputSettings:videoSettings] retain]; 

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil]; 
    NSParameterAssert(writerInput); 
    NSParameterAssert([videoWriter canAddInput:writerInput]); 
    writerInput.expectsMediaDataInRealTime = YES; 
    [videoWriter setShouldOptimizeForNetworkUse:YES]; 
    [videoWriter addInput:writerInput]; 

    //Start a session: 
    [videoWriter startWriting]; 
    [videoWriter startSessionAtSourceTime:kCMTimeZero]; 

    //Write samples: 
    CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage]; 
    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero]; 
    CVPixelBufferRelease(buffer); // this method owns the buffer; release it to avoid leaking 

    //Finish the session: 
    [videoWriter endSessionAtSourceTime:CMTimeMake(duration, 1)]; 
    [writerInput markAsFinished]; 
    [videoWriter finishWriting]; 

    // (Do not release adaptor.pixelBufferPool here; the adaptor owns its pool.) 
    [videoWriter release]; 
    [writerInput release]; 
    [self addAudioToFileAtPath:ImageVideoPath andAudioPath:audioFilePath]; 
} 

2. This function creates the CVPixelBufferRef for the video:
-(CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)cgimage { 
    float width = CGImageGetWidth(cgimage); 
    float height = CGImageGetHeight(cgimage); 

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: 
         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, 
         nil]; 
    CVPixelBufferRef pxbuffer = NULL; 
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width,height, kCVPixelFormatType_32ARGB,(CFDictionaryRef)options,&pxbuffer); 

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

    CVPixelBufferLockBaseAddress(pxbuffer, 0); 
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 

    NSParameterAssert(pxdata != NULL); 

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef context = CGBitmapContextCreate(pxdata,width,height,8,4*width,rgbColorSpace,kCGImageAlphaNoneSkipFirst); 

    NSParameterAssert(context); 
    CGContextDrawImage(context, CGRectMake(0, 0,width, height), cgimage); 

    CGColorSpaceRelease(rgbColorSpace); 
    CGContextRelease(context); 

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 

    return pxbuffer; 
} 

3. This function puts the video and the audio together:

-(void) addAudioToFileAtPath:(NSString *)videoPath andAudioPath:(NSString *)audioPath{ 
    AVMutableComposition* mixComposition = [AVMutableComposition composition]; 

    NSURL* audio_inputFileUrl = [NSURL fileURLWithPath:audioPath]; 
    NSURL* video_inputFileUrl = [NSURL fileURLWithPath:videoPath]; 

    NSString *outputFilePath = FinalVideoPath; 
    NSURL* outputFileUrl = [NSURL fileURLWithPath:outputFilePath]; 

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) 
     [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil]; 

    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil]; 
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration); 
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil]; 


    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil]; 
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration); 
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil]; 

    //nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration); 
    [audioAsset release];audioAsset = nil; 
    [videoAsset release];videoAsset = nil; 

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality]; 
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; 
    _assetExport.outputURL = outputFileUrl; 

    [_assetExport exportAsynchronouslyWithCompletionHandler: 
    ^(void) { 
     switch (_assetExport.status) 
     { 
      case AVAssetExportSessionStatusCompleted: 
       //export complete 
       NSLog(@"Export Complete"); 
       break; 
      case AVAssetExportSessionStatusFailed: 
       NSLog(@"Export Failed"); 
       NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]); 
      //export error (see exportSession.error) 
       break; 
      case AVAssetExportSessionStatusCancelled: 
       NSLog(@"Export Cancelled"); 
       NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]); 
       //export cancelled 
       break; 
     } 
     }];  
} 

I can help you with the orientation problem; see my answer at http://stackoverflow.com/questions/11414351/avmutable-composition-lost-orientation-when-adding-audio-to-a-video – Jad

Answer


Here is the checklist of what you need if you want to get it fixed:

1) The video cannot have an alpha channel, so your pixelBufferFromCGImage should look like this:

static OSType pixelFormatType = kCVPixelFormatType_32ARGB; 


- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image { 
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image)); 
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: 
          @YES, kCVPixelBufferCGImageCompatibilityKey, 
          @YES, kCVPixelBufferCGBitmapContextCompatibilityKey, 
          nil]; 
    CVPixelBufferRef pxbuffer = NULL; 
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, 
              frameSize.width, 
              frameSize.height, 
              pixelFormatType, 
              (__bridge CFDictionaryRef)options, 
              &pxbuffer); 
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

    CVPixelBufferLockBaseAddress(pxbuffer, 0); 
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipFirst & kCGBitmapAlphaInfoMask; 

    //NSUInteger bytesPerRow = 4 * frameSize.width; 
    NSUInteger bitsPerComponent = 8; 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pxbuffer); 

    CGContextRef context = CGBitmapContextCreate(pxdata, 
               frameSize.width, 
               frameSize.height, 
               bitsPerComponent, 
               bytesPerRow, 
               rgbColorSpace, 
               bitmapInfo); 

    CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image); 
    CGColorSpaceRelease(rgbColorSpace); 
    CGContextRelease(context); 

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 

    return pxbuffer; 
} 

2) Make sure you are testing on an actual device. The simulator tends to distort the video; I ran into exactly the same problem when producing videos in the simulator.

3) Make sure you create the AVAssetWriterInputPixelBufferAdaptor like this:

// imageSize here is the CGSize of the source image 
NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init]; 
[attributes setObject:[NSNumber numberWithUnsignedInt:pixelFormatType] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey]; 
[attributes setObject:[NSNumber numberWithUnsignedInt:imageSize.width] forKey:(NSString*)kCVPixelBufferWidthKey]; 
[attributes setObject:[NSNumber numberWithUnsignedInt:imageSize.height] forKey:(NSString*)kCVPixelBufferHeightKey]; 

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor 
                  assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput 
                  sourcePixelBufferAttributes:attributes]; 

I still had some other issues, but no distortion or orientation problems. You will need to rotate the image to the correct orientation yourself if you are not requesting the thumbnail image directly from the asset.


Man, you are a god. I only had the problem when the image was in portrait mode; landscape was fine. I found that the actual problem was in the alpha settings. Thanks. – khunshan