iOS AVFoundation: Setting the Orientation of Video

I've been struggling with controlling the video orientation during and after capture on iOS devices. Thanks to previous answers and documentation from Apple, I've been able to figure it out. However, now that I want to push some video to a website, I'm running into a particular problem. I've outlined that problem in this question in particular, and the proposed solution turned out to require setting orientation options during video encoding.

That may be so, but I don't know how to go about it. The documentation on setting orientation is concerned with setting it correctly for display on the device, and I've implemented the advice found here. However, that advice doesn't address setting the orientation correctly for non-Apple software such as VLC or the Chrome browser.

Can anyone provide some insight into how to set the orientation correctly on the device so that it displays correctly in all viewing software?


The actual data always has a static orientation from the capture process. The orientation is stored in the 'preferredTransform' value. So, I suppose, you'd need to export the video in order to rotate the data. I'd look into 'AVAssetExportSession', 'AVMutableVideoComposition', and 'setTransform:atTime:'; they may help. – Davyd


I've filed a technical support incident with Apple to help resolve this. But I'll look into your suggestion. Does this mean a separate encoding step after recording the video, I wonder? That could be computationally expensive... –


Yes, it would be an extra step. But it may not be expensive if you export without changing the original encoding. Let me know if you find a better solution. – Davyd

Answers


In case anyone else is looking for this answer too, this is the method I cooked up (modified a bit for simplicity):

- (void)encodeVideoOrientation:(NSURL *)anOutputFileURL 
{ 
CGAffineTransform rotationTransform; 
CGAffineTransform rotateTranslate; 
CGSize renderSize; 

switch (self.recordingOrientation) 
{ 
    // Set rotationTransform, rotateTranslate, and renderSize based on the 
    // orientation. The cases below are an assumed sketch for a 640x480 
    // capture on the back camera (whose native buffers are landscape-right); 
    // adjust the constants to your session preset, and adjust the enum if 
    // recordingOrientation is not an AVCaptureVideoOrientation. 
    case AVCaptureVideoOrientationPortrait: 
        rotationTransform = CGAffineTransformMakeRotation(M_PI_2); 
        rotateTranslate = CGAffineTransformTranslate(rotationTransform, 0, -480); 
        renderSize = CGSizeMake(480, 640); 
        break; 
    case AVCaptureVideoOrientationPortraitUpsideDown: 
        rotationTransform = CGAffineTransformMakeRotation(-M_PI_2); 
        rotateTranslate = CGAffineTransformTranslate(rotationTransform, -640, 0); 
        renderSize = CGSizeMake(480, 640); 
        break; 
    case AVCaptureVideoOrientationLandscapeLeft: 
        rotationTransform = CGAffineTransformMakeRotation(M_PI); 
        rotateTranslate = CGAffineTransformTranslate(rotationTransform, -640, -480); 
        renderSize = CGSizeMake(640, 480); 
        break; 
    case AVCaptureVideoOrientationLandscapeRight: 
    default: 
        rotationTransform = CGAffineTransformIdentity; 
        rotateTranslate = CGAffineTransformIdentity; 
        renderSize = CGSizeMake(640, 480); 
        break; 
} 


AVURLAsset * videoAsset = [[AVURLAsset alloc]initWithURL:anOutputFileURL options:nil]; 

AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]; 

AVMutableComposition* composition = [AVMutableComposition composition]; 

AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) 
           ofTrack:sourceVideoTrack 
           atTime:kCMTimeZero error:nil]; 
[compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform]; 

AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio 
                      preferredTrackID:kCMPersistentTrackID_Invalid]; 
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) 
           ofTrack:sourceAudioTrack 
           atTime:kCMTimeZero error:nil]; 



AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack]; 
[layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero]; 

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition]; 
videoComposition.frameDuration = CMTimeMake(1,30); 
videoComposition.renderScale = 1.0; 
videoComposition.renderSize = renderSize; 
instruction.layerInstructions = [NSArray arrayWithObject: layerInstruction]; 
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration); 
videoComposition.instructions = [NSArray arrayWithObject: instruction]; 

AVAssetExportSession * assetExport = [[AVAssetExportSession alloc] initWithAsset:composition 
                     presetName:AVAssetExportPresetMediumQuality]; 

NSString* videoName = @"export.mov"; 
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName]; 

NSURL * exportUrl = [NSURL fileURLWithPath:exportPath]; 

if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) 
{ 
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil]; 
} 

assetExport.outputFileType = AVFileTypeMPEG4; 
assetExport.outputURL = exportUrl; 
assetExport.shouldOptimizeForNetworkUse = YES; 
assetExport.videoComposition = videoComposition; 

[assetExport exportAsynchronouslyWithCompletionHandler: 
^(void) { 
    switch (assetExport.status) 
    { 
     case AVAssetExportSessionStatusCompleted: 
      //    export complete 
      NSLog(@"Export Complete"); 
      break; 
     case AVAssetExportSessionStatusFailed: 
      NSLog(@"Export Failed"); 
      NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]); 
      //    export error (see exportSession.error) 
      break; 
     case AVAssetExportSessionStatusCancelled: 
      NSLog(@"Export Failed"); 
      NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]); 
      //    export cancelled 
      break; 
    } 
}]; 

} 

This stuff is poorly documented, unfortunately, but by stringing together examples from other SO questions and by reading the header files, I was able to get this working. Hope this helps anyone else!
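
For reference, here is a minimal sketch of how this method might be invoked once recording finishes, assuming the class records with an AVCaptureMovieFileOutput and acts as its AVCaptureFileOutputRecordingDelegate (the delegate method below is the standard callback; the wiring around it is an assumption):

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput 
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL 
      fromConnections:(NSArray *)connections 
        error:(NSError *)error 
{ 
    if (error) { 
     NSLog(@"Recording failed: %@", [error localizedDescription]); 
     return; 
    } 
    // Re-encode the finished movie so the rotation is baked into the 
    // pixel data instead of living only in the track's preferredTransform. 
    [self encodeVideoOrientation:outputFileURL]; 
}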


Use the method below to set the correct orientation on an AVMutableVideoComposition, based on the video asset's orientation:

-(AVMutableVideoComposition *) getVideoComposition:(AVAsset *)asset 
{ 
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
    AVMutableComposition *composition = [AVMutableComposition composition]; 
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition]; 
    CGSize videoSize = videoTrack.naturalSize; 
    BOOL isPortrait_ = [self isVideoPortrait:asset]; 
    if(isPortrait_) { 
     NSLog(@"video is portrait "); 
     videoSize = CGSizeMake(videoSize.height, videoSize.width); 
    } 
    composition.naturalSize  = videoSize; 
    videoComposition.renderSize = videoSize; 
    // videoComposition.renderSize = videoTrack.naturalSize; // 
    videoComposition.frameDuration = CMTimeMakeWithSeconds(1/videoTrack.nominalFrameRate, 600); 

    AVMutableCompositionTrack *compositionVideoTrack; 
    compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil]; 
    AVMutableVideoCompositionLayerInstruction *layerInst; 
    layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack]; 
    [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero]; 
    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration); 
    inst.layerInstructions = [NSArray arrayWithObject:layerInst]; 
    videoComposition.instructions = [NSArray arrayWithObject:inst]; 
    return videoComposition; 
} 


-(BOOL) isVideoPortrait:(AVAsset *)asset 
{ 
    BOOL isPortrait = FALSE; 
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo]; 
    if([tracks count] > 0) { 
    AVAssetTrack *videoTrack = [tracks objectAtIndex:0]; 

    CGAffineTransform t = videoTrack.preferredTransform; 
    // Portrait 
    if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) 
    { 
     isPortrait = YES; 
    } 
    // PortraitUpsideDown 
    if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) { 

     isPortrait = YES; 
    } 
    // LandscapeRight 
    if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) 
    { 
     isPortrait = FALSE; 
    } 
    // LandscapeLeft 
    if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) 
    { 
     isPortrait = FALSE; 
    } 
    } 
    return isPortrait; 
} 
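
A usage sketch for the method above, wiring it into an AVAssetExportSession (the export setup is my assumption, modeled on the first answer; sourceURL and outputURL are hypothetical):

AVAsset *asset = [AVAsset assetWithURL:sourceURL]; 
AVMutableVideoComposition *videoComposition = [self getVideoComposition:asset]; 

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset 
                     presetName:AVAssetExportPresetMediumQuality]; 
exporter.videoComposition = videoComposition; 
exporter.outputFileType = AVFileTypeMPEG4; 
exporter.outputURL = outputURL; // hypothetical destination 
[exporter exportAsynchronouslyWithCompletionHandler:^{ 
    NSLog(@"Export finished with status: %ld", (long)exporter.status); 
}];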

Since iOS 5 you can request rotated CVPixelBuffers using AVCaptureVideoDataOutput, as documented here. This gives you the correct orientation without having to process the video again with AVAssetExportSession.
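
In other words, something like this minimal sketch (assuming videoDataOutput is an AVCaptureVideoDataOutput already attached to the session; the support check matters because not every connection supports rotation):

AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo]; 
if ([connection isVideoOrientationSupported]) { 
    // Buffers delivered to captureOutput:didOutputSampleBuffer:fromConnection: 
    // now arrive physically rotated. 
    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait]; 
}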


In Apple's documentation here, it states:

Clients may now receive physically rotated CVPixelBuffers in their AVCaptureVideoDataOutput -captureOutput:didOutputSampleBuffer:fromConnection: delegate callback. In previous iOS versions, the front-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeLeft, and the back-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeRight. All 4 AVCaptureVideoOrientations are supported, and rotation is hardware accelerated. To request buffer rotation, a client calls -setVideoOrientation: on the AVCaptureVideoDataOutput's video AVCaptureConnection. Note that physically rotating buffers does come with a performance cost, so only request rotation if necessary. If, for instance, you want rotated video written to a QuickTime movie file using AVAssetWriter, it is preferable to set the -transform property on the AVAssetWriterInput rather than physically rotate the buffers in AVCaptureVideoDataOutput.

So the solution posted by Aaron Vegh using an AVAssetExportSession works, but it isn't needed. As the Apple docs say, if you want the orientation set properly so it plays in non-Apple QuickTime players such as VLC, or on the web via Chrome, you must set the video orientation on the AVCaptureConnection for the AVCaptureVideoDataOutput. If you try to set it on the AVAssetWriterInput instead, you will get an incorrect orientation in players like VLC and Chrome.

Here is my code, where I set it while setting up the capture session:

// DECLARED AS PROPERTIES ABOVE 
@property (strong,nonatomic) AVCaptureDeviceInput *audioIn; 
@property (strong,nonatomic) AVCaptureAudioDataOutput *audioOut; 
@property (strong,nonatomic) AVCaptureDeviceInput *videoIn; 
@property (strong,nonatomic) AVCaptureVideoDataOutput *videoOut; 
@property (strong,nonatomic) AVCaptureConnection *audioConnection; 
@property (strong,nonatomic) AVCaptureConnection *videoConnection; 
------------------------------------------------------------------ 
------------------------------------------------------------------ 

-(void)setupCaptureSession{ 
// Setup Session 
self.session = [[AVCaptureSession alloc]init]; 
[self.session setSessionPreset:AVCaptureSessionPreset640x480]; 

// Create Audio connection ---------------------------------------- 
self.audioIn = [[AVCaptureDeviceInput alloc]initWithDevice:[self getAudioDevice] error:nil]; 
if ([self.session canAddInput:self.audioIn]) { 
    [self.session addInput:self.audioIn]; 
} 

self.audioOut = [[AVCaptureAudioDataOutput alloc]init]; 
dispatch_queue_t audioCaptureQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL); 
[self.audioOut setSampleBufferDelegate:self queue:audioCaptureQueue]; 
if ([self.session canAddOutput:self.audioOut]) { 
    [self.session addOutput:self.audioOut]; 
} 
self.audioConnection = [self.audioOut connectionWithMediaType:AVMediaTypeAudio]; 

// Create Video connection ---------------------------------------- 
self.videoIn = [[AVCaptureDeviceInput alloc]initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil]; 
if ([self.session canAddInput:self.videoIn]) { 
    [self.session addInput:self.videoIn]; 
} 

self.videoOut = [[AVCaptureVideoDataOutput alloc]init]; 
[self.videoOut setAlwaysDiscardsLateVideoFrames:NO]; 
[self.videoOut setVideoSettings:nil]; 
dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL); 
[self.videoOut setSampleBufferDelegate:self queue:videoCaptureQueue]; 
if ([self.session canAddOutput:self.videoOut]) { 
    [self.session addOutput:self.videoOut]; 
} 

self.videoConnection = [self.videoOut connectionWithMediaType:AVMediaTypeVideo]; 
// SET THE ORIENTATION HERE ------------------------------------------------- 
[self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait]; 
// -------------------------------------------------------------------------- 

// Create Preview Layer ------------------------------------------- 
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:self.session]; 
CGRect bounds = self.videoView.bounds; 
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; 
previewLayer.bounds = bounds; 
previewLayer.position=CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds)); 
[self.videoView.layer addSublayer:previewLayer]; 

// Start session 
[self.session startRunning]; 

}
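
The getAudioDevice and videoDeviceWithPosition: helpers aren't shown above; a plausible implementation (my assumption, using the classic pre-iOS 10 device lookup APIs) might be:

- (AVCaptureDevice *)getAudioDevice 
{ 
    // Default microphone. 
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio]; 
} 

- (AVCaptureDevice *)videoDeviceWithPosition:(AVCaptureDevicePosition)position 
{ 
    // Return the first camera matching the requested position (front/back). 
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) { 
     if (device.position == position) { 
      return device; 
     } 
    } 
    return nil; 
}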


I would love to use this solution; it would solve a lot of problems. The documentation hints at a "performance cost". In your experience, is the cost significant? I'd be curious to know for the presets with the highest bit rates (for example, HD resolution at 120 frames per second on the iPhone 5s)? – otto


Finally, based on the answers from @Aaron Vegh and @Prince, I figured out my solution:

// Convert the video

+(void)convertMOVToMp4:(NSString *)movFilePath completion:(void (^)(NSString *mp4FilePath))block{ 


AVURLAsset * videoAsset = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:movFilePath] options:nil]; 

AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]; 

AVMutableComposition* composition = [AVMutableComposition composition]; 


AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio 
                      preferredTrackID:kCMPersistentTrackID_Invalid]; 
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) 
           ofTrack:sourceAudioTrack 
           atTime:kCMTimeZero error:nil]; 




AVAssetExportSession * assetExport = [[AVAssetExportSession alloc] initWithAsset:composition 
                     presetName:AVAssetExportPresetMediumQuality]; 


NSString *exportPath = [movFilePath stringByReplacingOccurrencesOfString:@".MOV" withString:@".mp4"]; // note: case-sensitive; assumes an uppercase ".MOV" extension 


NSURL * exportUrl = [NSURL fileURLWithPath:exportPath]; 


assetExport.outputFileType = AVFileTypeMPEG4; 
assetExport.outputURL = exportUrl; 
assetExport.shouldOptimizeForNetworkUse = YES; 
assetExport.videoComposition = [self getVideoComposition:videoAsset composition:composition]; 

[assetExport exportAsynchronouslyWithCompletionHandler: 
^(void) { 
    switch (assetExport.status) 
    { 
     case AVAssetExportSessionStatusCompleted: 
      // export complete 
      if (block) { 
       block(exportPath); 
      } 
      break; 
     case AVAssetExportSessionStatusFailed: 
      // export failed (see assetExport.error) 
      if (block) { 
       block(nil); 
      } 
      break; 
     case AVAssetExportSessionStatusCancelled: 
      // export cancelled 
      if (block) { 
       block(nil); 
      } 
      break; 
     default: 
      break; 
    } 
}]; 
} 

// Build a video composition for the current orientation

+(AVMutableVideoComposition *) getVideoComposition:(AVAsset *)asset composition:(AVMutableComposition*)composition{ 
    BOOL isPortrait_ = [self isVideoPortrait:asset]; 


    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 


    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil]; 

    AVMutableVideoCompositionLayerInstruction *layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack]; 

    CGAffineTransform transform = videoTrack.preferredTransform; 
    [layerInst setTransform:transform atTime:kCMTimeZero]; 


    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration); 
    inst.layerInstructions = [NSArray arrayWithObject:layerInst]; 


    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition]; 
    videoComposition.instructions = [NSArray arrayWithObject:inst]; 

    CGSize videoSize = videoTrack.naturalSize; 
    if(isPortrait_) { 
     NSLog(@"video is portrait "); 
     videoSize = CGSizeMake(videoSize.height, videoSize.width); 
    } 
    videoComposition.renderSize = videoSize; 
    videoComposition.frameDuration = CMTimeMake(1,30); 
    videoComposition.renderScale = 1.0; 
    return videoComposition; 
} 

// Determine whether the video is portrait

+(BOOL) isVideoPortrait:(AVAsset *)asset{ 
BOOL isPortrait = FALSE; 
NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo]; 
if([tracks count] > 0) { 
    AVAssetTrack *videoTrack = [tracks objectAtIndex:0]; 

    CGAffineTransform t = videoTrack.preferredTransform; 
    // Portrait 
    if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) 
    { 
     isPortrait = YES; 
    } 
    // PortraitUpsideDown 
    if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) { 

     isPortrait = YES; 
    } 
    // LandscapeRight 
    if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) 
    { 
     isPortrait = FALSE; 
    } 
    // LandscapeLeft 
    if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) 
    { 
     isPortrait = FALSE; 
    } 
} 
return isPortrait; 

}
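
A quick usage sketch (VideoConverter stands in for whatever class hosts these class methods, and the input path is hypothetical):

[VideoConverter convertMOVToMp4:@"/path/to/recording.MOV" completion:^(NSString *mp4FilePath) { 
    if (mp4FilePath) { 
     NSLog(@"Converted file written to %@", mp4FilePath); 
    } else { 
     NSLog(@"Conversion failed or was cancelled"); 
    } 
}];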


Hi @Jagie, thanks for the sample code. I got this working, except that when using the front-facing camera the exported video only shows half; it seems I need to translate it down along the y-axis. It works fine with the back camera. Have you had a similar problem with the front camera? – doorman


When I try your code, it gives me the error "The video could not be composed." –