AVFoundation - why can't I get the video orientation right

I'm using an AVCaptureSession to capture video from the device camera, then using AVAssetWriterInput and AVAssetTrack to compress/resize the video before uploading it to a server. The final videos will be viewed on the web via an html5 video element.

I've run into multiple issues trying to get the orientation of the videos correct. My app only supports landscape orientation, and all captured videos should be in landscape. However, I want to allow the user to hold the device in either landscape orientation (i.e. home button on the left or on the right).

I'm able to make the video preview display in the correct orientation with the following line of code:

_previewLayer.connection.videoOrientation = UIDevice.currentDevice.orientation; 

The problems start when processing the video via AVAssetWriterInput and friends. The result doesn't seem to account for which landscape mode the video was captured in. IOW, sometimes the videos come out upside down. After some googling, I found lots of people suggesting that the following line of code would solve the problem:

writerInput.transform = videoTrack.preferredTransform; 

...but that didn't seem to work. After a bit of debugging, I found that videoTrack.preferredTransform is always the same value, regardless of the orientation the video was captured in.

Next, I tried manually tracking which orientation the video was captured in and setting writerInput.transform to CGAffineTransformMakeRotation(M_PI) as needed. Which solved the problem!!!

... sorta

When I view the results on the device, this solution works as expected. Videos are right-side up regardless of which direction the device was held while recording. Unfortunately, when I view the exact same videos in another browser (chrome on a mac book), they're all upside down!?!?!?
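
For reference, the manual workaround looked roughly like this (a sketch, not my exact code; capturedOrientation is a hypothetical variable I update when recording starts):

// Sketch: pick the writer input's transform from the manually-tracked
// orientation. The transform must be set before calling -startWriting.
if (capturedOrientation == UIInterfaceOrientationLandscapeLeft) {
    writerInput.transform = CGAffineTransformMakeRotation(M_PI); // rotate 180 degrees
} else {
    writerInput.transform = CGAffineTransformIdentity;
}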

What am I doing wrong?

EDIT

Here's some code, in case it's helpful...

-(void)compressFile:(NSURL*)inUrl; 
{     
    NSString* fileName = [@"compressed." stringByAppendingString:inUrl.lastPathComponent]; 
    NSError* error; 
    NSURL* outUrl = [PlatformHelper getFilePath:fileName error:&error]; 

    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:30] };

    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings };

    NSDictionary* videoOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };

    AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings]; 
    writerInput.expectsMediaDataInRealTime = YES; 

    AVAssetWriter* assetWriter = [AVAssetWriter assetWriterWithURL:outUrl fileType:AVFileTypeMPEG4 error:&error]; 
    assetWriter.shouldOptimizeForNetworkUse = YES; 

    [assetWriter addInput:writerInput]; 

    AVURLAsset* asset = [AVURLAsset URLAssetWithURL:inUrl options:nil]; 
    AVAssetTrack* videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

    // !!! this line does not work as expected and causes all sorts of issues (videos display sideways in some cases) !!! 
    //writerInput.transform = videoTrack.preferredTransform; 

    AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOptions]; 
    AVAssetReader* assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error]; 

    [assetReader addOutput:readerOutput]; 

    [assetWriter startWriting]; 
    [assetWriter startSessionAtSourceTime:kCMTimeZero]; 
    [assetReader startReading]; 

    [writerInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock: 
    ^{ 
     /* snip */ 
    }]; 
} 
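
The /* snip */ above is the standard reader-to-writer pump. For completeness, here's a minimal sketch of what goes in that block (the usual copy/append loop; error handling elided):

// Pull samples from the reader and append them to the writer until the
// source runs dry, then finish the writer.
while (writerInput.readyForMoreMediaData) {
    CMSampleBufferRef buffer = [readerOutput copyNextSampleBuffer];
    if (buffer) {
        [writerInput appendSampleBuffer:buffer];
        CFRelease(buffer);
    } else {
        [writerInput markAsFinished];
        [assetWriter finishWritingWithCompletionHandler:^{
            // notify completion / clean up here
        }];
        break;
    }
}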

Answers


The problem is that modifying the writerInput.transform property only adds a tag to the video file's metadata which instructs the video player to rotate the file during playback. That's why the videos play in the correct orientation on your device (I'd guess they also play correctly in the Quicktime player).

The pixel buffers captured by the camera are still laid out in the orientation in which they were captured. Many video players will not check for the preferred-orientation metadata tag and will just play the file in its native pixel orientation.
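
(As a debugging aid, you can see what that metadata tag actually says by decomposing the track's preferredTransform; a quick sketch:)

// For a pure rotation the transform's (a, b) components are (cos t, sin t),
// so the encoded rotation angle falls out of atan2.
CGAffineTransform t = videoTrack.preferredTransform;
CGFloat degrees = atan2(t.b, t.a) * 180.0 / M_PI;
NSLog(@"preferredTransform rotation: %.0f degrees", degrees);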

If you want the user to be able to record video holding the phone in either landscape mode, you'll need to correct this at the AVCaptureSession level, before compression, by performing a transform on the CVPixelBuffer of each video frame. This Apple Q&A covers it (look at the AVCaptureVideoOutput documentation as well): https://developer.apple.com/library/ios/qa/qa1744/_index.html

Investigating the link above is the correct way to solve your problem. An alternate fast n' dirty way to solve the same problem would be to lock the app's recording UI into one landscape orientation and then rotate all of the videos server-side with ffmpeg (something like ffmpeg -i in.mp4 -vf "hflip,vflip" -c:a copy out.mp4 applies a 180-degree flip).

Thank you so much. – herbrandson


In case it helps anyone, here's the code I ended up with. I ended up having to do the work on the video as it was being captured, instead of as a post-processing step. This is a helper class that manages the capture.

The interface:

#import <Foundation/Foundation.h> 
#import <AVFoundation/AVFoundation.h> 

@interface VideoCaptureManager : NSObject<AVCaptureVideoDataOutputSampleBufferDelegate> 
{ 
    AVCaptureSession* _captureSession; 
    AVCaptureVideoPreviewLayer* _previewLayer; 
    AVCaptureVideoDataOutput* _videoOut; 
    AVCaptureDevice* _videoDevice; 
    AVCaptureDeviceInput* _videoIn; 
    dispatch_queue_t _videoProcessingQueue; 

    AVAssetWriter* _assetWriter; 
    AVAssetWriterInput* _writerInput; 

    BOOL _isCapturing; 
    NSString* _gameId; 
    NSString* _authToken; 
} 

-(void)setSettings:(NSString*)gameId authToken:(NSString*)authToken; 
-(void)setOrientation:(AVCaptureVideoOrientation)orientation; 
-(AVCaptureVideoPreviewLayer*)getPreviewLayer; 
-(void)startPreview; 
-(void)stopPreview; 
-(void)startCapture; 
-(void)stopCapture; 

@end 

The implementation (w/ a bit edited out and a few small TODOs):

@implementation VideoCaptureManager 

-(id)init; 
{ 
    self = [super init]; 
    if (self) { 
     NSError* error; 

     _videoProcessingQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL); 
     _captureSession = [AVCaptureSession new]; 

     _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 

     _previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession]; 
     [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill]; 

     _videoOut = [AVCaptureVideoDataOutput new]; 
     _videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] }; 
     _videoOut.alwaysDiscardsLateVideoFrames = YES; 

     _videoIn = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error]; 
     // handle errors here 

     [_captureSession addInput:_videoIn]; 
     [_captureSession addOutput:_videoOut]; 
    } 

    return self; 
} 

-(void)setOrientation:(AVCaptureVideoOrientation)orientation; 
{ 
    _previewLayer.connection.videoOrientation = orientation; 
    for (AVCaptureConnection* item in _videoOut.connections) { 
     item.videoOrientation = orientation; 
    } 
} 
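
// Helper sketch (names are my own): one way to produce the
// AVCaptureVideoOrientation that -setOrientation: expects is to map from
// the current interface orientation explicitly, rather than relying on
// the raw values of the UIDeviceOrientation enum lining up.
static AVCaptureVideoOrientation VideoOrientationFromInterface(UIInterfaceOrientation o)
{
    switch (o) {
        case UIInterfaceOrientationLandscapeLeft:      return AVCaptureVideoOrientationLandscapeLeft;
        case UIInterfaceOrientationLandscapeRight:     return AVCaptureVideoOrientationLandscapeRight;
        case UIInterfaceOrientationPortraitUpsideDown: return AVCaptureVideoOrientationPortraitUpsideDown;
        default:                                       return AVCaptureVideoOrientationPortrait;
    }
}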

-(AVCaptureVideoPreviewLayer*)getPreviewLayer; 
{ 
    return _previewLayer; 
} 

-(void)startPreview; 
{ 
    [_captureSession startRunning]; 
} 

-(void)stopPreview; 
{ 
    [_captureSession stopRunning]; 
} 

-(void)startCapture; 
{ 
    if (_isCapturing) return; 

    NSURL* url = nil; // TODO: put code here to create your output url

    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:1] };

    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings };

    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings]; 
    _writerInput.expectsMediaDataInRealTime = YES; 

    NSError* error; 
    _assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:&error]; 
    // handle errors 

    _assetWriter.shouldOptimizeForNetworkUse = YES; 
    [_assetWriter addInput:_writerInput]; 
    [_videoOut setSampleBufferDelegate:self queue:_videoProcessingQueue]; 

    _isCapturing = YES; 
} 

-(void)stopCapture; 
{ 
    if (!_isCapturing) return; 

    [_videoOut setSampleBufferDelegate:nil queue:nil]; // TODO: there seems to be a race condition between this line and the next (we could end up trying to write a buffer after finishWriting has been called)

    dispatch_async(_videoProcessingQueue, ^{ 
     [_assetWriter finishWritingWithCompletionHandler:^{ 
      [self writingFinished]; 
     }]; 
    }); 
} 

-(void)writingFinished; 
{ 
    // TODO: need to check _assetWriter.status to make sure everything completed successfully 
    // do whatever post processing you need here 
} 


-(void)captureOutput:(AVCaptureOutput*)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection; 
{ 
    NSLog(@"Video frame was dropped."); 
} 

-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
    if(_assetWriter.status != AVAssetWriterStatusWriting) { 
     CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
     [_assetWriter startWriting]; // TODO: need to check the return value (a bool) 
     [_assetWriter startSessionAtSourceTime:lastSampleTime]; 
    } 

    if (!_writerInput.readyForMoreMediaData || ![_writerInput appendSampleBuffer:sampleBuffer]) { 
     NSLog(@"Failed to write video buffer to output."); 
    } 
} 

@end 
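
Usage looks something like this (a sketch; the settings values and the view-controller wiring are assumptions, not part of the class above):

VideoCaptureManager* manager = [VideoCaptureManager new];
[manager setSettings:@"game-id" authToken:@"auth-token"];

// show the preview
AVCaptureVideoPreviewLayer* preview = [manager getPreviewLayer];
preview.frame = self.view.bounds;
[self.view.layer addSublayer:preview];

// keep the capture connections in sync with the current UI orientation
[manager setOrientation:AVCaptureVideoOrientationLandscapeRight];

[manager startPreview];
[manager startCapture];
// ... later ...
[manager stopCapture];
[manager stopPreview];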

To compress/resize the video, we can use AVAssetExportSession.

  • We can upload a video with a duration of up to 3.30 minutes.
  • If the video duration is more than 3.30 minutes, it will show a memory warning.
  • Since we are not applying any transform to the video here, the video will stay as it was when it was recorded.
  • Below is sample code for compressing the video.
  • We can check the video size before compression and after compression.

-(void)trimVideoWithURL:(NSURL *)inputURL
{


NSString *path1 = [inputURL path]; 
NSData *data = [[NSFileManager defaultManager] contentsAtPath:path1]; 
NSLog(@"size before compress video is %lu",(unsigned long)data.length); 

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil]; 
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset640x480]; 

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); 
NSString *outputURL = paths[0]; 
NSFileManager *manager = [NSFileManager defaultManager]; 
[manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil]; 
outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"]; 
NSURL *fullPath = [NSURL fileURLWithPath:outputURL]; // declare locally; use fileURLWithPath: (not URLWithString:) for a filesystem path

// Remove Existing File 

[manager removeItemAtPath:outputURL error:nil]; 

exportSession.outputURL = [NSURL fileURLWithPath:outputURL]; 
exportSession.shouldOptimizeForNetworkUse = YES; 
exportSession.outputFileType = AVFileTypeQuickTimeMovie; 

// NOTE: as written this exports only a 1-second range starting at 1.0s
// (hence "trim"); remove the timeRange assignment to export the full video
CMTime start = CMTimeMakeWithSeconds(1.0, 600);
CMTime duration = CMTimeMakeWithSeconds(1.0, 600);
CMTimeRange range = CMTimeRangeMake(start, duration);
exportSession.timeRange = range;

[exportSession exportAsynchronouslyWithCompletionHandler:^(void) 
{ 
    switch (exportSession.status) { 

     case AVAssetExportSessionStatusCompleted:{ 

      NSString *path = [fullPath path]; 
      NSData *data = [[NSFileManager defaultManager] contentsAtPath:path]; 
      NSLog(@"size after compress video is %lu",(unsigned long)data.length); 
      NSLog(@"Export Complete %d %@", exportSession.status, exportSession.error); 
      /*
       Do your necessary stuff here after compression
       */

     } 
      break; 
     case AVAssetExportSessionStatusFailed: 
      NSLog(@"Failed:%@",exportSession.error); 
      break; 
     case AVAssetExportSessionStatusCancelled: 
      NSLog(@"Canceled:%@",exportSession.error); 
      break; 
     default: 
      break; 
    } 
    }];
}