2014-11-14

I am capturing video with UIImagePickerController, and I can crop the video to a square in iOS using the code below:

AVAsset *asset = [AVAsset assetWithURL:url]; 

//create an avassetrack with our asset 
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

//create a video composition and preset some settings 
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition]; 
videoComposition.frameDuration = CMTimeMake(1, 30); 
//here we are setting its render size to its height x height (Square) 

videoComposition.renderSize = CGSizeMake(clipVideoTrack.naturalSize.height, clipVideoTrack.naturalSize.height); 

//create a video instruction 
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30)); 

AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack]; 

//Here we shift the viewing square up to the TOP of the video so we only see the top 
CGAffineTransform t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, -20); 

//Use this code if you want the viewing square to be in the middle of the video 
//CGAffineTransform t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, -(clipVideoTrack.naturalSize.width - clipVideoTrack.naturalSize.height) /2); 

//Make sure the square is portrait 
CGAffineTransform t2 = CGAffineTransformRotate(t1, M_PI_2); 

CGAffineTransform finalTransform = t2; 
[transformer setTransform:finalTransform atTime:kCMTimeZero]; 

//add the transformer layer instructions, then add to video composition 
instruction.layerInstructions = [NSArray arrayWithObject:transformer]; 
videoComposition.instructions = [NSArray arrayWithObject: instruction]; 

//Create an Export Path to store the cropped video 
NSString *outputPath = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"video.mp4"]; 
NSURL *exportUrl = [NSURL fileURLWithPath:outputPath]; 

//Remove any previous video at that path 
[[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil]; 

//Export  
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetLowQuality] ; 
exporter.videoComposition = videoComposition; 
exporter.outputURL = exportUrl; 
exporter.outputFileType = AVFileTypeMPEG4; 

[exporter exportAsynchronouslyWithCompletionHandler:^ 
{ 
    dispatch_async(dispatch_get_main_queue(), ^{ 
     //Call when finished 
     [self exportDidFinish:exporter]; 
    }); 
}]; 

But I don't know how to fix the orientation problem. As in the Instagram and Vine apps, the output should be in portrait mode even when the video is captured in landscape, and it needs to be cropped to a square. I have been struggling with this — please suggest a solution.


Hey, I used the code in the answer and got my crop working, but I see a strange green outline around the bottom and the right side — have you had this problem? – iqueqiorio 2015-04-07 15:57:09

Answers


I think the source code comes from this link (the project code is included):

http://www.one-dreamer.com/cropping-video-square-like-vine-instagram-xcode/

First, you need to know the REAL orientation of the video:

- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset 
{ 
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
    CGSize size = [videoTrack naturalSize]; 
    CGAffineTransform txf = [videoTrack preferredTransform]; 

    if (size.width == txf.tx && size.height == txf.ty) 
     return UIImageOrientationLeft; //return UIInterfaceOrientationLandscapeLeft; 
    else if (txf.tx == 0 && txf.ty == 0) 
     return UIImageOrientationRight; //return UIInterfaceOrientationLandscapeRight; 
    else if (txf.tx == 0 && txf.ty == size.width) 
     return UIImageOrientationDown; //return UIInterfaceOrientationPortraitUpsideDown; 
    else 
     return UIImageOrientationUp; //return UIInterfaceOrientationPortrait; 
} 

I wrote the function so that it returns the correct orientation as if the video were an image.

Then I modified your function to fix the orientation, supporting any crop region rather than just a square, like this:

// apply the crop to passed video asset (set outputUrl to avoid the saving on disk). Return the exporter session object 
- (AVAssetExportSession*)applyCropToVideoWithAsset:(AVAsset*)asset AtRect:(CGRect)cropRect OnTimeRange:(CMTimeRange)cropTimeRange ExportToUrl:(NSURL*)outputUrl ExistingExportSession:(AVAssetExportSession*)exporter WithCompletion:(void(^)(BOOL success, NSError* error, NSURL* videoUrl))completion 
{ 

    //create an avassetrack with our asset 
    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

    //create a video composition and preset some settings 
    AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition]; 
    videoComposition.frameDuration = CMTimeMake(1, 30); 

    CGFloat cropOffX = cropRect.origin.x; 
    CGFloat cropOffY = cropRect.origin.y; 
    CGFloat cropWidth = cropRect.size.width; 
    CGFloat cropHeight = cropRect.size.height; 

    videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight); 

    //create a video instruction 
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
    instruction.timeRange = cropTimeRange; 

    AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack]; 

    UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset]; 

    CGAffineTransform t1 = CGAffineTransformIdentity; 
    CGAffineTransform t2 = CGAffineTransformIdentity; 

    switch (videoOrientation) { 
     case UIImageOrientationUp: 
      t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY); 
      t2 = CGAffineTransformRotate(t1, M_PI_2); 
      break; 
     case UIImageOrientationDown: 
      t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY); // note: naturalSize.width is the real height when the video is upside down 
      t2 = CGAffineTransformRotate(t1, - M_PI_2); 
      break; 
     case UIImageOrientationRight: 
      t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY); 
      t2 = CGAffineTransformRotate(t1, 0); 
      break; 
     case UIImageOrientationLeft: 
      t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY); 
      t2 = CGAffineTransformRotate(t1, M_PI ); 
      break; 
     default: 
      NSLog(@"no supported orientation has been found in this video"); 
      break; 
    } 

    CGAffineTransform finalTransform = t2; 
    [transformer setTransform:finalTransform atTime:kCMTimeZero]; 

    //add the transformer layer instructions, then add to video composition 
    instruction.layerInstructions = [NSArray arrayWithObject:transformer]; 
    videoComposition.instructions = [NSArray arrayWithObject: instruction]; 

    //Remove any previous video at that path 
    [[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil]; 

    if (!exporter){ 
     exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ; 
    } 

    // assign all instructions for the video processing (in this case, the transform that crops the video) 
    exporter.videoComposition = videoComposition; 
    //exporter.outputFileType = AVFileTypeQuickTimeMovie; 

    if (outputUrl){ 

     exporter.outputURL = outputUrl; 
     [exporter exportAsynchronouslyWithCompletionHandler:^{ 

      switch ([exporter status]) { 
       case AVAssetExportSessionStatusFailed: 
        NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]); 
        if (completion){ 
         dispatch_async(dispatch_get_main_queue(), ^{ 
          completion(NO,[exporter error],nil); 
         }); 
         return; 
        } 
        break; 
       case AVAssetExportSessionStatusCancelled: 
        NSLog(@"crop Export canceled"); 
        if (completion){ 
         dispatch_async(dispatch_get_main_queue(), ^{ 
          completion(NO,nil,nil); 
         }); 
         return; 
        } 
        break; 
       default: 
        break; 
      } 

      if (completion){ 
       dispatch_async(dispatch_get_main_queue(), ^{ 
        completion(YES,nil,outputUrl); 
       }); 
      } 

     }]; 
    } 

    return exporter; 
} 

Tested with every recorded video orientation (up, upside down, landscape right, landscape left), with both the rear and the front camera. I tested on an iPhone 5S (iOS 8.1) and an iPhone 6 Plus (iOS 8.1).

Hope it helps.


Thanks... it works fine. The only problem is that when I set cropOffY = 60 and record in landscape mode, the video is cropped incorrectly and a black area shows at the bottom of the video. Please let me know how to fix this. – Surfer 2014-11-17 06:13:47


I solved the above problem, but if I pick a video from the photo library, the cropped video looks zoomed in. Please let me know how to fix this. – Surfer 2014-11-17 06:42:42


That seems strange, because it doesn't apply any direct scaling to the video; it only defines the render area, and only rotates and translates for orientation. You could try forcing a 1:1 scale like this: t2 = CGAffineTransformScale(t1, 1, 1); then t2 = CGAffineTransformRotate(t2, M_PI_2); (for the first case in the switch block, and likewise for all the others), but I don't think it will change anything. It may be due to the fact that a video taken from the picker is compressed: if you take a video from the picker, the AVAsset (video.naturalSize) goes from 1920x1080 to 1280x720. – 2014-11-17 16:30:02


I know this question is old, but some people may still wonder why some camera-roll videos are zoomed in after cropping. I ran into this problem and realized it was because the cropRect I was using as a frame was not scaled for the video's different aspect ratios. To fix it, I simply added the code below, which crops the very top of the video into a square. If you want to change the position, just change the y value, but be sure to scale it relative to the video. Luca Iaco provided some great code to start from — I appreciate it!

CGSize videoSize = [[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize]; 
float scaleFactor; 

if (videoSize.width >= videoSize.height) {
    scaleFactor = videoSize.height/320; 
} 
else { 
    scaleFactor = videoSize.width/320; 
} 

CGFloat cropOffX = 0; 
CGFloat cropOffY = 0; 
CGFloat cropWidth = 320 *scaleFactor; 
CGFloat cropHeight = 320 *scaleFactor; 

This is my code for creating a Vine-style square video from a video on disk. It's written in Swift:

static let MaxDuration: CMTimeValue = 12 

class func compressVideoAsset(_ asset: AVAsset, output: URL, completion: @escaping (_ data: Data?) -> Void) 
{ 
    let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality)! 
    session.videoComposition = self.squareVideoCompositionForAsset(asset) 
    session.outputURL = output 
    session.outputFileType = AVFileTypeMPEG4 
    session.shouldOptimizeForNetworkUse = true 
    session.canPerformMultiplePassesOverSourceMediaData = true 

    let duration = CMTimeValue(CGFloat(asset.duration.value)/CGFloat(asset.duration.timescale) * 30) 
    session.timeRange = CMTimeRange(start: kCMTimeZero, duration: CMTime(value: min(duration, VideoCompressor.MaxDuration * 30), timescale: 30)) 

    session.exportAsynchronously(completionHandler: {() -> Void in 
     let data = try? Data(contentsOf: output) 

     DispatchQueue.main.async(execute: {() -> Void in 
      completion(data) 
     }) 
    }) 
} 

private class func squareVideoCompositionForAsset(_ asset: AVAsset) -> AVVideoComposition 
{ 
    let track = asset.tracks(withMediaType: AVMediaTypeVideo)[0] 
    let length = min(track.naturalSize.width, track.naturalSize.height) 

    var transform = track.preferredTransform 

    let size = track.naturalSize 
    let scale: CGFloat = (transform.a == -1 && transform.b == 0 && transform.c == 0 && transform.d == -1) ? -1 : 1 // check for inversion 

    transform = transform.translatedBy(x: scale * -(size.width - length)/2, y: scale * -(size.height - length)/2) 

    let transformer = AVMutableVideoCompositionLayerInstruction(assetTrack: track) 
    transformer.setTransform(transform, at: kCMTimeZero) 

    let instruction = AVMutableVideoCompositionInstruction() 
    instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: kCMTimePositiveInfinity) 
    instruction.layerInstructions = [transformer] 

    let composition = AVMutableVideoComposition() 
    composition.frameDuration = CMTime(value: 1, timescale: 30) 
    composition.renderSize = CGSize(width: length, height: length) 
    composition.instructions = [instruction] 

    return composition 
} 

Please take a look at [my question](http://stackoverflow.com/questions/43558021/swift-square-video-composition) – 2017-04-22 11:58:12


Hi pigeon_39, I edited my code above. I believe I was running into the same rotation issue, and this updated logic should fix it. – 2017-04-22 19:18:40


Thank you. It works perfectly for landscape videos, but it doesn't work for portrait videos. Could you help me a bit more? Here is my [updated question](http://stackoverflow.com/questions/43558021/swift-square-video-composition) – 2017-04-23 08:17:41