2015-04-04 64 views
5

Here is the link to the GitHub project: https://github.com/spennyf/cropVid/tree/master. Try it yourself to see what I mean; it takes about a minute to test. Thanks! Cropped area not the same as the selected area in iOS?

I am filming a video with a square overlay that shows which part of the vid will be cropped, like this:

[screenshot: camera preview with the white square overlay]

Now I held up a piece of paper with 4 lines inside the square, with about half a line of difference at the top and bottom. Then I cropped the video using the code I will post below, but when I display the result I see this (ignore the background and the green circle):

[screenshot: the cropped video, showing more than four lines]

As you can see, there are more than four lines, so even though I set it to crop a certain part, it includes more, and that is despite using the exact same rectangle shown in the camera and the same rectangle used for cropping.

So my question is: why is the crop not the same size?

Here is how I make the overlay and display the result:

//this is the square on the camera
UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height - 80)];
UIImageView *image = [[UIImageView alloc] init];
image.layer.borderColor = [[UIColor whiteColor] CGColor];
image.frame = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
CALayer *imageLayer = image.layer;
[imageLayer setBorderWidth:1];
[view addSubview:image];
[picker setCameraOverlayView:view];

//this is the crop rect (same frame as the overlay square)
CGRect rect = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
//note: pass asset.duration directly; asset.duration.value is in timescale units, not seconds
[self applyCropToVideoWithAsset:asset AtRect:rect OnTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
        ExportToUrl:exportUrl ExistingExportSession:exporter WithCompletion:^(BOOL success, NSError *error, NSURL *videoUrl) {
    //here is the player
    AVPlayer *player = [AVPlayer playerWithURL:videoUrl];

    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
    //the layer must be added to the hierarchy and the player started
    [self.view.layer addSublayer:layer];
    [player play];
}];

Here is the code I use for the crop:

- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize size = [videoTrack naturalSize];
    CGAffineTransform txf = [videoTrack preferredTransform];

    if (size.width == txf.tx && size.height == txf.ty)
        return UIImageOrientationLeft;  //UIInterfaceOrientationLandscapeLeft
    else if (txf.tx == 0 && txf.ty == 0)
        return UIImageOrientationRight; //UIInterfaceOrientationLandscapeRight
    else if (txf.tx == 0 && txf.ty == size.width)
        return UIImageOrientationDown;  //UIInterfaceOrientationPortraitUpsideDown
    else
        return UIImageOrientationUp;    //UIInterfaceOrientationPortrait
}

Here is the rest of the cropping code:

- (AVAssetExportSession *)applyCropToVideoWithAsset:(AVAsset *)asset AtRect:(CGRect)cropRect OnTimeRange:(CMTimeRange)cropTimeRange ExportToUrl:(NSURL *)outputUrl ExistingExportSession:(AVAssetExportSession *)exporter WithCompletion:(void (^)(BOOL success, NSError *error, NSURL *videoUrl))completion
{
    //create an AVAssetTrack with our asset
    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    //create a video composition and preset some settings
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);

    CGFloat cropOffX = cropRect.origin.x;
    CGFloat cropOffY = cropRect.origin.y;
    CGFloat cropWidth = cropRect.size.width;
    CGFloat cropHeight = cropRect.size.height;

    videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);

    //create a video instruction
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = cropTimeRange;

    AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

    UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset];

    CGAffineTransform t1 = CGAffineTransformIdentity;
    CGAffineTransform t2 = CGAffineTransformIdentity;

    switch (videoOrientation) {
        case UIImageOrientationUp:
            t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY);
            t2 = CGAffineTransformRotate(t1, M_PI_2);
            break;
        case UIImageOrientationDown:
            t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY); //note: width is the real height when upside down
            t2 = CGAffineTransformRotate(t1, -M_PI_2);
            break;
        case UIImageOrientationRight:
            t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY);
            t2 = CGAffineTransformRotate(t1, 0);
            break;
        case UIImageOrientationLeft:
            t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY);
            t2 = CGAffineTransformRotate(t1, M_PI);
            break;
        default:
            NSLog(@"no supported orientation has been found in this video");
            break;
    }

    CGAffineTransform finalTransform = t2;
    [transformer setTransform:finalTransform atTime:kCMTimeZero];

    //add the transformer layer instruction, then add it to the video composition
    instruction.layerInstructions = [NSArray arrayWithObject:transformer];
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    //remove any previous video at that path
    [[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil];

    if (!exporter) {
        exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
    }
    //assign all instructions for the video processing (in this case, the transformation for cropping the video)
    exporter.videoComposition = videoComposition;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    if (outputUrl) {
        exporter.outputURL = outputUrl;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]);
                    if (completion) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            completion(NO, [exporter error], nil);
                        });
                        return;
                    }
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"crop Export canceled");
                    if (completion) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            completion(NO, nil, nil);
                        });
                        return;
                    }
                    break;
                default:
                    break;
            }
            if (completion) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    completion(YES, nil, outputUrl);
                });
            }
        }];
    }

    return exporter;
}

So my question is: why is the video area different from the crop/camera area when I use the exact same coordinates and the same square?

+0

To be sure, once the cropped video has been produced (so in the completion block), it should be saved on the iPhone's disk. Please check that file directly, I mean access the file itself (connect the iPhone to a Mac and use a tool such as iExplorer or iFunBox), then copy it to the Mac and open it with the default QuickTime player. This way you can make sure the resulting cropped video matches exactly what you see in the square. Also, make sure the crop rect uses the proper coordinates relative to the view, for both the x and y axes – 2015-04-09 08:28:14

+0

@LucaIaco OK, I used iExplorer to copy the video to my Mac and played it in QuickTime, and the cropped area is still incorrect. I have checked the coordinates again and again and I believe they are correct. I will post a GitHub project at the link, so if you don't mind you can download it, run it, and see for yourself. Right now I film a video of a green square and crop just that part of the square, but when I crop I see white. I would really appreciate it if you looked at the project – iqueqiorio 2015-04-09 18:01:06

+0

Here is the correct link https://github.com/spennyf/cropVid – iqueqiorio 2015-04-10 04:44:28

Answers

-2

Maybe Check This Previous Question

It looks like it may be similar to what you are experiencing. A user on that question suggested cropping this way:

CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], cropRect); 
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef]; 
CGImageRelease(imageRef); 

I hope this helps, or at least gets you started in the right direction.

+0

This answer is completely unrelated. The question is about cropping a video, not an image. – bgfriend0 2015-04-15 21:09:18

+0

Sorry about that, I was up way too late and definitely misread the question! Thanks for the heads up. – Kleigh 2015-04-16 01:35:27

+0

Haha, np, happens to all of us. – bgfriend0 2015-04-16 01:35:56