
I am building an app in which I want to capture a Live Photo and show it to the user for review. I capture the Live Photo with the code below, and once the capture finishes I receive the output URL. Now I want to display that Live Photo in a PHLivePhotoView; how can I do that with the output URL? I also receive the Live Photo data, see the delegate methods below. How do I preview the Live Photo?

- (void)viewDidLoad { 
    [super viewDidLoad]; 
    //Capture Session 
    AVCaptureSession *session = [[AVCaptureSession alloc]init]; 
    session.sessionPreset = AVCaptureSessionPresetPhoto; 

    //Add device 
    AVCaptureDevice *device = 
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 

    //Input 
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil]; 

    if (!input)
    {
     NSLog(@"No Input");
     return;
    }

    [session addInput:input];

    //Output
    AVCapturePhotoOutput *output = [[AVCapturePhotoOutput alloc] init];
    [session addOutput:output];
    //Enable Live Photo capture only if the output, now attached to the session, supports it
    if (output.isLivePhotoCaptureSupported)
    {
     output.livePhotoCaptureEnabled = YES;
    }
    output.highResolutionCaptureEnabled = YES;
    //Preview Layer 
    previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; 
    UIView *myView = self.previewView; 
    previewLayer.frame = myView.bounds; 
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; 
    [self.previewView.layer addSublayer:previewLayer]; 

    //Start capture session
    [session startRunning];

    //Keep a reference to the photo output for -captureImage:
    captureOutput = output;

    // Do any additional setup after loading the view, typically from a nib. 
} 

- (IBAction)captureImage:(id)sender 
{ 
    AVCapturePhotoSettings * settings = [AVCapturePhotoSettings photoSettings]; 
    settings.highResolutionPhotoEnabled = YES; 
    settings.flashMode = AVCaptureFlashModeOn; 
    NSString *livePhotoMovieFileName = [NSUUID UUID].UUIDString; 
    NSString *livePhotoMovieFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[livePhotoMovieFileName stringByAppendingPathExtension:@"mov"]]; 
    settings.livePhotoMovieFileURL = [NSURL fileURLWithPath:livePhotoMovieFilePath]; 
    [captureOutput capturePhotoWithSettings:settings delegate:self]; 
} 

- (void)captureOutput:(AVCapturePhotoOutput *)captureOutput didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL duration:(CMTime)duration photoDisplayTime:(CMTime)photoDisplayTime resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings error:(nullable NSError *)error 
    { 
     NSLog(@"%@",outputFileURL); 
    } 

- (void)captureOutput:(AVCapturePhotoOutput *)captureOutput didFinishProcessingPhotoSampleBuffer:(nullable CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(nullable CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(nullable AVCaptureBracketedStillImageSettings *)bracketSettings error:(nullable NSError *)error 
{ 
    //The still image component of the Live Photo, as JPEG data
    photoData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    NSLog(@"%@",photoData); 
} 

Related: [Create an Apple Live Photo from a JPEG and a MOV](https://github.com/mzp/LoveLiver) – tomfriwel

Answer


You can use

PHLivePhoto.requestLivePhotoWithResourceFileURLs

or you can use a WebView too.
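
For reference, here is a minimal sketch of how that call could feed a PHLivePhotoView, assuming stillImageURL points to a JPEG written out from the captured photoData, movieURL is the URL delivered in -captureOutput:didFinishProcessingLivePhotoToMovieFileAtURL:..., and self.livePhotoView is a PHLivePhotoView (from PhotosUI) already laid out in the view hierarchy; the method name and those properties are assumptions, not part of the original answer:

#import <Photos/Photos.h>
#import <PhotosUI/PhotosUI.h>

- (void)showLivePhotoWithImageURL:(NSURL *)stillImageURL movieURL:(NSURL *)movieURL
{
    //Build a PHLivePhoto from the paired still image and movie files
    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[stillImageURL, movieURL]
                                     placeholderImage:nil
                                           targetSize:self.livePhotoView.bounds.size
                                          contentMode:PHImageContentModeAspectFill
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
     if (livePhoto)
     {
      //Hand the assembled Live Photo to the view and start playback so the user can review it
      self.livePhotoView.livePhoto = livePhoto;
      [self.livePhotoView startPlaybackWithStyle:PHLivePhotoViewPlaybackStyleFull];
     }
    }];
}

Note that the result handler can be called more than once (a lower-quality version may arrive before the full-quality Live Photo), so simply assigning the latest livePhoto each time is fine for a review screen.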


Two URLs need to be passed as parameters to this method; where do we get them? –


Check this URL, I found it: https://milen.me/writings/live-photo-ios-api-overview/ – vaibby


That is helpful when we try to save a Live Photo, but to play the Live Photo in a PHLivePhotoView we need to pass two URLs as parameters. –
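
One possible way to obtain both URLs from the question's own code is to write the photoData received in -captureOutput:didFinishProcessingPhotoSampleBuffer:... to a temporary JPEG file and pair it with the outputFileURL from the movie callback. A sketch, with a hypothetical helper name:

- (NSURL *)writeStillImageToTemporaryFile
{
    //photoData is the JPEG data captured in the photo sample buffer delegate callback
    NSString *fileName = [[NSUUID UUID].UUIDString stringByAppendingPathExtension:@"jpg"];
    NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
    NSURL *fileURL = [NSURL fileURLWithPath:filePath];
    [photoData writeToURL:fileURL atomically:YES];
    return fileURL;
}

The photo sample buffer is typically delivered before the movie finishes writing, so calling the preview code from -captureOutput:didFinishProcessingLivePhotoToMovieFileAtURL:... with [self writeStillImageToTemporaryFile] and outputFileURL should supply the two URLs mentioned above.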