
How can I send video captured from the iPhone camera to a server for live streaming?

I already have some code, found online, that captures video from the iPhone camera and then stores it to a movie file, and it works fine. But my purpose is not to save it locally; I want to send it to a server. I have found that there is a free media server called Wowza that allows streaming, and Apple also has HLS (HTTP Live Streaming); the server expects the video in h.264 format and the audio in mp3. From reading some of the Apple HLS documentation, I also learned that it provides a different URL in a playlist file for each segment of the media file, and these are then played on the device in the correct order through a browser. I am not sure how to get small segments of the file recorded by the phone's camera, or how to convert them into the required format.
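For context, the playlist the server serves is just a text file (.m3u8) listing the segment URLs in playback order. A minimal example, with made-up segment names, looks like this:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts

(For a live stream the playlist has no #EXT-X-ENDLIST tag, and the server keeps appending new segments as they are encoded.)

Below is the code used to capture the video: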

Implementation file

#import "THCaptureViewController.h" 
#import <AVFoundation/AVFoundation.h> 
#import "THPlayerViewController.h" 

#define VIDEO_FILE @"test.mov" 

@interface THCaptureViewController () <AVCaptureFileOutputRecordingDelegate>
@property (nonatomic, strong) AVCaptureSession *captureSession; 
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureOutput; 
@property (nonatomic, weak) AVCaptureDeviceInput *activeVideoInput; 
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer; 
@end 

@implementation THCaptureViewController 

- (void)viewDidLoad
{
    [super viewDidLoad];

#if TARGET_IPHONE_SIMULATOR
    self.simulatorView.hidden = NO;
    [self.view bringSubviewToFront:self.simulatorView];
#else
    self.simulatorView.hidden = YES;
    [self.view sendSubviewToBack:self.simulatorView];
#endif

    // Hide the toggle button if the device has fewer than 2 cameras.
    self.toggleCameraButton.hidden =
        [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] < 2;

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self setUpCaptureSession];
    });
}

#pragma mark - Configure Capture Session 

- (void)setUpCaptureSession
{
    self.captureSession = [[AVCaptureSession alloc] init];

    NSError *error;

    // Set up hardware devices
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (input) {
            [self.captureSession addInput:input];
            self.activeVideoInput = input;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (audioDevice) {
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (audioInput) {
            [self.captureSession addInput:audioInput];
        }
    }

    // Create a VideoDataOutput and add it to the session.
    // NB: no sampleBufferDelegate is ever set on this output, so it never
    // delivers frames, and on iOS a session generally cannot drive an
    // AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput at the same time.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [self.captureSession addOutput:output];

    // Set up the still image file output
    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];

    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }

    // Start running the session so the preview is available
    [self.captureSession startRunning];

    // Set up the preview layer
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.previewLayer.frame = self.previewView.bounds;
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

        [[self.previewLayer connection] setVideoOrientation:[self currentVideoOrientation]];
        [self.previewView.layer addSublayer:self.previewLayer];
    });
}

#pragma mark - Start Recording 

- (IBAction)startRecording:(id)sender
{
    if ([sender isSelected]) {
        [sender setSelected:NO];
        [self.captureOutput stopRecording];
    } else {
        [sender setSelected:YES];

        if (!self.captureOutput) {
            self.captureOutput = [[AVCaptureMovieFileOutput alloc] init];
            [self.captureSession addOutput:self.captureOutput];
        }

        // Delete the old movie file if it exists
        //[[NSFileManager defaultManager] removeItemAtURL:[self outputURL] error:nil];

        // The session was already started in setUpCaptureSession; calling
        // startRunning again is a harmless no-op.
        [self.captureSession startRunning];

        AVCaptureConnection *videoConnection =
            [self connectionWithMediaType:AVMediaTypeVideo
                          fromConnections:self.captureOutput.connections];

        if ([videoConnection isVideoOrientationSupported]) {
            videoConnection.videoOrientation = [self currentVideoOrientation];
        }

        if ([videoConnection isVideoStabilizationSupported]) {
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }

        [self.captureOutput startRecordingToOutputFileURL:[self outputURL]
                                         recordingDelegate:self];
    }

    // Disable the toggle button while recording
    self.toggleCameraButton.enabled = ![sender isSelected];
}

- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
    for (AVCaptureConnection *connection in connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:mediaType]) {
                return connection;
            }
        }
    }
    return nil;
}

#pragma mark - AVCaptureFileOutputRecordingDelegate 

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (!error) {
        [self presentRecording];
    } else {
        NSLog(@"Error: %@", [error localizedDescription]);
    }
}

#pragma mark - Show Last Recording 

- (void)presentRecording
{
    NSString *tracksKey = @"tracks";
    AVAsset *asset = [AVURLAsset assetWithURL:[self outputURL]];
    [asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
        NSError *error;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                UIStoryboard *mainStoryboard = [UIStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
                THPlayerViewController *controller =
                    [mainStoryboard instantiateViewControllerWithIdentifier:@"THPlayerViewController"];
                controller.title = @"Capture Recording";
                controller.asset = asset;
                [self presentViewController:controller animated:YES completion:nil];
            });
        }
    }];
}

#pragma mark - Recording Destination URL

- (NSURL *)outputURL 
{ 
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0]; 
    NSLog(@"documents Directory: %@", documentsDirectory); 
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:VIDEO_FILE]; 

    NSLog(@"output url: %@", filePath); 
    return [NSURL fileURLWithPath:filePath]; 
} 

@end 

I found this link, which shows how to capture video frame by frame. But I am not sure whether capturing the video as individual frames will help me send h.264 video to the server. Can this be done, and if so, how?
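From what I understand, capturing frame by frame means implementing the AVCaptureVideoDataOutput sample buffer delegate and feeding each frame to an AVAssetWriter configured for h.264, which seems to be the usual on-device route to that codec. A minimal sketch of what I imagine this looks like (untested, and the class name THFrameEncoder is made up):

#import <AVFoundation/AVFoundation.h>

// Hypothetical helper that encodes incoming camera frames to an h.264 movie file.
@interface THFrameEncoder : NSObject
@property (nonatomic, strong) AVAssetWriter *writer;
@property (nonatomic, strong) AVAssetWriterInput *videoInput;
@end

@implementation THFrameEncoder

- (instancetype)initWithOutputURL:(NSURL *)url
{
    self = [super init];
    if (self) {
        NSError *error = nil;
        _writer = [[AVAssetWriter alloc] initWithURL:url
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];
        NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                    AVVideoWidthKey  : @640,
                                    AVVideoHeightKey : @480 };
        _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                         outputSettings:settings];
        _videoInput.expectsMediaDataInRealTime = YES;
        [_writer addInput:_videoInput];
    }
    return self;
}

// Called for every frame from captureOutput:didOutputSampleBuffer:fromConnection:
- (void)encodeSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    if (self.writer.status == AVAssetWriterStatusUnknown) {
        // Start the writing session at the timestamp of the first frame.
        [self.writer startWriting];
        [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (self.videoInput.isReadyForMoreMediaData) {
        [self.videoInput appendSampleBuffer:sampleBuffer];
    }
}

@end

As far as I can tell, though, this still writes to a file rather than to a socket, so I am not sure it gets me all the way to live streaming.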

Here someone who asked the same question says (in a comment below his question) that he was able to do this successfully, but he does not mention how he captured the video.

Please tell me which data type should be used to obtain small segments of the captured video, and how to convert the captured data into the required format and send it to the server.
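One approach I have considered: AVCaptureMovieFileOutput already encodes h.264 video on these devices, so I could cap each recording at a few seconds and upload every finished file from the recording delegate, roughly like the sketch below (the endpoint http://example.com/upload is made up, and outputURL would have to return a unique path per segment):

// When creating the movie output, limit each recording to a short segment:
// self.captureOutput.maxRecordedDuration = CMTimeMakeWithSeconds(5, 600);

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    // Hitting maxRecordedDuration reports AVErrorMaximumDurationReached,
    // but the finished file is still complete and safe to upload.
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
        [NSURL URLWithString:@"http://example.com/upload"]]; // hypothetical endpoint
    request.HTTPMethod = @"POST";
    request.HTTPBody = [NSData dataWithContentsOfURL:outputFileURL];
    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response,
                                               NSData *data,
                                               NSError *connectionError) {
        if (connectionError) {
            NSLog(@"Segment upload failed: %@", connectionError);
        }
    }];

    // Immediately start recording the next segment into a new file.
    [captureOutput startRecordingToOutputFileURL:[self outputURL]
                                recordingDelegate:self];
}

But I suspect that stopping and restarting the recording like this leaves small gaps between segments, which is why I am asking what the proper way is.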


Please refer to this URL; it will help you. http://stackoverflow.com/questions/15518925/how-to-save-video-in-documents-folder-then-upload-to-server – saravanan


I think he is asking about live streaming, not about saving the video file to the documents directory and then sending it to the server. – Leena


Yes, I want to live-stream my video. –

Answer


You can use the Live SDK. You have to set up an nginx streaming server; please follow this link. I have used it, and it is a very effective solution. https://github.com/ltebean/Live
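If you go this route, the nginx side (assuming the nginx-rtmp-module is compiled in; the paths and names below are only examples) looks roughly like this:

rtmp {
    server {
        listen 1935;            # default RTMP port
        application live {
            live on;            # accept incoming live streams
            hls on;             # also repackage them as HLS
            hls_path /tmp/hls;  # where the .m3u8 playlist and .ts segments go
            hls_fragment 3s;    # segment duration
        }
    }
}

The app then publishes to rtmp://your-server/live/yourstream (both names are placeholders), and nginx serves the resulting HLS playlist to viewers.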
