
How can I efficiently stream the camera feed from one iOS device to another over Bluetooth or WiFi in iOS 7, using Multipeer Connectivity? Below is the code I use to get the stream's sample buffers:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
     fromConnection:(AVCaptureConnection *)connection 
{ 
    // Create a UIImage from the sample buffer data 
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer]; 


} 
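
For context, this delegate only fires once a video data output is attached to a running capture session. A minimal, illustrative setup (the queue name is arbitrary, and 32BGRA is an assumption that matches the bitmap context created in the method below):

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) [captureSession addInput:input];

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// BGRA output matches the kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst context below.
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("camera.frame.queue", DISPATCH_QUEUE_SERIAL)];
[captureSession addOutput:videoOutput];
[captureSession startRunning];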

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{ 
    // Get a CMSampleBuffer's Core Video image buffer for the media data 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer 
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data 
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
     bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context 
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

    // Free up the context and color space 
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace); 

    // Create an image object from the Quartz image 
    UIImage *image = [UIImage imageWithCGImage:quartzImage]; 

    // Release the Quartz image 
    CGImageRelease(quartzImage); 

    return (image); 
} 

Here we get the image captured by the iOS camera.

Can we send the sample buffer data directly to the other device over Multipeer Connectivity, or is there an efficient way to transmit it to the other iOS device?

Thanks.


Multipeer Connectivity sounds like a valid option, but you need to check the performance. Sending uncompressed images may take too much bandwidth, so you may have to create a real video stream to transmit the live capture. – allprog 2014-09-15 11:18:56



Answers


I found a way to do this: we can stream compressed images over Multipeer Connectivity so that it looks like a streaming camera feed.
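
For completeness: the `_session` used below must already be a connected MCSession. A minimal, hypothetical setup on the advertising side (the service type is illustrative; the other peer browses and invites, e.g. via MCBrowserViewController):

// _session is assumed to be an MCSession ivar; both peers keep one.
MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[UIDevice currentDevice].name];
_session = [[MCSession alloc] initWithPeer:peerID
                          securityIdentity:nil
                      encryptionPreference:MCEncryptionNone];
_session.delegate = self;

// Advertise so that a browsing peer can find and join this session.
MCAdvertiserAssistant *assistant = [[MCAdvertiserAssistant alloc] initWithServiceType:@"camera-feed"
                                                                        discoveryInfo:nil
                                                                              session:_session];
[assistant start];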

The peer that sends the stream uses this code, in the captureOutput delegate method:

    // `cgBackedImage` is the UIImage built from the current sample buffer
    // (e.g. via -imageFromSampleBuffer: above), and `timestamp` is assumed to be
    // an NSNumber, e.g. @(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))).
    NSData *imageData = UIImageJPEGRepresentation(cgBackedImage, 0.2);

    // maybe not always the correct input? just using this to send current FPS...
    AVCaptureInputPort *inputPort = connection.inputPorts[0];
    AVCaptureDeviceInput *deviceInput = (AVCaptureDeviceInput *)inputPort.input;
    CMTime frameDuration = deviceInput.device.activeVideoMaxFrameDuration;
    NSDictionary *dict = @{
                           @"image": imageData,
                           @"timestamp": timestamp,
                           @"framesPerSecond": @(frameDuration.timescale)
                           };
    NSData *data = [NSKeyedArchiver archivedDataWithRootObject:dict];

    [_session sendData:data toPeers:_session.connectedPeers withMode:MCSessionSendDataReliable error:nil];
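
A design note: MCSessionSendDataReliable retransmits and preserves ordering, which can add latency when the link is lossy. For a live preview it may be worth testing the unreliable mode, which drops late frames instead of stalling the stream:

[_session sendData:data toPeers:_session.connectedPeers withMode:MCSessionSendDataUnreliable error:nil];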

And on the receiving side:

- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID
{
    // NSLog(@"(%@) Read %lu bytes", peerID.displayName, (unsigned long)data.length);

    NSDictionary *dict = (NSDictionary *)[NSKeyedUnarchiver unarchiveObjectWithData:data];
    UIImage *image = [UIImage imageWithData:dict[@"image"] scale:2.0];
    NSNumber *framesPerSecond = dict[@"framesPerSecond"];
}
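
The decoded `image` is not used yet in the snippet above. A minimal sketch for displaying it, assuming a hypothetical `imageView` property on the receiving controller, placed inside the same delegate method:

// UIKit must be touched on the main queue.
dispatch_async(dispatch_get_main_queue(), ^{
    self.imageView.image = image;
});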

We also get the FPS value, and we can set parameters accordingly to pace our streamed images.

Hope it helps.

Thanks.


Sandip, have you tried streaming an audio file between two iPhone devices using Multipeer Connectivity? – 2016-05-05 10:16:56


Here is the best way to do it (I explain why at the end):

On the iOS device sending the image data:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    CVPixelBufferLockBaseAddress(imageBuffer,0); 
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 


    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp]; 
    CGImageRelease(newImage); 
    CGContextRelease(newContext); 
    CGColorSpaceRelease(colorSpace); 
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0); 

    if (image) { 
     NSData *data = UIImageJPEGRepresentation(image, 0.7); 
     NSError *err; 
     [((ViewController *)self.parentViewController).session sendData:data toPeers:((ViewController *)self.parentViewController).session.connectedPeers withMode:MCSessionSendDataReliable error:&err]; 
    } 
} 
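
Bandwidth is the main constraint with this approach. One hedged way to reduce it, if needed, is to downscale each frame before encoding, replacing the UIImageJPEGRepresentation line inside the if (image) block above (the 0.5 factor is illustrative):

// Hypothetical downscale before encoding: half the width and height, a quarter of the pixels.
CGSize targetSize = CGSizeMake(image.size.width * 0.5, image.size.height * 0.5);
UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
[image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImageJPEGRepresentation(smallImage, 0.7);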

On the iOS device receiving the image data:

typedef struct {
    size_t length;
    void *data;
} ImageCacheDataStruct;

- (void)session:(nonnull MCSession *)session didReceiveData:(nonnull NSData *)data fromPeer:(nonnull MCPeerID *)peerID
{
    dispatch_async(self.imageCacheDataQueue, ^{
        dispatch_semaphore_wait(self.semaphore, DISPATCH_TIME_FOREVER);

        // Copy the incoming bytes: the NSData buffer is not guaranteed to
        // outlive this callback, so the struct must own its own copy.
        size_t dataLength = [data length];
        void *dataCopy = malloc(dataLength);
        memcpy(dataCopy, [data bytes], dataLength);

        // Note: sizeof(ImageCacheDataStruct), the type, not the pointer variable.
        ImageCacheDataStruct *imageCacheDataStruct = calloc(1, sizeof(ImageCacheDataStruct));
        imageCacheDataStruct->data = dataCopy;
        imageCacheDataStruct->length = dataLength;

        // Hand the frame to the display queue via queue-specific storage.
        static const void *kMyKey = &kMyKey;
        dispatch_queue_set_specific(self.imageDisplayQueue, kMyKey, imageCacheDataStruct, NULL);

        dispatch_sync(self.imageDisplayQueue, ^{
            ImageCacheDataStruct *received = dispatch_queue_get_specific(self.imageDisplayQueue, kMyKey);
            NSData *imageData = [NSData dataWithBytes:received->data length:received->length];
            free(received->data);
            free(received);

            UIImage *image = [UIImage imageWithData:imageData];
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [((ViewerViewController *)self.childViewControllers.lastObject).view.layer setContents:(__bridge id)image.CGImage];
                    // Let the next frame through only after this one is on screen.
                    dispatch_semaphore_signal(self.semaphore);
                });
            } else {
                // Don't deadlock the queue on an undecodable frame.
                dispatch_semaphore_signal(self.semaphore);
            }
        });
    });
}

The reason for the semaphore and the separate GCD queues is simple: you want the frames to be displayed at even intervals. Otherwise the video sometimes slows down, then speeds up past normal speed to catch up. This scheme ensures that frames are displayed one after another at as steady a pace as possible, regardless of network bandwidth bottlenecks.
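
Note that this scheme serializes frames (each one is rendered before the next is decoded) rather than strictly enforcing equal intervals. If you also want to cap the display rate at the sender's FPS, one hedged variation is to delay the semaphore signal; `fps` below stands in for a value such as the `framesPerSecond` that the first answer transmits:

// Hypothetical pacing: release the next frame no sooner than 1/fps seconds after this one.
double fps = 30.0;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(NSEC_PER_SEC / fps)),
               dispatch_get_main_queue(), ^{
    dispatch_semaphore_signal(self.semaphore);
});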
