Comment:
As of now, I'm shrinking the pixel buffers and sending them across devices as JPEG-compressed data. On the remote device, I have a UIImage that I update with the data each frame time. However, I think UIKit may not be the best way to display the data, even though the images are small.
As it turns out, that is the best way to transmit images via the Multipeer Connectivity framework. I have tried all the alternatives:
- #1: I compressed frames using VideoToolbox. Too slow.
- #2: I compressed frames using the Compression framework. Too slow, but better.
Let me provide some code for #2:
On the iOS device sending the image data:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CFRetain(imageBuffer); // keep the pixel buffer alive until the async block is done with it
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    dispatch_async(self.compressionQueue, ^{
        // compression_encode_buffer comes from <compression.h> (libcompression).
        size_t bufferSize = 1228800; // 640 * 480 * 4 bytes: one 32BGRA frame at the 640x480 preset
        uint8_t *compressed = malloc(bufferSize);
        size_t compressedSize = compression_encode_buffer(compressed, bufferSize, baseAddress, bufferSize, NULL, COMPRESSION_ZLIB);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        CFRelease(imageBuffer);
        NSData *data = [NSData dataWithBytes:compressed length:compressedSize];
        free(compressed); // NSData copied the bytes
        NSLog(@"Sending size: %lu", (unsigned long)[data length]);
        dispatch_async(dispatch_get_main_queue(), ^{
            __autoreleasing NSError *err;
            [((ViewController *)self.parentViewController).session sendData:data toPeers:((ViewController *)self.parentViewController).session.connectedPeers withMode:MCSessionSendDataReliable error:&err];
        });
    });
}
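This snippet assumes the camera delivers 32BGRA pixel buffers at the 640x480 preset; that is where the 1228800-byte figure comes from (640 * 480 * 4). A minimal capture-setup sketch under those assumptions (mine, not from the original answer):

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPreset640x480; // small frames keep the payload manageable
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
[captureSession addInput:cameraInput];
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Ask for BGRA so the bitmap-context flags used in the delegate match the bytes.
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("com.example.capture", DISPATCH_QUEUE_SERIAL)];
[captureSession addOutput:videoOutput];
[captureSession startRunning];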
On the iOS device displaying the image data:
// Boxes a pointer/length pair so a frame can be handed through
// dispatch_queue_set_specific (used by the final receiver below).
typedef struct {
    size_t length;
    void *data;
} ImageCacheDataStruct;
- (void)session:(nonnull MCSession *)session didReceiveData:(nonnull NSData *)data fromPeer:(nonnull MCPeerID *)peerID
{
    NSLog(@"Receiving size: %lu", (unsigned long)[data length]);
    size_t originalCapacity = 1228800; // 640 * 480 * 4 bytes: one decompressed 32BGRA frame
    uint8_t *original = malloc(originalCapacity);
    size_t originalSize = compression_decode_buffer(original, originalCapacity, (const uint8_t *)[data bytes], [data length], NULL, COMPRESSION_ZLIB);
    NSLog(@"Decompressed size: %zu", originalSize);
    // Rebuild a CGImage from the raw BGRA bytes: 640 x 480, 4 bytes per pixel, 2560 bytes per row.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(original, 640, 480, 8, 2560, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp];
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CGImageRelease(newImage);
    free(original); // the CGImage holds its own copy of the pixels
    if (image) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [((ViewerViewController *)self.childViewControllers.lastObject).view.layer setContents:(__bridge id)image.CGImage];
        });
    }
}
Although this code produces images of the original quality on the receiving end, you will find that it is far too slow for real-time playback.
Here is the best way to do it:
On the iOS device sending the image data:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Wrap the BGRA pixels in a bitmap context so a CGImage can be made from them.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp];
    CGImageRelease(newImage);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    if (image) {
        // Recompress the frame as JPEG and ship it to every connected peer.
        NSData *data = UIImageJPEGRepresentation(image, 0.7);
        NSError *err;
        [((ViewController *)self.parentViewController).session sendData:data toPeers:((ViewController *)self.parentViewController).session.connectedPeers withMode:MCSessionSendDataReliable error:&err];
    }
}
On the iOS device receiving the image data:
- (void)session:(nonnull MCSession *)session didReceiveData:(nonnull NSData *)data fromPeer:(nonnull MCPeerID *)peerID
{
    dispatch_async(self.imageCacheDataQueue, ^{
        // Hold each incoming frame until the previous one has been displayed.
        dispatch_semaphore_wait(self.semaphore, DISPATCH_TIME_FOREVER);
        const void *dataBuffer = [data bytes];
        size_t dataLength = [data length];
        ImageCacheDataStruct *imageCacheDataStruct = calloc(1, sizeof(ImageCacheDataStruct)); // sizeof the struct, not the pointer
        imageCacheDataStruct->data = (void *)dataBuffer; // stays valid: this outer block keeps data alive
        imageCacheDataStruct->length = dataLength;
        __block const void *kMyKey; // only its address matters; it serves as the queue-specific key
        dispatch_queue_set_specific(self.imageDisplayQueue, &kMyKey, (void *)imageCacheDataStruct, NULL);
        dispatch_sync(self.imageDisplayQueue, ^{
            ImageCacheDataStruct *retrievedStruct = dispatch_queue_get_specific(self.imageDisplayQueue, &kMyKey);
            NSData *imageData = [NSData dataWithBytes:retrievedStruct->data length:retrievedStruct->length];
            free(retrievedStruct); // done with the boxed pointer/length pair
            UIImage *image = [UIImage imageWithData:imageData];
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [((ViewerViewController *)self.childViewControllers.lastObject).view.layer setContents:(__bridge id)image.CGImage];
                    dispatch_semaphore_signal(self.semaphore);
                });
            } else {
                dispatch_semaphore_signal(self.semaphore); // never leave the semaphore hanging on a bad frame
            }
        });
    });
}
The reason for the semaphore and the separate GCD queues is simple: you want the frames to be displayed at even time intervals. Otherwise, the video will sometimes slow down at first, then speed up well past normal to catch up. My scheme ensures that each frame plays one after the other at the same pace, regardless of network bandwidth bottlenecks.
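For completeness, none of the snippets above show how those queues and the semaphore are created. A minimal sketch, assuming serial queues and a semaphore that starts at 1 (the property names come from the code above; the queue attributes, labels, and starting count are my assumptions):

// Serial queues keep frames strictly in arrival order.
self.compressionQueue = dispatch_queue_create("com.example.compression", DISPATCH_QUEUE_SERIAL);
self.imageCacheDataQueue = dispatch_queue_create("com.example.imageCacheData", DISPATCH_QUEUE_SERIAL);
self.imageDisplayQueue = dispatch_queue_create("com.example.imageDisplay", DISPATCH_QUEUE_SERIAL);
// Start at 1 so the first frame passes straight through; every later frame
// waits until the previous one has actually been handed to the layer.
self.semaphore = dispatch_semaphore_create(1);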
AFAIK Multipeer Connectivity has all the APIs to satisfy your requirements for discovery and live streaming sessions. If you don't want to use it, the other option would be AirPlay, but that is a workaround and won't help with discovering iPhones near the client – Bluewings
I tried the MC sample code but couldn't pair the devices and send data. What I need is a simple interface where one iPhone throws up a popup to choose a peer as soon as one is discovered. Choosing the peer starts the data stream. I couldn't get that working with any of the available sample code, including Apple's Multipeer Connectivity group chat sample. –
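For what it's worth, the stock MCBrowserViewController provides roughly the popup-style peer picker described in that last comment. A minimal sketch, assuming an existing MCSession in self.session and a hypothetical service type string (it must match the advertiser's; 1-15 lowercase letters, digits, and hyphens):

// On the browsing peer: present Apple's built-in picker; tapping a discovered
// peer invites it into the session, after which MCSessionDelegate callbacks fire.
MCBrowserViewController *browser = [[MCBrowserViewController alloc] initWithServiceType:@"example-svc" session:self.session];
browser.delegate = self; // MCBrowserViewControllerDelegate: dismiss on Done/Cancel
[self presentViewController:browser animated:YES completion:nil];

// On the peer that should be discoverable:
MCAdvertiserAssistant *assistant = [[MCAdvertiserAssistant alloc] initWithServiceType:@"example-svc" discoveryInfo:nil session:self.session];
[assistant start]; // keep a strong reference to the assistant while advertising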