Converting an incoming NSStream to a view

I am successfully sending a stream of NSData. The delegate method below receives that stream and appends it to the NSMutableData property self.data. How do I take this data and turn it into a UIView/AVCaptureVideoPreviewLayer (which should display video)? I feel like I'm missing another conversion somewhere: AVCaptureSession > NSStream > MCSession > NSStream > ?
- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable: {
            if (!self.data) {
                self.data = [NSMutableData data];
            }
            uint8_t buf[1024];
            // -read:maxLength: returns NSInteger; a negative value means a read error.
            NSInteger len = [(NSInputStream *)stream read:buf maxLength:sizeof(buf)];
            if (len > 0) {
                [self.data appendBytes:(const void *)buf length:len];
            } else {
                NSLog(@"no buffer!");
            }
            // Code here to take self.data and convert the NSData to UIView/Video
            break;
        }
        default:
            break;
    }
}
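If the sender transmits raw 32BGRA frames of known, fixed dimensions, one frame's worth of accumulated bytes can be turned back into a UIImage with Core Graphics. This is only a sketch: the `kWidth`/`kHeight`/`kBytesPerRow` constants and the `imageFromFrameData:` method name are assumptions — the raw byte stream carries no header, so both peers must agree on the frame geometry out of band, and you must wait until `self.data` holds at least one full frame (`kBytesPerRow * kHeight` bytes) before converting.

```objc
// Sketch: rebuild a UIImage from one frame of raw BGRA bytes.
// kWidth, kHeight and kBytesPerRow are assumed values; the real ones
// must match what the capture side produced.
static const size_t kWidth = 1280, kHeight = 720, kBytesPerRow = 1280 * 4;

- (UIImage *)imageFromFrameData:(NSData *)frameData {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // 32BGRA as delivered by AVCaptureVideoDataOutput: little-endian,
    // alpha first, 8 bits per component.
    CGContextRef context = CGBitmapContextCreate((void *)frameData.bytes,
                                                 kWidth, kHeight, 8, kBytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return image; // display in a UIImageView, replacing it once per complete frame
}
```

Displaying successive UIImages in a UIImageView is the simplest way to show the result; AVCaptureVideoPreviewLayer cannot be used on the receiving side, since it only previews a local AVCaptureSession.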
I send the stream with this:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);
    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Open the stream once, not once per captured frame.
    if (!self.oStream) {
        NSError *error = nil;
        self.oStream = [self.mySession startStreamWithName:@"videoOut"
                                                    toPeer:[[self.mySession connectedPeers] objectAtIndex:0]
                                                     error:&error];
        self.oStream.delegate = self;
        [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                                forMode:NSDefaultRunLoopMode];
        [self.oStream open];
    }
    // Note: -write:maxLength: may write fewer bytes than requested;
    // the return value should be checked in production code.
    [self.oStream write:[data bytes] maxLength:[data length]];

    CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
    // also in the 'mediaSpecific' dict of the sampleBuffer
    NSLog(@"frame captured at %.fx%.f", imageSize.width, imageSize.height);
}
You might want to see whether you can use OpenGL: take your data, convert it into a GL texture, then use GL to display it. That may be a higher-level API. Isn't the data in some standard format? – nielsbot
What is the video format? A `UIView`? What is the link to the video? – Larme
The video format is AVCaptureSession – Eric