
After learning that the hardware H.264 decoder is available to programmers in iOS 8, I want to use it now. There is a nice introduction, 'Direct Access to Video Encoding and Decoding', from WWDC 2014. You can take a look here. My question: how can I use an AVSampleBufferDisplayLayer in iOS 8 to display an RTP H.264 stream received with GStreamer?

Based on case 1 there, I started to develop an application that gets an H.264 RTP/UDP stream from GStreamer, sinks it into an 'appsink' element to get direct access to the NAL units, and does the conversion to create CMSampleBuffers, which my AVSampleBufferDisplayLayer can then display.
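
For orientation, the receive pipeline that app_function below builds element by element is roughly equivalent to the following single launch description (a sketch, not code from the app; it assumes the sender pushes RTP-packetized H.264 to UDP port 5000):

/* Sketch: the receive pipeline from app_function as one gst_parse_launch() description. */
GError *error = NULL;
GstElement *pipeline = gst_parse_launch(
    "udpsrc port=5000 caps=\"application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264\" "
    "! rtph264depay "
    "! video/x-h264,stream-format=byte-stream,alignment=nal "
    "! appsink name=appsink", &error);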

The interesting piece of code that does all of this is:

// 
// GStreamerBackend.m 
// 

#import "GStreamerBackend.h" 

NSString * const naluTypesStrings[] = { 
    @"Unspecified (non-VCL)", 
    @"Coded slice of a non-IDR picture (VCL)", 
    @"Coded slice data partition A (VCL)", 
    @"Coded slice data partition B (VCL)", 
    @"Coded slice data partition C (VCL)", 
    @"Coded slice of an IDR picture (VCL)", 
    @"Supplemental enhancement information (SEI) (non-VCL)", 
    @"Sequence parameter set (non-VCL)", 
    @"Picture parameter set (non-VCL)", 
    @"Access unit delimiter (non-VCL)", 
    @"End of sequence (non-VCL)", 
    @"End of stream (non-VCL)", 
    @"Filler data (non-VCL)", 
    @"Sequence parameter set extension (non-VCL)", 
    @"Prefix NAL unit (non-VCL)", 
    @"Subset sequence parameter set (non-VCL)", 
    @"Reserved (non-VCL)", 
    @"Reserved (non-VCL)", 
    @"Reserved (non-VCL)", 
    @"Coded slice of an auxiliary coded picture without partitioning (non-VCL)", 
    @"Coded slice extension (non-VCL)", 
    @"Coded slice extension for depth view components (non-VCL)", 
    @"Reserved (non-VCL)", 
    @"Reserved (non-VCL)", 
    @"Unspecified (non-VCL)", 
    @"Unspecified (non-VCL)", 
    @"Unspecified (non-VCL)", 
    @"Unspecified (non-VCL)", 
    @"Unspecified (non-VCL)", 
    @"Unspecified (non-VCL)", 
    @"Unspecified (non-VCL)", 
    @"Unspecified (non-VCL)", 
}; 


static GstFlowReturn new_sample(GstAppSink *sink, gpointer user_data) 
{ 
    GStreamerBackend *backend = (__bridge GStreamerBackend *)(user_data); 
    GstSample *sample = gst_app_sink_pull_sample(sink); 
    GstBuffer *buffer = gst_sample_get_buffer(sample); 
    GstMemory *memory = gst_buffer_get_all_memory(buffer); 

    GstMapInfo info; 
    gst_memory_map (memory, &info, GST_MAP_READ); 

    int startCodeIndex = 0; 
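    /* Find the 0x01 byte that ends the Annex-B start code (00 00 01 or 00 00 00 01) in front of the NAL unit. */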
    for (int i = 0; i < 5; i++) { 
     if (info.data[i] == 0x01) { 
      startCodeIndex = i; 
      break; 
     } 
    } 
    int nalu_type = ((uint8_t)info.data[startCodeIndex + 1] & 0x1F); 
    NSLog(@"NALU with Type \"%@\" received.", naluTypesStrings[nalu_type]); 
    if(backend.searchForSPSAndPPS) { 
     if (nalu_type == 7) 
      backend.spsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length: info.size - 4]; 

     if (nalu_type == 8) 
      backend.ppsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length: info.size - 4]; 

     if (backend.spsData != nil && backend.ppsData != nil) { 
      const uint8_t* const parameterSetPointers[2] = { (const uint8_t*)[backend.spsData bytes], (const uint8_t*)[backend.ppsData bytes] }; 
      const size_t parameterSetSizes[2] = { [backend.spsData length], [backend.ppsData length] }; 

      CMVideoFormatDescriptionRef videoFormatDescr; 
      OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2, parameterSetPointers, parameterSetSizes, 4, &videoFormatDescr); 
      [backend setVideoFormatDescr:videoFormatDescr]; 
      [backend setSearchForSPSAndPPS:false]; 
      NSLog(@"Found all data for CMVideoFormatDescription. Creation: %@.", (status == noErr) ? @"successfully." : @"failed."); 
     } 
    } 
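    /* Coded slices (non-IDR type 1, IDR type 5): wrap the mapped bytes in a CMBlockBuffer,
       overwrite the start code with a 4-byte big-endian length prefix (AVCC layout),
       and enqueue the resulting CMSampleBuffer on the display layer. */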
    if (nalu_type == 1 || nalu_type == 5) { 
     CMBlockBufferRef videoBlock = NULL; 
     OSStatus status = CMBlockBufferCreateWithMemoryBlock(NULL, info.data, info.size, kCFAllocatorNull, NULL, 0, info.size, 0, &videoBlock); 
     NSLog(@"BlockBufferCreation: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed."); 
     const uint8_t sourceBytes[] = {(uint8_t)(info.size >> 24), (uint8_t)(info.size >> 16), (uint8_t)(info.size >> 8), (uint8_t)info.size}; 
     status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4); 
     NSLog(@"BlockBufferReplace: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed."); 

     CMSampleBufferRef sbRef = NULL; 
     const size_t sampleSizeArray[] = {info.size}; 

     status = CMSampleBufferCreate(kCFAllocatorDefault, videoBlock, true, NULL, NULL, backend.videoFormatDescr, 1, 0, NULL, 1, sampleSizeArray, &sbRef); 
     NSLog(@"SampleBufferCreate: %@", (status == noErr) ? @"successfully." : @"failed."); 

     CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sbRef, YES); 
     CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0); 
     CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue); 

     NSLog(@"Error: %@, Status:%@", backend.displayLayer.error, (backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusUnknown)[email protected]"unknown":((backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusRendering)[email protected]"rendering":@"failed")); 
     dispatch_async(dispatch_get_main_queue(),^{ 
      [backend.displayLayer enqueueSampleBuffer:sbRef]; 
      [backend.displayLayer setNeedsDisplay]; 
     }); 

    } 

    gst_memory_unmap(memory, &info); 
    gst_memory_unref(memory); 
    gst_buffer_unref(buffer); 

    return GST_FLOW_OK; 
} 

@implementation GStreamerBackend 

- (instancetype)init 
{ 
    if (self = [super init]) { 
     self.searchForSPSAndPPS = true; 
     self.ppsData = nil; 
     self.spsData = nil; 
     self.displayLayer = [[AVSampleBufferDisplayLayer alloc] init]; 
     self.displayLayer.bounds = CGRectMake(0, 0, 300, 300); 
     self.displayLayer.backgroundColor = [UIColor blackColor].CGColor; 
     self.displayLayer.position = CGPointMake(500, 500); 
     self.queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0); 
     dispatch_async(self.queue, ^{ 
      [self app_function]; 
     }); 
    } 
    return self; 
} 

- (void)start 
{ 
    if(gst_element_set_state(self.pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) { 
     NSLog(@"Failed to set pipeline to playing"); 
    } 
} 

- (void)app_function 
{ 
    GstElement *udpsrc, *rtphdepay, *capsfilter; 
    GMainContext *context; /* GLib context used to run the main loop */ 
    GMainLoop *main_loop; /* GLib main loop */ 


    context = g_main_context_new(); 
    g_main_context_push_thread_default(context); 

    g_set_application_name ("appsink"); 

    self.pipeline = gst_pipeline_new ("testpipe"); 

    udpsrc = gst_element_factory_make ("udpsrc", "udpsrc"); 
    GstCaps *caps = gst_caps_new_simple("application/x-rtp", "media", G_TYPE_STRING, "video", "clock-rate", G_TYPE_INT, 90000, "encoding-name", G_TYPE_STRING, "H264", NULL); 
    g_object_set(udpsrc, "caps", caps, "port", 5000, NULL); 
    gst_caps_unref(caps); 
    rtphdepay = gst_element_factory_make("rtph264depay", "rtph264depay"); 
    capsfilter = gst_element_factory_make("capsfilter", "capsfilter"); 
    caps = gst_caps_new_simple("video/x-h264", "streamformat", G_TYPE_STRING, "byte-stream", "alignment", G_TYPE_STRING, "nal", NULL); 
    g_object_set(capsfilter, "caps", caps, NULL); 
    self.appsink = gst_element_factory_make ("appsink", "appsink"); 

    gst_bin_add_many (GST_BIN (self.pipeline), udpsrc, rtphdepay, capsfilter, self.appsink, NULL); 

    if(!gst_element_link_many (udpsrc, rtphdepay, capsfilter, self.appsink, NULL)) { 
     NSLog(@"Cannot link gstreamer elements"); 
     exit (1); 
    } 

    if(gst_element_set_state(self.pipeline, GST_STATE_READY) != GST_STATE_CHANGE_SUCCESS) 
     NSLog(@"could not change to ready"); 

    GstAppSinkCallbacks callbacks = { NULL, NULL, new_sample, 
     NULL, NULL}; 
    gst_app_sink_set_callbacks (GST_APP_SINK(self.appsink), &callbacks, (__bridge gpointer)(self), NULL); 

    main_loop = g_main_loop_new (context, FALSE); 
    g_main_loop_run (main_loop); 


    /* Free resources */ 
    g_main_loop_unref (main_loop); 
    main_loop = NULL; 
    g_main_context_pop_thread_default(context); 
    g_main_context_unref (context); 
    gst_element_set_state (GST_ELEMENT (self.pipeline), GST_STATE_NULL); 
    gst_object_unref (GST_OBJECT (self.pipeline)); 
} 

@end 
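
For completeness: GStreamerBackend.h is not shown in the post. Judging from the properties and methods the implementation accesses, a minimal header might look like this (an inferred sketch, not the actual file):

// 
// GStreamerBackend.h (inferred sketch) 
// 

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <gst/gst.h>
#import <gst/app/gstappsink.h>

@interface GStreamerBackend : NSObject

@property (nonatomic) BOOL searchForSPSAndPPS;
@property (nonatomic, strong) NSData *spsData;
@property (nonatomic, strong) NSData *ppsData;
@property (nonatomic) CMVideoFormatDescriptionRef videoFormatDescr;
@property (nonatomic, strong) AVSampleBufferDisplayLayer *displayLayer;
@property (nonatomic, strong) dispatch_queue_t queue;
@property (nonatomic) GstElement *pipeline;
@property (nonatomic) GstElement *appsink;

- (void)start;

@end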

This is what I get when I run the app and start streaming to the iOS device:

NALU with Type "Sequence parameter set (non-VCL)" received. 
NALU with Type "Picture parameter set (non-VCL)" received. 

Found all data for CMVideoFormatDescription. Creation: successfully.. 

NALU with Type "Coded slice of an IDR picture (VCL)" received. 
BlockBufferCreation: successfully. 
BlockBufferReplace: successfully. 
SampleBufferCreate: successfully. 
Error: (null), Status:unknown 

NALU with Type "Coded slice of a non-IDR picture (VCL)" received. 
BlockBufferCreation: successfully. 
BlockBufferReplace: successfully. 
SampleBufferCreate: successfully. 
Error: (null), Status:rendering 
[...] (repetition of the last 5 lines) 

So it seems to decode as it should, but my problem is that I cannot see anything in my AVSampleBufferDisplayLayer. It might be an issue with kCMSampleAttachmentKey_DisplayImmediately, but I have set it just as I was told here (see the 'important' note).

Every idea is welcome ;)


I have almost completed this exact thing, but why are you checking for a start code? Isn't that only present in a byte stream (e.g. if it came over TCP)? I thought that if it is packetized over RTP (UDP), the start code is no longer needed. This [RFC](https://tools.ietf.org/html/rfc6184) is where I learned everything during this process, and it does not mention looking for a start code, since the NAL unit arrives in a packet. I know the video you posted a link to does mention it, but I was always confused about why they conflict with each other. – ddelnano 2015-02-13 17:58:10


I am not sure what the spec says. But since I use GStreamer before accessing the stream, and specifically requested NAL units as output, GStreamer may convert the stream into something that was not originally in the UDP packets. So even if no start codes are present in the UDP packets, GStreamer may add them. – Zappel 2015-03-05 11:07:01
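
As an aside, rtph264depay can also be asked for AVCC ('avc') output instead of byte-stream, in which case the NAL units come length-prefixed and without start codes (SPS/PPS then travel in the caps' codec_data field rather than in-band). A sketch of that alternative capsfilter, not what the code above uses:

/* Sketch (alternative, not from the post): request length-prefixed AVCC output
   from rtph264depay, so no start-code handling is needed in the appsink callback. */
caps = gst_caps_new_simple("video/x-h264",
                           "stream-format", G_TYPE_STRING, "avc",
                           "alignment", G_TYPE_STRING, "au",
                           NULL);
g_object_set(capsfilter, "caps", caps, NULL);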


Am I correct that your code looks for the start code 0x0001 or 0x000001? On the streaming-server side, are you using gstreamer as a command-line utility? If so, can you tell me what command you used? – ddelnano 2015-03-05 15:18:28

Answers


Got it working now. The length of each NALU does not contain the length header itself. So I had to subtract 4 from my info.size before using it for my sourceBytes.
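
A minimal sketch of the corrected piece in new_sample, assuming the 4-byte Annex-B start code at the front of info.data that the rest of the code already assumes:

// The length prefix must describe the NALU payload only, i.e. info.size minus the
// 4-byte header it replaces.
size_t naluLength = info.size - 4;
const uint8_t sourceBytes[] = {(uint8_t)(naluLength >> 24), (uint8_t)(naluLength >> 16),
                               (uint8_t)(naluLength >> 8), (uint8_t)naluLength};
status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4);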


Guided by your code, I wrote a program that decodes and displays a live H.264 stream using an AVSampleBufferDisplayLayer. I use live555 instead of GStreamer to receive the H.264 NAL units.

Unfortunately, my app only displays a few frames, and then no more images can be shown. Have you run into the same problem with your app?


No, my app does not have any of these problems. Have you evaluated the OSStatus return values of the CM functions? If yes, what are their values? Give us more to work with so we can help you further. Maybe you should also take a look at the thread safety of your app. Try implementing it on the main thread first. – Zappel 2014-09-29 10:56:25