
ios - How to use AVSampleBufferDisplayLayer in iOS 8 for an RTP H264 stream with GStreamer?


After finding out that the hardware H264 decoder is available to programmers in iOS 8, I wanted to use it right away. WWDC 2014 has a nice introduction to "Direct Access to Video Encoding and Decoding"; you can take a look here.

Based on Case 1 from that session, I started developing an application that takes an H264 RTP/UDP stream from GStreamer, pushes it into an "appsink" element to get direct access to the NAL units, and converts them into CMSampleBuffers that my AVSampleBufferDisplayLayer can then display.
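For orientation, the pipeline that app_function below assembles element by element corresponds roughly to the following chain; this is only an illustrative sketch using gst_parse_launch rather than the original element-by-element construction (the caps field is spelled stream-format here):

GError *error = NULL;
GstElement *pipeline = gst_parse_launch(
    "udpsrc port=5000 caps=\"application/x-rtp, media=video, clock-rate=90000, encoding-name=H264\" "
    "! rtph264depay ! video/x-h264, stream-format=byte-stream, alignment=nal ! appsink name=appsink",
    &error);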

The interesting code that does all the work is the following:

//
// GStreamerBackend.m
//

#import "GStreamerBackend.h"

NSString * const naluTypesStrings[] = {
    @"Unspecified (non-VCL)",
    @"Coded slice of a non-IDR picture (VCL)",
    @"Coded slice data partition A (VCL)",
    @"Coded slice data partition B (VCL)",
    @"Coded slice data partition C (VCL)",
    @"Coded slice of an IDR picture (VCL)",
    @"Supplemental enhancement information (SEI) (non-VCL)",
    @"Sequence parameter set (non-VCL)",
    @"Picture parameter set (non-VCL)",
    @"Access unit delimiter (non-VCL)",
    @"End of sequence (non-VCL)",
    @"End of stream (non-VCL)",
    @"Filler data (non-VCL)",
    @"Sequence parameter set extension (non-VCL)",
    @"Prefix NAL unit (non-VCL)",
    @"Subset sequence parameter set (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Coded slice of an auxiliary coded picture without partitioning (non-VCL)",
    @"Coded slice extension (non-VCL)",
    @"Coded slice extension for depth view components (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
};


static GstFlowReturn new_sample(GstAppSink *sink, gpointer user_data)
{
    GStreamerBackend *backend = (__bridge GStreamerBackend *)(user_data);
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *buffer = gst_sample_get_buffer(sample);
    GstMemory *memory = gst_buffer_get_all_memory(buffer);

    GstMapInfo info;
    gst_memory_map (memory, &info, GST_MAP_READ);

    int startCodeIndex = 0;
    for (int i = 0; i < 5; i++) {
        if (info.data[i] == 0x01) {
            startCodeIndex = i;
            break;
        }
    }
    int nalu_type = ((uint8_t)info.data[startCodeIndex + 1] & 0x1F);
    NSLog(@"NALU with Type \"%@\" received.", naluTypesStrings[nalu_type]);
    if (backend.searchForSPSAndPPS) {
        if (nalu_type == 7)
            backend.spsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length: info.size - 4];

        if (nalu_type == 8)
            backend.ppsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length: info.size - 4];

        if (backend.spsData != nil && backend.ppsData != nil) {
            const uint8_t* const parameterSetPointers[2] = { (const uint8_t*)[backend.spsData bytes], (const uint8_t*)[backend.ppsData bytes] };
            const size_t parameterSetSizes[2] = { [backend.spsData length], [backend.ppsData length] };

            CMVideoFormatDescriptionRef videoFormatDescr;
            OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2, parameterSetPointers, parameterSetSizes, 4, &videoFormatDescr);
            [backend setVideoFormatDescr:videoFormatDescr];
            [backend setSearchForSPSAndPPS:false];
            NSLog(@"Found all data for CMVideoFormatDescription. Creation: %@.", (status == noErr) ? @"successfully." : @"failed.");
        }
    }
    if (nalu_type == 1 || nalu_type == 5) {
        CMBlockBufferRef videoBlock = NULL;
        OSStatus status = CMBlockBufferCreateWithMemoryBlock(NULL, info.data, info.size, kCFAllocatorNull, NULL, 0, info.size, 0, &videoBlock);
        NSLog(@"BlockBufferCreation: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");
        const uint8_t sourceBytes[] = {(uint8_t)(info.size >> 24), (uint8_t)(info.size >> 16), (uint8_t)(info.size >> 8), (uint8_t)info.size};
        status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4);
        NSLog(@"BlockBufferReplace: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");

        CMSampleBufferRef sbRef = NULL;
        const size_t sampleSizeArray[] = {info.size};

        status = CMSampleBufferCreate(kCFAllocatorDefault, videoBlock, true, NULL, NULL, backend.videoFormatDescr, 1, 0, NULL, 1, sampleSizeArray, &sbRef);
        NSLog(@"SampleBufferCreate: %@", (status == noErr) ? @"successfully." : @"failed.");

        CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sbRef, YES);
        CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
        CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

        NSLog(@"Error: %@, Status:%@", backend.displayLayer.error, (backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusUnknown)?@"unknown":((backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusRendering)?@"rendering":@"failed"));
        dispatch_async(dispatch_get_main_queue(),^{
            [backend.displayLayer enqueueSampleBuffer:sbRef];
            [backend.displayLayer setNeedsDisplay];
        });
    }

    gst_memory_unmap(memory, &info);
    gst_memory_unref(memory);
    gst_buffer_unref(buffer);

    return GST_FLOW_OK;
}

@implementation GStreamerBackend

- (instancetype)init
{
    if (self = [super init]) {
        self.searchForSPSAndPPS = true;
        self.ppsData = nil;
        self.spsData = nil;
        self.displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
        self.displayLayer.bounds = CGRectMake(0, 0, 300, 300);
        self.displayLayer.backgroundColor = [UIColor blackColor].CGColor;
        self.displayLayer.position = CGPointMake(500, 500);
        self.queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_async(self.queue, ^{
            [self app_function];
        });
    }
    return self;
}

- (void)start
{
    if (gst_element_set_state(self.pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        NSLog(@"Failed to set pipeline to playing");
    }
}

- (void)app_function
{
    GstElement *udpsrc, *rtphdepay, *capsfilter;
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */

    context = g_main_context_new ();
    g_main_context_push_thread_default(context);

    g_set_application_name ("appsink");

    self.pipeline = gst_pipeline_new ("testpipe");

    udpsrc = gst_element_factory_make ("udpsrc", "udpsrc");
    GstCaps *caps = gst_caps_new_simple("application/x-rtp", "media", G_TYPE_STRING, "video", "clock-rate", G_TYPE_INT, 90000, "encoding-name", G_TYPE_STRING, "H264", NULL);
    g_object_set(udpsrc, "caps", caps, "port", 5000, NULL);
    gst_caps_unref(caps);
    rtphdepay = gst_element_factory_make("rtph264depay", "rtph264depay");
    capsfilter = gst_element_factory_make("capsfilter", "capsfilter");
    caps = gst_caps_new_simple("video/x-h264", "streamformat", G_TYPE_STRING, "byte-stream", "alignment", G_TYPE_STRING, "nal", NULL);
    g_object_set(capsfilter, "caps", caps, NULL);
    self.appsink = gst_element_factory_make ("appsink", "appsink");

    gst_bin_add_many (GST_BIN (self.pipeline), udpsrc, rtphdepay, capsfilter, self.appsink, NULL);

    if (!gst_element_link_many (udpsrc, rtphdepay, capsfilter, self.appsink, NULL)) {
        NSLog(@"Cannot link gstreamer elements");
        exit (1);
    }

    if (gst_element_set_state(self.pipeline, GST_STATE_READY) != GST_STATE_CHANGE_SUCCESS)
        NSLog(@"could not change to ready");

    GstAppSinkCallbacks callbacks = { NULL, NULL, new_sample, NULL, NULL };
    gst_app_sink_set_callbacks (GST_APP_SINK(self.appsink), &callbacks, (__bridge gpointer)(self), NULL);

    main_loop = g_main_loop_new (context, FALSE);
    g_main_loop_run (main_loop);

    /* Free resources */
    g_main_loop_unref (main_loop);
    main_loop = NULL;
    g_main_context_pop_thread_default(context);
    g_main_context_unref (context);
    gst_element_set_state (GST_ELEMENT (self.pipeline), GST_STATE_NULL);
    gst_object_unref (GST_OBJECT (self.pipeline));
}

@end
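The sending side is not shown above. For testing, a sender along these lines should produce a compatible stream; this is purely an assumption for illustration (videotestsrc as source, the host address, and config-interval=1, which makes rtph264pay resend SPS/PPS periodically so the receiver can build its CMVideoFormatDescription, are all placeholders, not from the original post):

/* assumes gst_init() has already been called */
GError *error = NULL;
GstElement *sender = gst_parse_launch(
    "videotestsrc ! x264enc tune=zerolatency ! rtph264pay config-interval=1 "
    "! udpsink host=192.168.0.10 port=5000", &error);
gst_element_set_state(sender, GST_STATE_PLAYING);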

This is what I get when I run the app and start streaming to the iOS device:

NALU with Type "Sequence parameter set (non-VCL)" received.
NALU with Type "Picture parameter set (non-VCL)" received.

Found all data for CMVideoFormatDescription. Creation: successfully..

NALU with Type "Coded slice of an IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status:unknown

NALU with Type "Coded slice of a non-IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status:rendering
[...] (repetition of the last 5 lines)

So it seems to decode the way it should, but my problem is that I cannot see anything in my AVSampleBufferDisplayLayer. It might be an issue with kCMSampleAttachmentKey_DisplayImmediately, but I have set it the way I was told here (see the 'important' note).

Any ideas are welcome ;)

Best answer

It is working now. The length of each NALU must not include the length header itself, so I subtracted 4 from my info.size before using it for my sourceBytes.
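Concretely, only the bytes written into the length field change; a sketch of the corrected lines from new_sample above (dataLength is a name introduced here for clarity):

        /* The 4-byte length header must describe only the NALU payload,
           i.e. exclude the header itself, so subtract the 4 bytes that
           replace the Annex B start code. */
        const size_t dataLength = info.size - 4;
        const uint8_t sourceBytes[] = {(uint8_t)(dataLength >> 24), (uint8_t)(dataLength >> 16),
                                       (uint8_t)(dataLength >> 8), (uint8_t)dataLength};
        status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4);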

Original question on Stack Overflow: https://stackoverflow.com/questions/25980070/
