
ios - Deep copy of an audio CMSampleBuffer


I am trying to create a copy of the CMSampleBuffer returned by captureOutput in an AVCaptureAudioDataOutputSampleBufferDelegate.

The problem I am having is that the frames coming from the delegate method captureOutput:didOutputSampleBuffer:fromConnection: get dropped after I keep them in a CFArray for a long time.

Obviously, I need to create deep copies of the incoming buffers for further processing. I also know that CMSampleBufferCreateCopy only creates a shallow copy.
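(For reference, that shallow copy is a single call; below is a minimal Swift sketch, assuming a sampleBuffer from the delegate callback and the renamed Core Media overlay of recent SDKs. The resulting buffer still shares the original's CMBlockBuffer, so the audio bytes themselves are not duplicated.)

var shallowCopy: CMSampleBuffer?
// Shallow copy: shares the original CMBlockBuffer rather than duplicating the audio data.
let status = CMSampleBufferCreateCopy(allocator: kCFAllocatorDefault,
                                      sampleBuffer: sampleBuffer,
                                      sampleBufferOut: &shallowCopy)
assert(status == noErr)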

There are a few related questions on SO, but none of them helped me get the 12-argument CMSampleBufferCreate function right:

CMSampleBufferRef copyBuffer;

CMBlockBufferRef data = CMSampleBufferGetDataBuffer(sampleBuffer);
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
CMItemCount itemCount = CMSampleBufferGetNumSamples(sampleBuffer);

CMTime duration = CMSampleBufferGetDuration(sampleBuffer);
CMTime presentationStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMSampleTimingInfo timingInfo;
timingInfo.duration = duration;
timingInfo.presentationTimeStamp = presentationStamp;
timingInfo.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer);

size_t sampleSize = CMBlockBufferGetDataLength(data);
CMBlockBufferRef sampleData;

if (CMBlockBufferCopyDataBytes(data, 0, sampleSize, &sampleData) != kCMBlockBufferNoErr) {
    VLog(@"error during copying sample buffer");
}

// Here I tried both the data and sampleData CMBlockBuffer instances, but no success
OSStatus status = CMSampleBufferCreate(kCFAllocatorDefault, data, isDataReady, nil, nil, formatDescription, itemCount, 1, &timingInfo, 1, &sampleSize, &copyBuffer);

if (!self.sampleBufferArray) {
    self.sampleBufferArray = CFArrayCreateMutable(NULL, 0, &kCFTypeArrayCallBacks);
    // EXC_BAD_ACCESS crash when trying to add sampleBuffer to the array
    CFArrayAppendValue(self.sampleBufferArray, copyBuffer);
} else {
    CFArrayAppendValue(self.sampleBufferArray, copyBuffer);
}

How do I deep copy an audio CMSampleBuffer? Feel free to use any language (Swift/Objective-C) in your answer.

Best Answer

Here is the working solution I finally implemented. I sent this snippet to Apple Developer Technical Support and asked them to check whether it is a correct way to copy the incoming sample buffer. The basic idea is to copy the AudioBufferList, then create a new CMSampleBuffer and set the AudioBufferList as its data.

AudioBufferList audioBufferList;
CMBlockBufferRef blockBuffer;
// Create an AudioBufferList containing the data from the CMSampleBuffer,
// and a CMBlockBuffer which references the data in that AudioBufferList.
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);
NSUInteger size = sizeof(audioBufferList);
char buffer[size];

memcpy(buffer, &audioBufferList, size);
// This is the audio data.
NSData *bufferData = [NSData dataWithBytes:buffer length:size];

const void *copyBufferData = [bufferData bytes];
copyBufferData = (char *)copyBufferData;

CMSampleBufferRef copyBuffer = NULL;
OSStatus status = -1;

/* Format description */
AudioStreamBasicDescription audioFormat = *CMAudioFormatDescriptionGetStreamBasicDescription((CMAudioFormatDescriptionRef) CMSampleBufferGetFormatDescription(sampleBuffer));

CMFormatDescriptionRef format = NULL;
status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &audioFormat, 0, nil, 0, nil, nil, &format);

// Note: formatdes is created here but not used below; the status check guards this call.
CMFormatDescriptionRef formatdes = NULL;
status = CMFormatDescriptionCreate(NULL, kCMMediaType_Audio, 'lpcm', NULL, &formatdes);
if (status != noErr) {
    NSLog(@"Error in CMFormatDescriptionCreate");
    CFRelease(blockBuffer);
    return;
}

/* Create sample buffer */
CMItemCount framesCount = CMSampleBufferGetNumSamples(sampleBuffer);
CMSampleTimingInfo timing = {.duration = CMTimeMake(1, 44100), .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer), .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)};

status = CMSampleBufferCreate(kCFAllocatorDefault, nil, NO, nil, nil, format, framesCount, 1, &timing, 0, nil, &copyBuffer);

if (status != noErr) {
    NSLog(@"Error in CMSampleBufferCreate");
    CFRelease(blockBuffer);
    return;
}

/* Copy BufferList to sample buffer */
AudioBufferList receivedAudioBufferList;
memcpy(&receivedAudioBufferList, copyBufferData, sizeof(receivedAudioBufferList));

// Creates a CMBlockBuffer containing a copy of the data from the
// AudioBufferList.
status = CMSampleBufferSetDataBufferFromAudioBufferList(copyBuffer, kCFAllocatorDefault, kCFAllocatorDefault, 0, &receivedAudioBufferList);
if (status != noErr) {
    NSLog(@"Error in CMSampleBufferSetDataBufferFromAudioBufferList");
    CFRelease(blockBuffer);
    return;
}

The code-level support reply:

This code looks ok (though you’ll want to add some additional error checking). I've successfully tested it in an app that implements the AVCaptureAudioDataOutput delegate captureOutput:didOutputSampleBuffer:fromConnection: method to capture and record audio. The captured audio I'm getting when using this deep copy code appears to be the same as what I get when directly using the provided sample buffer (without the deep copy).

Apple Developer Technical Support
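For anyone working in Swift, the same approach translates roughly as below. This is a sketch rather than the tested code above: it assumes a recent SDK with the renamed Swift Core Media overlay, derives the per-frame duration from the stream's sample rate instead of hard-coding 44100, and the helper name deepCopyAudioSampleBuffer is just an illustrative choice. Like the Objective-C version, it only covers the single-buffer (interleaved) AudioBufferList that CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer fills here.

import AVFoundation
import CoreMedia

// Hypothetical helper, not part of the original answer.
func deepCopyAudioSampleBuffer(_ sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    var audioBufferList = AudioBufferList()
    var blockBuffer: CMBlockBuffer?

    // Fill audioBufferList with pointers into a block buffer that owns the audio data.
    var status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout<AudioBufferList>.size,
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: 0,
        blockBufferOut: &blockBuffer)
    guard status == noErr else { return nil }

    // Rebuild an audio format description from the source buffer's ASBD.
    guard let sourceFormat = CMSampleBufferGetFormatDescription(sampleBuffer),
          var asbd = CMAudioFormatDescriptionGetStreamBasicDescription(sourceFormat)?.pointee
    else { return nil }

    var format: CMFormatDescription?
    status = CMAudioFormatDescriptionCreate(
        allocator: kCFAllocatorDefault,
        asbd: &asbd,
        layoutSize: 0,
        layout: nil,
        magicCookieSize: 0,
        magicCookie: nil,
        extensions: nil,
        formatDescriptionOut: &format)
    guard status == noErr, let format = format else { return nil }

    // Create an empty sample buffer with matching timing, then attach a copy of the data.
    var timing = CMSampleTimingInfo(
        duration: CMTimeMake(value: 1, timescale: Int32(asbd.mSampleRate)),
        presentationTimeStamp: CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
        decodeTimeStamp: CMSampleBufferGetDecodeTimeStamp(sampleBuffer))

    var copy: CMSampleBuffer?
    status = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault,
        dataBuffer: nil,
        dataReady: false,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: format,
        sampleCount: CMSampleBufferGetNumSamples(sampleBuffer),
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleSizeEntryCount: 0,
        sampleSizeArray: nil,
        sampleBufferOut: &copy)
    guard status == noErr, let copy = copy else { return nil }

    // CMSampleBufferSetDataBufferFromAudioBufferList copies the data into a new
    // block buffer owned by `copy`. Keep blockBuffer alive until then, because
    // audioBufferList's mData pointers point into it.
    withExtendedLifetime(blockBuffer) {
        status = CMSampleBufferSetDataBufferFromAudioBufferList(
            copy,
            blockBufferAllocator: kCFAllocatorDefault,
            blockBufferMemoryAllocator: kCFAllocatorDefault,
            flags: 0,
            bufferList: &audioBufferList)
    }
    return status == noErr ? copy : nil
}

A deep copy produced this way can safely be kept in an array after the delegate callback returns, which is what the CFArray in the question was intended for.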

This question, "ios - Deep copy of an audio CMSampleBuffer", comes from a similar question on Stack Overflow: https://stackoverflow.com/questions/46908485/
