
ios - Getting PCM data from an HLS stream with AVPlayer


This question seems to have been asked several times over the last few years, but no one has answered it. I am trying to process PCM data from an HLS stream, and I have to use AVPlayer.

This post taps local files: https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/

And this tap works with remote files, but not with .m3u8 HLS files: http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/

I can play the first two tracks in the playlist, but the callbacks needed to get the PCM never fire. When the file is local or remote (not a stream) I can still get the PCM, but it does not work for HLS, and I need HLS to work.

Here is my code:

// avplayer tap try
- (void)viewDidLoad {
    [super viewDidLoad];

    NSURL *testUrl = [NSURL URLWithString:@"http://playlists.ihrhls.com/c5/1469/playlist.m3u8"];

    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:testUrl];
    self.player = [AVPlayer playerWithPlayerItem:item];

    // Watch the status property - when this is good to go, we can access the
    // underlying AVAssetTrack we need.
    [item addObserver:self forKeyPath:@"status" options:0 context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if (![keyPath isEqualToString:@"status"])
        return;

    AVPlayerItem *item = (AVPlayerItem *)object;
    if (item.status != AVPlayerItemStatusReadyToPlay)
        return;

    NSArray *tracks = [self.player.currentItem tracks];
    for (AVPlayerItemTrack *track in tracks) {
        if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio]) {
            NSLog(@"GOT DAT FUCKER");
            [self beginRecordingAudioFromTrack:track.assetTrack];
            [self.player play];
        }
    }
}

// Forward declarations for the C tap callbacks defined further down, so the
// struct assignments below compile without implicit-declaration warnings.
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut);
void finalize(MTAudioProcessingTapRef tap);
void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat);
void unprepare(MTAudioProcessingTapRef tap);
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut);

- (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(
        kCFAllocatorDefault,
        &callbacks,
        kMTAudioProcessingTapCreationFlag_PostEffects,
        &tap
    );

    if (err) {
        NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
        audioMixInputParametersWithTrack:audioTrack];

    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    self.player.currentItem.audioMix = audioMix;
}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                      flagsOut, NULL, numberFramesOut);
    if (err) NSLog(@"Error from GetSourceAudio: %d", (int)err);

    NSLog(@"Process");
}

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    NSLog(@"Initialising the Audio Tap Processor");
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap)
{
    NSLog(@"Finalizing the Audio Tap Processor");
}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat)
{
    NSLog(@"Preparing the Audio Tap Processor");
}

void unprepare(MTAudioProcessingTapRef tap)
{
    NSLog(@"Unpreparing the Audio Tap Processor");
}

void init does get called; void prepare and process need to be called as well, but for the HLS stream they never are.

How can I make this work?
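
One way to narrow down where this breaks, as a minimal diagnostic sketch using only standard AVFoundation (the only assumption is observing the item's "tracks" key path in addition to "status"): log whether any AVPlayerItemTrack of the HLS item ever exposes a non-nil audio assetTrack, since the audio mix input parameters, and therefore the tap, are attached to that assetTrack. If it never shows up, the tap has nothing to hook into.

// Hedged diagnostic sketch, not part of the original code above.
[item addObserver:self forKeyPath:@"tracks" options:0 context:nil];

// ...then, inside observeValueForKeyPath:..., whenever "tracks" changes:
for (AVPlayerItemTrack *track in self.player.currentItem.tracks) {
    NSLog(@"enabled=%d assetTrack=%@ mediaType=%@",
          track.enabled, track.assetTrack, track.assetTrack.mediaType);
}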

Best Answer

I suggest you use the FFmpeg library to handle the HLS stream. It is a bit harder to set up, but gives you much more flexibility. A few years ago I built an HLS player for Android (used this project), and I believe the same approach works on iOS.
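
A minimal sketch of what that can look like with FFmpeg's C API (libavformat/libavcodec/libswresample, using the pre-5.0 channel-layout calls), assuming FFmpeg has been cross-compiled for iOS and linked into the app; the function name, the fixed S16 output format, and the scratch buffer size are illustrative choices, not part of the original answer:

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libswresample/swresample.h>

// Open an HLS URL, decode its audio stream, and convert every frame to
// interleaved signed 16-bit PCM. Error handling is abbreviated.
static void decodeHLSToPCM(const char *url)
{
    avformat_network_init();

    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, url, NULL, NULL) < 0) return;
    avformat_find_stream_info(fmt, NULL);

    // Pick the best audio stream and open a decoder for it.
    int audioIndex = av_find_best_stream(fmt, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);
    AVStream *stream = fmt->streams[audioIndex];
    const AVCodec *codec = avcodec_find_decoder(stream->codecpar->codec_id);
    AVCodecContext *dec = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(dec, stream->codecpar);
    avcodec_open2(dec, codec, NULL);

    // Resampler: whatever the stream carries -> interleaved S16 at the same rate.
    // (Real code should fall back to a default layout when channel_layout is 0.)
    SwrContext *swr = swr_alloc_set_opts(NULL,
        dec->channel_layout, AV_SAMPLE_FMT_S16, dec->sample_rate,
        dec->channel_layout, dec->sample_fmt,   dec->sample_rate, 0, NULL);
    swr_init(swr);

    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    uint8_t pcm[192000]; // scratch space for one converted frame

    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == audioIndex && avcodec_send_packet(dec, pkt) == 0) {
            while (avcodec_receive_frame(dec, frame) == 0) {
                uint8_t *out[] = { pcm };
                int samples = swr_convert(swr, out, frame->nb_samples,
                                          (const uint8_t **)frame->data, frame->nb_samples);
                // `pcm` now holds `samples` interleaved S16 sample frames --
                // hand them to whatever consumes the PCM.
                (void)samples;
            }
        }
        av_packet_unref(pkt);
    }

    av_frame_free(&frame);
    av_packet_free(&pkt);
    swr_free(&swr);
    avcodec_free_context(&dec);
    avformat_close_input(&fmt);
}

The trade-off versus the AVPlayer tap is that playback, buffering, and timing are then also your responsibility; FFmpeg only hands back demuxed, decoded frames.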

Regarding "ios - Getting PCM data from an HLS stream with AVPlayer", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/29040484/
