
opencv - HTTP Live Streaming AVAsset

Reposted · Author: 太空宇宙 · Updated: 2023-11-03 22:59:20

I am implementing an HTTP Live Streaming player on OS X using AVPlayer. I can stream correctly, seek, get the duration, and so on. Now I want to take screenshots and process the frames with OpenCV. I tried to use AVAssetImageGenerator, but the AVAsset associated with player.currentItem has no audio or video tracks.

The tracks do appear in player.currentItem.tracks, so I cannot use AVAssetImageGenerator. Can anyone suggest a way to extract screenshots and individual frames in this situation?

Below is the code I use to start the HTTP live stream.

Thanks in advance.

NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
playeritem = [AVPlayerItem playerItemWithURL:url];

// Observe the item and player so we are notified when playback becomes ready.
[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext];
[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]];
[self addObserver:self forKeyPath:@"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext];

// Attach a player layer to the view; keep it hidden until it is ready for display.
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay];
[self.player play];

Here is how I check whether the asset has any video tracks:

case AVPlayerItemStatusReadyToPlay:

    [self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
        [[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)];
        NSLog(@"%f,%f,%f", [self currentTime], [self duration], [[self player] rate]);
        AVPlayerItem *item = playeritem;
        if (item.status == AVPlayerItemStatusReadyToPlay)
        {
            AVAsset *asset = (AVAsset *)item.asset;
            long audiotracks = [[asset tracks] count];
            long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions] count];

            NSLog(@"Track info Audio = %ld, Video = %ld", audiotracks, videotracks);
        }
    }]];
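For reference, the interval passed to addPeriodicTimeObserverForInterval: is a CMTime, i.e. a rational number value/timescale, so CMTimeMake(1, 10) fires roughly every 0.1 s. A minimal Python model of that representation (illustrative only, not Apple's API):

```python
from fractions import Fraction

def cmtime_seconds(value, timescale):
    """Model of CMTime: time is stored as the rational value/timescale;
    CMTimeGetSeconds is simply that division."""
    return Fraction(value, timescale)

# CMTimeMake(1, 10) -> one tenth of a second
interval = cmtime_seconds(1, 10)
# CMTimeMake(600, 600) -> exactly one second
one_second = cmtime_seconds(600, 600)
```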



AVPlayerItem *item = self.player.currentItem;
if (item.status != AVPlayerItemStatusReadyToPlay)
    return;

AVURLAsset *asset = (AVURLAsset *)item.asset;
long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio] count];
long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo] count];

NSLog(@"Track info Audio = %ld, Video = %ld", audiotracks, videotracks);

Best Answer

This is an older question, but in case anyone needs help, I have an answer.

AVURLAsset *asset = /* Your Asset here! */;
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * /* Put the FPS of the source video here */; i++) {
    @autoreleasepool {
        CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */);

        NSError *err;
        CMTime actualTime;
        CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];

        // Do what you want with the image, for example save it as a UIImage
        UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];

        CGImageRelease(image);
    }
}
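The loop above visits frame i at time CMTimeMake(i, fps), i.e. i/fps seconds. The same index-to-timestamp arithmetic can be sketched in Python (the duration and fps values are placeholders, not from the original post):

```python
def frame_timestamps(duration_seconds, fps):
    """Timestamp (in seconds) of each frame, mirroring CMTimeMake(i, fps):
    value = i, timescale = fps, so the time is i / fps."""
    total_frames = int(duration_seconds * fps)
    return [i / fps for i in range(total_frames)]

# A 2-second clip at 30 fps has 60 frames; the last one sits at 59/30 s.
stamps = frame_timestamps(2.0, 30)
```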

You can easily get the FPS of the video with the following code:

float fps = 0.00;
if (asset) {
    AVAssetTrack *videoATrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    if (videoATrack)
    {
        fps = [videoATrack nominalFrameRate];
    }
}

Hope this helps anyone asking how to get all the frames of a video, or just specific ones (at a given CMTime, for example). Keep in mind that saving all the frames into an array can hit memory hard!
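To put that memory warning in numbers: a decoded RGBA frame costs width × height × 4 bytes, so even a short clip adds up quickly (the resolution and frame rate below are illustrative):

```python
def frames_memory_bytes(width, height, frame_count, bytes_per_pixel=4):
    """Approximate memory needed to hold decoded frames uncompressed (RGBA)."""
    return width * height * bytes_per_pixel * frame_count

# 10 seconds of 1920x1080 video at 30 fps, kept uncompressed in an array:
total = frames_memory_bytes(1920, 1080, 10 * 30)
gib = total / 2**30  # roughly 2.3 GiB
```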

About "opencv - HTTP Live Streaming AVAsset": we found a similar question on Stack Overflow: https://stackoverflow.com/questions/22217195/
