
ios - Looping a video in AVFoundation's AVSampleBufferDisplayLayer

Reposted · Author: 可可西里 · Updated: 2023-11-01 04:57:18

I am trying to loop a video on an AVSampleBufferDisplayLayer. I can play it through once with no problem, but when I try to loop it, playback does not continue.

According to the answers to AVFoundation to reproduce a video loop, there is no way to rewind an AVAssetReader, so I re-create it. (I did see the answers to Looping a video with AVFoundation AVPlayer?, but AVPlayer is more full-featured. I am reading from a file, and I still want an AVSampleBufferDisplayLayer.)

One hypothesis is that I need to strip off some H264 headers, but I don't know whether that would help (or how to do it). Another is that it has something to do with the CMTimebase; I have tried several variations of that with no luck.
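For reference, one commonly suggested variant of the CMTimebase idea is to flush the display layer and rewind its timebase at the moment the reader runs dry, before enqueueing frames from the freshly created reader. A minimal sketch of that else-branch follows; the `flush` call and the timebase reset are an assumed fix for illustration, not something the original post confirms to work:

```objc
// Sketch: inside the else-branch that detects the reader has finished.
// Assumption: flushing discards stale frames and rewinding the timebase
// lets the re-enqueued samples (whose timestamps start over) display again.
[_videoLayer flush];                                           // drop pending frames
CMTimebaseSetTime(_videoLayer.controlTimebase, kCMTimeZero);   // rewind the layer's clock
CMTimebaseSetRate(_videoLayer.controlTimebase, 1.0);           // keep it running

// AVAssetReader is single-use, so recreate it and start over.
assetReaderVideo = [self createAssetReader:asset];
outVideo = (AVAssetReaderTrackOutput *)[assetReaderVideo outputs][0];
[assetReaderVideo startReading];
```

Note that `stopRequestingMediaData` would also have to be avoided here, since the callback must keep firing for the next pass.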

The code below is based on Apple's WWDC talk on direct access to video encoding:

- (void)viewDidLoad {
    [super viewDidLoad];

    NSString *filepath = [[NSBundle mainBundle] pathForResource:@"sample-mp4" ofType:@"mp4"];
    NSURL *fileURL = [NSURL fileURLWithPath:filepath];
    AVAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];

    UIView *view = self.view;

    self.videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    self.videoLayer.bounds = view.bounds;
    self.videoLayer.position = CGPointMake(CGRectGetMidX(view.bounds), CGRectGetMidY(view.bounds));
    self.videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    self.videoLayer.backgroundColor = [[UIColor greenColor] CGColor];

    CMTimebaseRef controlTimebase;
    CMTimebaseCreateWithMasterClock(CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase);

    self.videoLayer.controlTimebase = controlTimebase;
    CMTimebaseSetTime(self.videoLayer.controlTimebase, CMTimeMake(5, 1));
    CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);

    [[view layer] addSublayer:_videoLayer];

    dispatch_queue_t assetQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0); // ??? right queue?

    __block AVAssetReader *assetReaderVideo = [self createAssetReader:asset];
    __block AVAssetReaderTrackOutput *outVideo = [assetReaderVideo outputs][0];
    if ([assetReaderVideo startReading])
    {
        [_videoLayer requestMediaDataWhenReadyOnQueue:assetQueue usingBlock:^{
            while ([_videoLayer isReadyForMoreMediaData])
            {
                CMSampleBufferRef sampleVideo;
                if (([assetReaderVideo status] == AVAssetReaderStatusReading) && (sampleVideo = [outVideo copyNextSampleBuffer])) {
                    [_videoLayer enqueueSampleBuffer:sampleVideo];
                    CFRelease(sampleVideo);
                    CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                }
                else {
                    [_videoLayer stopRequestingMediaData];
                    //CMTimebaseSetTime(_videoLayer.controlTimebase, CMTimeMake(5, 1));
                    //CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);
                    //CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                    assetReaderVideo = [self createAssetReader:asset];
                    outVideo = [assetReaderVideo outputs][0];
                    [assetReaderVideo startReading];
                    //sampleVideo = [outVideo copyNextSampleBuffer];

                    //[_videoLayer enqueueSampleBuffer:sampleVideo];
                }
            }
        }];
    }
}

- (AVAssetReader *)createAssetReader:(AVAsset *)asset {
    NSError *error = nil;

    AVAssetReader *assetReaderVideo = [[AVAssetReader alloc] initWithAsset:asset error:&error];

    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetReaderTrackOutput *outVideo = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTracks[0] outputSettings:nil];

    [assetReaderVideo addOutput:outVideo];
    return assetReaderVideo;
}

Thanks very much.

Best Answer

Try doing the looping in Swift and then bridging the Objective-C file with the Swift file. Google has plenty of answers about bridging and about looping, so just search for them together with swift.
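For what it's worth, another approach that shows up in answers to the linked question is to leave the control timebase running and instead re-stamp each sample buffer with a per-pass offset via CMSampleBufferCreateCopyWithNewTiming, so every loop's frames land after the previous pass. A hedged sketch; the `loopOffset` bookkeeping (accumulating the asset's duration each pass) is illustrative, not from the original post:

```objc
// Sketch: copy a sample buffer with its timing shifted by loopOffset, so the
// layer's timebase can run continuously across loop passes. The caller is
// responsible for releasing the returned buffer after enqueueing it.
static CMSampleBufferRef CreateRetimedSampleBuffer(CMSampleBufferRef sample,
                                                   CMTime loopOffset) {
    CMItemCount count = 0;
    // First call asks how many timing entries the buffer has.
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);
    CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, timing, &count);

    for (CMItemCount i = 0; i < count; i++) {
        timing[i].presentationTimeStamp =
            CMTimeAdd(timing[i].presentationTimeStamp, loopOffset);
        timing[i].decodeTimeStamp =
            CMTimeAdd(timing[i].decodeTimeStamp, loopOffset);
    }

    CMSampleBufferRef retimed = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample,
                                          count, timing, &retimed);
    free(timing);
    return retimed;
}
```

On each pass through the file, `loopOffset` would be incremented by the asset's duration, and the retimed copy enqueued in place of the original buffer.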

About ios - Looping a video in AVFoundation's AVSampleBufferDisplayLayer: we found a similar question on Stack Overflow: https://stackoverflow.com/questions/28616624/
