
objective-c - Changing an AVPlayerItem's videoComposition property (AVMutableVideoComposition) during playback

Reposted · Author: 太空狗 · Updated: 2023-10-30 03:45:37

I am trying to change the layout of a video composition (i.e. the transforms of its component frames) during playback. Sometimes this works, and the video composition switches seamlessly to the new set of transforms; other times it simply gets stuck and keeps the current transforms. There is no status change on the AVPlayer instance, and no error on either the player or the player item.

Has anyone run into this? Any suggestions as to why it happens, or how to work around it, would be greatly appreciated.

Part of the code is shown below. The important line is "playerItem.videoComposition = videoComposition", which here is triggered (for testing purposes) by tapping the video.

An alternative solution would be to display the videos on separate layers, but the videos must stay synchronized, so a composition seems to be the only way to achieve this.
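(For context, the separate-layers alternative mentioned above would mean keeping two independent AVPlayers in lockstep. A rough sketch of what that would involve, using AVPlayer's setRate:time:atHostTime: on a shared host clock — `playerA` and `playerB` are hypothetical players not taken from the question's code, and on newer iOS versions this API also requires opting out of automatic stall handling:)

// Hypothetical sketch: starting two independent players in sync on a
// shared host clock, instead of compositing both tracks into one item.
CMClockRef hostClock = CMClockGetHostTimeClock();
// Schedule playback to begin a moment in the future so both players
// have time to prepare, then start them at the same host time.
CMTime startHostTime = CMTimeAdd(CMClockGetTime(hostClock), CMTimeMake(1, 2));

[playerA setRate:1.0 time:kCMTimeZero atHostTime:startHostTime];
[playerB setRate:1.0 time:kCMTimeZero atHostTime:startHostTime];

(Keeping the players in sync after seeks or stalls is the hard part, which is presumably why the question treats the composition as the only viable route.)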

@implementation VideoView
{
    CGSize _videoSize;
    CMTimeRange _videoFullRange;

    AVMutableCompositionTrack * _compositionTrackVideoA;
    AVMutableCompositionTrack * _compositionTrackVideoB;
}

+ (Class)layerClass
{
    return [AVPlayerLayer class];
}

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if ( self )
    {
        NSString * videoAPath = [[NSBundle mainBundle] pathForResource:@"cam09v2" ofType:@"mp4"];
        NSString * videoBPath = [[NSBundle mainBundle] pathForResource:@"cam10v2_b" ofType:@"mp4"];
        AVURLAsset * videoAAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoAPath] options:nil];
        AVURLAsset * videoBAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoBPath] options:nil];

        AVAssetTrack * videoATrack = [[videoAAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        AVAssetTrack * videoBTrack = [[videoBAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        AVAssetTrack * audioTrack  = [[videoAAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];

        _videoSize = [videoATrack naturalSize];
        CMTime videoDuration = videoAAsset.duration;
        _videoFullRange = CMTimeRangeMake(kCMTimeZero, videoDuration);

        AVMutableComposition * composition = [AVMutableComposition composition];
        AVMutableCompositionTrack * compositionTrackVideoA = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack * compositionTrackVideoB = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack * compositionTrackAudio  = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

        compositionTrackVideoA.preferredTransform = videoATrack.preferredTransform;

        NSError * error = nil;
        if ( ! [compositionTrackVideoA insertTimeRange:_videoFullRange ofTrack:videoATrack atTime:kCMTimeZero error:&error] )
            NSLog(@"%@", error);

        if ( ! [compositionTrackVideoB insertTimeRange:_videoFullRange ofTrack:videoBTrack atTime:kCMTimeZero error:&error] )
            NSLog(@"%@", error);

        if ( ! [compositionTrackAudio insertTimeRange:_videoFullRange ofTrack:audioTrack atTime:kCMTimeZero error:&error] )
            NSLog(@"%@", error);

        _compositionTrackVideoA = [compositionTrackVideoA copy];
        _compositionTrackVideoB = [compositionTrackVideoB copy];

        AVPlayerItem * playerItem = [AVPlayerItem playerItemWithAsset:composition];

        AVPlayer * player = [AVPlayer playerWithPlayerItem:playerItem];

        [(AVPlayerLayer *)self.layer setPlayer:player];

        [player play];

        [player addObserver:self forKeyPath:@"status" options:0 context:0];

        [self updateCompositionForPlayerItem:playerItem];

        UITapGestureRecognizer * tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(didTap:)];
        [self addGestureRecognizer:tapGesture];
    }
    return self;
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if ( [keyPath isEqualToString:@"status"] )
        NSLog(@"STATUS %d", ((AVPlayer *)object).status );
}

- (void)updateCompositionForPlayerItem:(AVPlayerItem *)playerItem
{
    AVMutableVideoComposition * videoComposition = [AVMutableVideoComposition videoComposition];

    AVMutableVideoCompositionInstruction * videoInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoInstruction.enablePostProcessing = NO;
    videoInstruction.timeRange = _videoFullRange;

    AVMutableVideoCompositionLayerInstruction * layerInstructionA = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:_compositionTrackVideoA];
    CGAffineTransform transformA = CGAffineTransformMakeScale(0.5, 0.5);
    [layerInstructionA setTransform:transformA atTime:kCMTimeZero];

    AVMutableVideoCompositionLayerInstruction * layerInstructionB = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:_compositionTrackVideoB];
    CGAffineTransform transformB = CGAffineTransformMakeScale(0.5, 0.5);
    static int i = 0;
    transformB = CGAffineTransformTranslate(transformB, (i++ % 2 == 0) ? _videoSize.width : 0, _videoSize.height);
    [layerInstructionB setTransform:transformB atTime:kCMTimeZero];

    videoInstruction.layerInstructions = [NSArray arrayWithObjects:layerInstructionA, layerInstructionB, nil];

    videoComposition.instructions = [NSArray arrayWithObject:videoInstruction];

    videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
    videoComposition.renderSize = _videoSize;

    playerItem.videoComposition = videoComposition;
}

- (void)didTap:(UITapGestureRecognizer *)tapGesture
{
    [self updateCompositionForPlayerItem:((AVPlayerLayer *)self.layer).player.currentItem];
}

@end

Best Answer

You can save the time at which you want to change it, replace the player item with one that uses the new video composition, and restart the player with the new item from the time where playback stopped.
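(The workaround described above could be sketched roughly as follows, reusing the question's -updateCompositionForPlayerItem: method. This is a minimal sketch, not code from the answer; `swapComposition` is a hypothetical method name, and it assumes the AVPlayerLayer/player setup from the question:)

// Instead of mutating the live item's videoComposition, rebuild the
// player item and resume from the saved time.
- (void)swapComposition
{
    AVPlayer * player = ((AVPlayerLayer *)self.layer).player;

    // 1. Remember where playback stopped.
    CMTime resumeTime = player.currentItem.currentTime;

    // 2. Build a fresh item from the same composition asset and attach
    //    the new video composition before it starts playing.
    AVPlayerItem * newItem = [AVPlayerItem playerItemWithAsset:player.currentItem.asset];
    [self updateCompositionForPlayerItem:newItem];

    // 3. Swap items and restart from the saved time.
    [player replaceCurrentItemWithPlayerItem:newItem];
    [player seekToTime:resumeTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
    [player play];
}

(The question's -didTap: would then call this method instead of updating the current item in place.)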

For this question (objective-c - changing an AVPlayerItem's videoComposition property (AVMutableVideoComposition) during playback), a similar question was found on Stack Overflow: https://stackoverflow.com/questions/10663736/
