
ios - Adding a text overlay with AVMutableVideoComposition to a specific time range


I've found a few examples that show how to add a text overlay to a video.

Ray's tutorial - http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos

This SO answer - How can I add a watermark in a captured video on iOS

I've referenced a few others as well. My code looks almost identical to that answer, and I've been trying to adjust it so the text overlay is shown only for the first second or two of the video. Any help on how I can do that?

Here is my code. It exports the video as-is, with the overlay shown for the entire duration.

// Insert the source video track into the composition, preserving its orientation.
if (vA) {
    videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoCompositionTrack insertTimeRange:videoTimerange ofTrack:[[vA tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];
    [videoCompositionTrack setPreferredTransform:[[[vA tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];
    if (error)
        NSLog(@"%@", error);
}

// Insert the source audio track into the composition.
if (aA) {
    audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioCompositionTrack insertTimeRange:audioTimerange ofTrack:[[aA tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&error];
    if (error)
        NSLog(@"%@", error);
}

CGSize videoSize = [videoCompositionTrack naturalSize];

UIImage *myImage = [UIImage imageNamed:@"cover.png"];
CALayer *aLayer = [CALayer layer];
aLayer.contents = (id)myImage.CGImage;
aLayer.frame = CGRectMake(5, 25, 100, 56);
aLayer.opacity = 0.7;

CATextLayer *titleLayer = [CATextLayer layer];
titleLayer.string = titleText;
titleLayer.fontSize = 18;
titleLayer.foregroundColor = titleColor.CGColor;
titleLayer.alignmentMode = kCAAlignmentCenter;
titleLayer.frame = CGRectMake(20, 10, videoSize.width - 40, 20);
[titleLayer displayIfNeeded];

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:aLayer];
if (titleText && [titleText length] > 0) {
    [parentLayer addSublayer:titleLayer];
}

AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];
videoComp.renderSize = videoSize;
videoComp.frameDuration = CMTimeMake(1, 30);
videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
[instruction setTimeRange:CMTimeRangeMake(kCMTimeZero, [mixComposition duration])];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComp.instructions = [NSArray arrayWithObject:instruction];

AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
_assetExport.videoComposition = videoComp;
_assetExport.outputFileType = AVFileTypeMPEG4;
_assetExport.outputURL = outputFileUrl;

[_assetExport exportAsynchronouslyWithCompletionHandler:^{
    AVAssetExportSessionStatus status = [_assetExport status];
    switch (status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export Failed");
            NSLog(@"Export Error: %@", [_assetExport.error localizedDescription]);
            NSLog(@"Export Error Reason: %@", [_assetExport.error localizedFailureReason]);
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export Completed");
            [self performSelectorOnMainThread:@selector(updateProgressIndicator:) withObject:[NSNumber numberWithFloat:2] waitUntilDone:YES];
            break;
        case AVAssetExportSessionStatusUnknown:
            NSLog(@"Export Unknown");
            break;
        case AVAssetExportSessionStatusExporting:
            NSLog(@"Export Exporting");
            break;
        case AVAssetExportSessionStatusWaiting:
            NSLog(@"Export Waiting");
            break;
    }
}];

Best answer

I figured out what I needed to do. It really was nothing special; it just took a better understanding of what's possible.

Basically, all I had to do was add a basic opacity animation to the layer that contains the text.

// My original code for creating the text layer
CATextLayer *titleLayer = [CATextLayer layer];
.
.
.
[titleLayer displayIfNeeded];

// the code for the opacity animation which then removes the text
CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"opacity"];
[animation setDuration:0];
[animation setFromValue:[NSNumber numberWithFloat:1.0]];
[animation setToValue:[NSNumber numberWithFloat:0.0]];
[animation setBeginTime:1];
[animation setRemovedOnCompletion:NO];
[animation setFillMode:kCAFillModeForwards];
[titleLayer addAnimation:animation forKey:@"animateOpacity"];
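
As a follow-up, if the overlay should appear only during a later window of the video rather than at the very start, the same idea extends to a pair of opacity animations. The sketch below is not from the original answer: it assumes the same titleLayer and an illustrative 2 s to 4 s window. Note that Core Animation treats a beginTime of 0 as "now", so AVCoreAnimationBeginTimeAtZero should be used for any animation that must start exactly at the beginning of the composition.

// Sketch only: show the text roughly between the 2 s and 4 s marks of the exported video.
titleLayer.opacity = 0.0; // start hidden until the fade-in fires

// Fade the text in at the 2-second mark.
CABasicAnimation *fadeIn = [CABasicAnimation animationWithKeyPath:@"opacity"];
fadeIn.fromValue = @0.0;
fadeIn.toValue = @1.0;
fadeIn.duration = 0.25;   // short fade; shrink toward 0 for an instant cut
fadeIn.beginTime = 2.0;   // seconds on the video timeline; use AVCoreAnimationBeginTimeAtZero instead of 0
fadeIn.removedOnCompletion = NO;
fadeIn.fillMode = kCAFillModeForwards;
[titleLayer addAnimation:fadeIn forKey:@"fadeIn"];

// Fade the text back out at the 4-second mark.
CABasicAnimation *fadeOut = [CABasicAnimation animationWithKeyPath:@"opacity"];
fadeOut.fromValue = @1.0;
fadeOut.toValue = @0.0;
fadeOut.duration = 0.25;
fadeOut.beginTime = 4.0;
fadeOut.removedOnCompletion = NO;
fadeOut.fillMode = kCAFillModeForwards;
[titleLayer addAnimation:fadeOut forKey:@"fadeOut"];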

Regarding "ios - Adding a text overlay with AVMutableVideoComposition to a specific time range", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/21684549/
