
objective-c - How do I do two-track editing with AVFoundation?

Reposted · Author: 行者123 · Updated: 2023-11-29 13:30:32

I'm trying to build a video mash-up app. The user needs to be able to set up one track that carries constant video (the main track), plus a second track that acts as a b-roll track, cutting away from the main track from time to time to show related footage. I have the first track working: the clips on the timeline are arranged end to end in a composition using AVMutableVideoCompositionInstruction objects, but I can't figure out how to bring in an independently controlled b-roll track. I've been struggling with this for days! Below is the code that builds the first track's content; right now it dips to black between clips. Can any AVFoundation gurus give me a hint?

CMTime nextClipStartTime = kCMTimeZero;
NSInteger i;
CMTime transitionDuration = CMTimeMakeWithSeconds(1, 30);

// Two video and two audio composition tracks (A/B), plus a music bed.
AVMutableCompositionTrack *compositionVideoTrack[2];
AVMutableCompositionTrack *compositionAudioTrack[2];
compositionVideoTrack[0] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
compositionVideoTrack[1] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
compositionAudioTrack[0] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
compositionAudioTrack[1] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *bedMusicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

i = 0;
NSMutableArray *allAudioParams = [NSMutableArray array];
AVMutableAudioMixInputParameters *audioInputParams[2];
audioInputParams[0] = [AVMutableAudioMixInputParameters audioMixInputParameters];
audioInputParams[1] = [AVMutableAudioMixInputParameters audioMixInputParameters];
[audioInputParams[0] setTrackID:compositionAudioTrack[0].trackID];
[audioInputParams[1] setTrackID:compositionAudioTrack[1].trackID];
float lastVol = 0;
NSMutableArray *instructions = [NSMutableArray array];

for (ClipInfo *info in videoLine.items) {
    AVAsset *asset = [AVAsset assetWithURL:info.url];
    CMTimeRange timeRangeInAsset = CMTimeRangeMake(info.inTime, info.duration);
    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack[0] insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];

    AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack[0] insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];

    // Ramp the previous clip's audio toward this clip's level over the last second.
    if (i != 0) {
        [audioInputParams[0] setVolume:lastVol atTime:CMTimeSubtract(nextClipStartTime, CMTimeMakeWithSeconds(1, 30))];
    }
    [audioInputParams[0] setVolume:info.volume atTime:nextClipStartTime];
    lastVol = info.volume;

    // Pass-through range: the part of the clip not covered by a fade in/out.
    CMTime clipStartTime = (i == 0) ? nextClipStartTime : CMTimeAdd(nextClipStartTime, transitionDuration);
    CMTime clipDuration = (i == 0 || i == (videoLine.items.count - 1))
        ? CMTimeSubtract(timeRangeInAsset.duration, transitionDuration)
        : CMTimeSubtract(timeRangeInAsset.duration, CMTimeMultiply(transitionDuration, 2));
    if ([videoLine.items count] == 1) {
        clipDuration = timeRangeInAsset.duration;
    }

    if (i != 0) {
        // Fade in from black.
        AVMutableVideoCompositionInstruction *inInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        inInstruction.timeRange = CMTimeRangeMake(nextClipStartTime, transitionDuration);
        AVMutableVideoCompositionLayerInstruction *fadeIn = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack[0]];
        [fadeIn setOpacityRampFromStartOpacity:0 toEndOpacity:1 timeRange:CMTimeRangeMake(nextClipStartTime, transitionDuration)];
        inInstruction.layerInstructions = [NSArray arrayWithObject:fadeIn];
        [instructions addObject:inInstruction];
    }

    AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    passThroughInstruction.timeRange = CMTimeRangeMake(clipStartTime, clipDuration);
    AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack[0]];
    passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer];
    [instructions addObject:passThroughInstruction];

    if (i < (videoLine.items.count - 1)) {
        // Fade out to black.
        AVMutableVideoCompositionInstruction *outInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        outInstruction.timeRange = CMTimeRangeMake(CMTimeAdd(clipStartTime, clipDuration), transitionDuration);
        AVMutableVideoCompositionLayerInstruction *fadeOut = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack[0]];
        [fadeOut setOpacityRampFromStartOpacity:1.0 toEndOpacity:0 timeRange:CMTimeRangeMake(CMTimeAdd(clipStartTime, clipDuration), transitionDuration)];
        outInstruction.layerInstructions = [NSArray arrayWithObject:fadeOut];
        [instructions addObject:outInstruction];
    }

    nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
    if (i == ([videoLine.items count] - 1)) {
        [audioInputParams[0] setVolume:info.volume atTime:nextClipStartTime];
    }
    i++;
}

Best Answer

You need to alternate back and forth between video/audio composition tracks "A" and "B" — compositionVideoTrack[0] and compositionVideoTrack[1] in your code. On each iteration of the loop, switch which composition track you insert the clip into.
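A minimal sketch of that alternation, adapted from the questioner's loop: clips are inserted into alternating tracks with an overlap of `transitionDuration`, and a single instruction with two layer instructions decides which track is visible during the overlap. This is an illustration, not a drop-in replacement — names like `alternatingIndex` and `currentVideoTrack` are invented here, the first-clip edge case and audio handling are omitted, and error parameters are left `nil` for brevity as in the original.

```objc
NSInteger clipIndex = 0;
NSInteger alternatingIndex = 0;
CMTime nextClipStartTime = kCMTimeZero;

for (ClipInfo *info in videoLine.items) {
    // Flip between track A (0) and track B (1) on every iteration.
    alternatingIndex = clipIndex % 2;
    AVMutableCompositionTrack *currentVideoTrack = compositionVideoTrack[alternatingIndex];

    AVAsset *asset = [AVAsset assetWithURL:info.url];
    CMTimeRange timeRangeInAsset = CMTimeRangeMake(info.inTime, info.duration);
    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [currentVideoTrack insertTimeRange:timeRangeInAsset
                               ofTrack:clipVideoTrack
                                atTime:nextClipStartTime
                                 error:nil];

    if (clipIndex != 0) {
        // During the overlap both tracks contain media; the layer instructions
        // decide which one shows. A ramp gives a cross-fade; a constant
        // opacity of 1 on the incoming track gives a hard b-roll cut.
        AVMutableVideoCompositionInstruction *transition =
            [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        transition.timeRange = CMTimeRangeMake(nextClipStartTime, transitionDuration);

        AVMutableVideoCompositionLayerInstruction *incoming =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:currentVideoTrack];
        AVMutableVideoCompositionLayerInstruction *outgoing =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack[1 - alternatingIndex]];
        [outgoing setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0
                                       timeRange:transition.timeRange];
        transition.layerInstructions = @[incoming, outgoing];
        [instructions addObject:transition];
    }

    // Start the next clip before this one ends, so the two tracks overlap
    // for exactly transitionDuration.
    nextClipStartTime = CMTimeAdd(nextClipStartTime,
                                  CMTimeSubtract(timeRangeInAsset.duration, transitionDuration));
    clipIndex++;
}
```

For an independently timed b-roll cutaway (rather than a per-clip alternation), the same idea applies: insert the b-roll clip into track B at its own start time, and emit an instruction for that time range whose layer instructions put track B at full opacity above track A.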

Regarding "objective-c - How do I do two-track editing with AVFoundation?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/12015337/

Copyright 2021 - 2024 cfsdn All Rights Reserved 蜀ICP备2022000587号