ios - Blank frames when merging videos using AVMutableComposition

This question has been asked many times before, but nothing has helped. I am merging multiple videos using AVMutableComposition. After merging, I get blank frames in 30%-40% of the resulting videos; the others merge fine. I am just playing the composition directly with AVPlayer as an AVPlayerItem. The code is below:

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];

    NSMutableArray *instructions = [NSMutableArray new];
    CGSize size = CGSizeZero;

    CMTime time = kCMTimeZero;
    for (AVURLAsset *asset in assets)
    {
        AVAssetTrack *assetTrack;
        assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

        NSError *error;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];

        if (error) {
            NSLog(@"asset url :: %@", assetTrack.asset);
            NSLog(@"Error - %@", error.debugDescription);
        }

        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:time
                                         error:&error];

        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }

        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        videoCompositionInstruction.layerInstructions = @[[AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack]];
        [instructions addObject:videoCompositionInstruction];

        time = CMTimeAdd(time, assetTrack.timeRange.duration);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }
    }

    AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    mutableVideoComposition.instructions = instructions;
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
    mutableVideoComposition.renderSize = size;

    playerItem = [AVPlayerItem playerItemWithAsset:mutableComposition];
    playerItem.videoComposition = mutableVideoComposition;

Best Answer

As far as I know, an AVMutableVideoCompositionLayerInstruction cannot simply be "appended" or "added" the way your code does it.

From your code, I guess you want to preserve the video instruction info while merging the video assets, but instructions cannot be "copied" over directly like that.

If that is what you want to do, see the documentation for AVVideoCompositionLayerInstruction, e.g.

    getTransformRampForTime:startTransform:endTransform:timeRange:
    setTransformRampFromStartTransform:toEndTransform:timeRange:
    setTransform:atTime:

    getOpacityRampForTime:startOpacity:endOpacity:timeRange:
    setOpacityRampFromStartOpacity:toEndOpacity:timeRange:
    setOpacity:atTime:

    getCropRectangleRampForTime:startCropRectangle:endCropRectangle:timeRange:
    setCropRectangleRampFromStartCropRectangle:toEndCropRectangle:timeRange:
    setCropRectangle:atTime:

You should use the getFoo... methods on the source assets' layer instructions, then compute the insert time and timeRange for the final track, call the matching setFoo... methods, and finally append the result to the layerInstructions of the final videoComposition.
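
For example, a minimal sketch of carrying one transform ramp over into the merged composition (sourceLayerInstruction, destLayerInstruction, and insertTime are hypothetical names for illustration, not from the code above):

    // Sketch: copy a transform ramp from a source layer instruction into the
    // merged composition, shifted to where this asset sits in the merged timeline.
    // sourceLayerInstruction, destLayerInstruction and insertTime are placeholders.
    CGAffineTransform startTransform, endTransform;
    CMTimeRange rampRange;
    if ([sourceLayerInstruction getTransformRampForTime:kCMTimeZero
                                         startTransform:&startTransform
                                           endTransform:&endTransform
                                              timeRange:&rampRange]) {
        CMTimeRange shiftedRange = CMTimeRangeMake(CMTimeAdd(rampRange.start, insertTime),
                                                   rampRange.duration);
        [destLayerInstruction setTransformRampFromStartTransform:startTransform
                                                  toEndTransform:endTransform
                                                       timeRange:shiftedRange];
    }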

Yes, it is a bit complicated... and, most importantly, you cannot recover every video effect that was applied to the source assets this way.

So what is your goal, and how were your source assets produced?

If you just want to merge some mp4/mov files, simply loop over the tracks and append them to an AVMutableCompositionTrack without a videoComposition (see the sketch below). I tested your code and it works.
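
A minimal sketch of that plain merge, assuming an assets array of AVURLAssets like the one in your code:

    // Sketch: plain concatenation, no videoComposition involved.
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = kCMTimeZero;
    for (AVURLAsset *asset in assets) {
        AVAssetTrack *video = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *audio = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        if (video) {
            [videoTrack insertTimeRange:range ofTrack:video atTime:cursor error:NULL];
        }
        if (audio) {
            [audioTrack insertTimeRange:range ofTrack:audio atTime:cursor error:NULL];
        }
        cursor = CMTimeAdd(cursor, asset.duration);
    }

    // Play the composition directly; no videoComposition is set on the item.
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];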

If you want to merge AVAssets that carry video instructions, see the explanation above and the docs. My best practice is to save those AVAssets to files with AVAssetExportSession before merging, and then merge the resulting video files.
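
A minimal export sketch, assuming hypothetical sourceAsset, sourceVideoComposition, and outputURL values:

    // Sketch: bake an asset plus its video instructions into a flat file first.
    // sourceAsset, sourceVideoComposition and outputURL are placeholders.
    AVAssetExportSession *exporter =
        [AVAssetExportSession exportSessionWithAsset:sourceAsset
                                          presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.videoComposition = sourceVideoComposition; // instructions get rendered in
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exporter.status) {
            // The exported .mov can now be merged with the plain loop above.
        }
    }];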

P.S. There may also be something wrong with your test files or source assets.

Here is some code from my project (a Vine-like app):

    - (BOOL)generateComposition
    {
        [self cleanComposition];

        NSUInteger segmentsCount = self.segmentsCount;
        if (0 == segmentsCount) {
            return NO;
        }

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableVideoComposition *videoComposition = nil;
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = nil;
        AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = nil;
        AVMutableAudioMix *audioMix = nil;

        AVMutableCompositionTrack *videoTrack = nil;
        AVMutableCompositionTrack *audioTrack = nil;
        AVMutableCompositionTrack *musicTrack = nil;
        CMTime currentTime = kCMTimeZero;

        for (MVRecorderSegment *segment in self.segments) {
            AVURLAsset *asset = segment.asset;
            NSArray *videoAssetTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
            NSArray *audioAssetTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

            CMTime maxBounds = kCMTimeInvalid;

            CMTime videoTime = currentTime;
            for (AVAssetTrack *videoAssetTrack in videoAssetTracks) {
                if (!videoTrack) {
                    videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
                    videoTrack.preferredTransform = CGAffineTransformIdentity;

                    videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
                    videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
                }

                /* Fix orientation */
                CGAffineTransform transform = videoAssetTrack.preferredTransform;
                if (AVCaptureDevicePositionFront == segment.cameraPosition) {
                    transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0);
                    transform = CGAffineTransformScale(transform, -1.0, 1.0);
                } else if (AVCaptureDevicePositionBack == segment.cameraPosition) {

                }
                [videoCompositionLayerInstruction setTransform:transform atTime:videoTime];

                /* Append track */
                videoTime = [MVHelper appendAssetTrack:videoAssetTrack toCompositionTrack:videoTrack atTime:videoTime withBounds:maxBounds];
                maxBounds = videoTime;
            }

            if (self.sessionConfiguration.originalVoiceOn) {
                CMTime audioTime = currentTime;
                for (AVAssetTrack *audioAssetTrack in audioAssetTracks) {
                    if (!audioTrack) {
                        audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                    }
                    audioTime = [MVHelper appendAssetTrack:audioAssetTrack toCompositionTrack:audioTrack atTime:audioTime withBounds:maxBounds];
                }
            }

            currentTime = composition.duration;
        }

        if (videoCompositionInstruction && videoCompositionLayerInstruction) {
            videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
            videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];

            videoComposition = [AVMutableVideoComposition videoComposition];
            videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize);
            videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate);
            videoComposition.instructions = @[videoCompositionInstruction];
        }

        // Add the background music track (musicTrack)
        NSURL *musicFileURL = self.sessionConfiguration.musicFileURL;
        if (musicFileURL && musicFileURL.isFileExists) {
            AVAsset *musicAsset = [AVAsset assetWithURL:musicFileURL];
            AVAssetTrack *musicAssetTrack = [musicAsset tracksWithMediaType:AVMediaTypeAudio].firstObject;
            if (musicAssetTrack) {
                musicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                if (CMTIME_COMPARE_INLINE(musicAsset.duration, >=, composition.duration)) {
                    // If the background music is longer than the whole video, insert it directly
                    [musicTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, composition.duration) ofTrack:musicAssetTrack atTime:kCMTimeZero error:NULL];
                } else {
                    // Otherwise, loop the background music
                    CMTime musicTime = kCMTimeZero;
                    CMTime bounds = composition.duration;
                    while (true) {
                        musicTime = [MVHelper appendAssetTrack:musicAssetTrack toCompositionTrack:musicTrack atTime:musicTime withBounds:bounds];
                        if (CMTIME_COMPARE_INLINE(musicTime, >=, composition.duration)) {
                            break;
                        }
                    }
                }
            }
        }

        // Mix the audio
        if (musicTrack) {
            AVMutableAudioMixInputParameters *audioMixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack];

            /* Fade the background music in and out */
            AVAsset *musicAsset = musicTrack.asset;
            CMTime crossfadeDuration = CMTimeMake(15, 10); // 1.5 seconds at each end
            CMTime halfDuration = CMTimeMultiplyByFloat64(musicAsset.duration, 0.5);
            crossfadeDuration = CMTimeMinimum(crossfadeDuration, halfDuration);
            CMTimeRange crossfadeRangeBegin = CMTimeRangeMake(kCMTimeZero, crossfadeDuration);
            CMTimeRange crossfadeRangeEnd = CMTimeRangeMake(CMTimeSubtract(musicAsset.duration, crossfadeDuration), crossfadeDuration);
            [audioMixParameters setVolumeRampFromStartVolume:0.0 toEndVolume:self.sessionConfiguration.musicVolume timeRange:crossfadeRangeBegin];
            [audioMixParameters setVolumeRampFromStartVolume:self.sessionConfiguration.musicVolume toEndVolume:0.0 timeRange:crossfadeRangeEnd];

            audioMix = [AVMutableAudioMix audioMix];
            [audioMix setInputParameters:@[audioMixParameters]];
        }

        _composition = composition;
        _videoComposition = videoComposition;
        _audioMix = audioMix;

        return YES;
    }


    - (AVPlayerItem *)playerItem
    {
        AVPlayerItem *playerItem = nil;
        if (self.composition) {
            playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
            if (!self.videoComposition.animationTool) {
                playerItem.videoComposition = self.videoComposition;
            }
            playerItem.audioMix = self.audioMix;
        }
        return playerItem;
    }
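
The item is then handed to an AVPlayer in the usual way, e.g. (a trivial usage sketch):

    // Usage sketch: play the generated composition.
    AVPlayerItem *item = [self playerItem];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    [player play];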

///=============================================
/// MVHelper
///=============================================

    + (CMTime)appendAssetTrack:(AVAssetTrack *)track toCompositionTrack:(AVMutableCompositionTrack *)compositionTrack atTime:(CMTime)atTime withBounds:(CMTime)bounds
    {
        CMTimeRange timeRange = track.timeRange;
        atTime = CMTimeAdd(atTime, timeRange.start);

        if (!track || !compositionTrack) {
            return atTime;
        }

        if (CMTIME_IS_VALID(bounds)) {
            CMTime currentBounds = CMTimeAdd(atTime, timeRange.duration);
            if (CMTIME_COMPARE_INLINE(currentBounds, >, bounds)) {
                timeRange = CMTimeRangeMake(timeRange.start, CMTimeSubtract(timeRange.duration, CMTimeSubtract(currentBounds, bounds)));
            }
        }
        if (CMTIME_COMPARE_INLINE(timeRange.duration, >, kCMTimeZero)) {
            NSError *error = nil;
            [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error];
            if (error) {
                MVLog(@"Failed to append %@ track: %@", compositionTrack.mediaType, error);
            }
            return CMTimeAdd(atTime, timeRange.duration);
        }

        return atTime;
    }

Regarding "ios - Blank frames when merging videos using AVMutableComposition", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/30371680/
