
ios - AVAssetExportSession intermittent error -11820 "Cannot Complete Export" Suggestion=Try exporting again


EXPORT STATUS 4 Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo={NSLocalizedDescription=Cannot Complete Export, NSLocalizedRecoverySuggestion=Try exporting again.}

I am getting an intermittent error when attempting to export an AVMutableComposition with an AVMutableVideoComposition containing AVMutableVideoCompositionLayerInstruction(s) using AVAssetExportSession.

The goal is to merge an unlimited number of videos and apply transitions between the clips using layerInstructions.

P.S. The error is not consistent. It works when attempting to merge 5 clips or 18 clips, but fails when attempting to merge 17 clips.

I have posted my code below. Any help is greatly appreciated.

Edit: The problem appears to be related to the creation of multiple AVMutableCompositionTrack(s). If more than 15 or 16 are created, the error occurs. However, I believe creating multiple AVMutableCompositionTracks is necessary in order to overlap all the videos and create the overlapping transitions.

Edit 2: When shorter videos are selected, more of them are processed before the error occurs. So it looks like a memory issue, with tracks being deallocated. However, there appear to be no memory leaks according to the memory management instruments. (A small error-checking sketch follows the code below.)

-(void)prepareMutableCompositionForPlayback{
    AVMutableComposition *mutableComposition = [[AVMutableComposition alloc] init];
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.backgroundColor = [[UIColor blackColor] CGColor];

    NSMutableArray *instructionsArray = [[NSMutableArray alloc] init];

    videoStartTime = kCMTimeZero;

    for(int i = 0; i < videoAssetsArray.count; i++){
        AVAsset *videoAsset = [videoAssetsArray objectAtIndex:i];
        CMTime currentVideoDuration = [videoAsset duration];

        AVMutableCompositionTrack *videoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentVideoDuration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:videoStartTime error:nil];

        CGSize videoSize = [videoTrack naturalSize];

        if([videoAsset tracksWithMediaType:AVMediaTypeAudio].count > 0){
            AVMutableCompositionTrack *audioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentVideoDuration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:videoStartTime error:nil];
        }

        //INSTRUCTIONS - TRANSITIONS
        AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

        int transitionNumber = [[videoTransitionsArray objectAtIndex:i] intValue];
        float transitionDuration = [[videoTransitionsDurationArray objectAtIndex:i] floatValue];

        if(i == 0){
            [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:CMTimeRangeMake(CMTimeSubtract(currentVideoDuration, CMTimeMakeWithSeconds(transitionDuration, 600)), CMTimeMakeWithSeconds(transitionDuration, 600))];
        }
        else{
            int previousTransitionNumber = [[videoTransitionsArray objectAtIndex:i - 1] intValue];
            float previousTransitionDuration = [[videoTransitionsDurationArray objectAtIndex:i - 1] floatValue];

            if(i < videoAssetsArray.count - 1){
                [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(videoStartTime, CMTimeMakeWithSeconds(previousTransitionDuration, 600))];

                [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:CMTimeRangeMake(CMTimeAdd(videoStartTime, CMTimeSubtract(currentVideoDuration, CMTimeMakeWithSeconds(transitionDuration, 600))), CMTimeMakeWithSeconds(transitionDuration, 600))];
            }
            else{
                [layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(videoStartTime, CMTimeMakeWithSeconds(previousTransitionDuration, 600))];
            }
        }

        [instructionsArray addObject:layerInstruction];

        if(i < videoAssetsArray.count - 1){
            //TAKING INTO ACCOUNT THE TRANSITION DURATION TO OVERLAP VIDEOS
            videoStartTime = CMTimeAdd(videoStartTime, CMTimeSubtract(currentVideoDuration, CMTimeMakeWithSeconds(transitionDuration, 600)));
        }
        else{
            //TRANSITION NOT APPLIED TO THE END OF THE LAST CLIP
            videoStartTime = CMTimeAdd(videoStartTime, currentVideoDuration);
        }
    }

    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,videoStartTime);
    mainInstruction.layerInstructions = instructionsArray;

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = [NSArray arrayWithObjects:mainInstruction,nil];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderSize = CGSizeMake(1920, 1080);

    NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *videoOutputPath = [documentsDirectory stringByAppendingPathComponent:@"videoRecordingFinalOutput.mov"];
    NSURL *videoOutputURL = [[NSURL alloc] initFileURLWithPath:videoOutputPath];

    AVAssetExportSession *videoExportSession = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
    videoExportSession.outputURL = videoOutputURL;
    videoExportSession.videoComposition = videoComposition;
    videoExportSession.outputFileType = AVFileTypeQuickTimeMovie;

    [videoExportSession exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"EXPORT STATUS %ld %@", (long)videoExportSession.status, videoExportSession.error);

        if(videoExportSession.error == NULL){
            NSLog(@"EXPORT SUCCESSFUL");

            [library writeVideoAtPathToSavedPhotosAlbum:videoOutputURL
                                        completionBlock:^(NSURL *assetURL, NSError *error) {
                if(error) {
                    NSError *error = nil;
                    if([[NSFileManager defaultManager] fileExistsAtPath:videoOutputPath]){
                        [[NSFileManager defaultManager] removeItemAtPath:videoOutputPath error:&error];
                        if(error){
                            NSLog(@"VIDEO FILE DELETE FAILED");
                        }
                        else{
                            NSLog(@"VIDEO FILE DELETED");
                        }
                    }
                }
                else{
                    NSError *error = nil;
                    if([[NSFileManager defaultManager] fileExistsAtPath:videoOutputPath]){
                        [[NSFileManager defaultManager] removeItemAtPath:videoOutputPath error:&error];
                        if(error){
                            NSLog(@"VIDEO FILE DELETE FAILED");
                        }
                        else{
                            NSLog(@"VIDEO FILE DELETED");
                        }
                    }
                }
            }];
        }
        else{
            NSError *error = nil;
            if([[NSFileManager defaultManager] fileExistsAtPath:videoOutputPath]){
                [[NSFileManager defaultManager] removeItemAtPath:videoOutputPath error:&error];
                if(error){
                    NSLog(@"VIDEO FILE DELETE FAILED");
                }
                else{
                    NSLog(@"VIDEO FILE DELETED");
                }
            }
        }
    }];
}
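
One hedged aside regarding Edit 2, not from the original question: the insertions above pass error:nil, so a failed insertTimeRange: would go unnoticed and simply leave a gap in the composition. A minimal sketch of a helper that surfaces those errors (the name insertVideoClip is illustrative):

// Hypothetical helper (not from the original post): captures the NSError that the
// question's code discards with error:nil, so a failed insertion shows up in the log.
static BOOL insertVideoClip(AVAsset *asset, AVMutableCompositionTrack *track, CMTime atTime)
{
    AVAssetTrack *sourceTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    if (sourceTrack == nil) {
        NSLog(@"NO VIDEO TRACK IN ASSET %@", asset);
        return NO;
    }

    NSError *insertError = nil;
    BOOL inserted = [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                   ofTrack:sourceTrack
                                    atTime:atTime
                                     error:&insertError];
    if (!inserted) {
        NSLog(@"INSERT FAILED AT %f SECONDS: %@", CMTimeGetSeconds(atTime), insertError);
    }
    return inserted;
}

Checking videoExportSession.status == AVAssetExportSessionStatusCompleted rather than error == NULL would likewise distinguish a cancelled export from a failed one.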

Best Answer

Instead of creating a new videoTrack for every clip, try using only 2 videoTracks, inserting the timeRanges into those 2, and applying the transitions between the 2 tracks.

So the first video is inserted into videoTrack1, the second into videoTrack2 so that the transition can be applied, then the third clip goes back into track 1, and so on.
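
A minimal sketch of that two-track approach, reusing the question's videoAssetsArray but assuming a single fixed cross-dissolve length and omitting audio and the remaining AVMutableVideoComposition setup; the names trackA, trackB, instructionA, instructionB and transition are illustrative, not from the original answer:

// Alternate clips between two reusable tracks so neighbouring clips can overlap
// by `transition`, instead of creating one composition track per clip.
// Track A is listed first in layerInstructions, so it composites on top of track B.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *trackA = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *trackB = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableVideoCompositionLayerInstruction *instructionA = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:trackA];
AVMutableVideoCompositionLayerInstruction *instructionB = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:trackB];

CMTime transition = CMTimeMakeWithSeconds(1.0, 600); // assumed fixed transition duration
CMTime cursor = kCMTimeZero;

for (NSUInteger i = 0; i < videoAssetsArray.count; i++) {
    AVAsset *asset = videoAssetsArray[i];
    BOOL onTopTrack = (i % 2 == 0); // even-numbered clips go to track A (the top track)
    AVMutableCompositionTrack *destination = onTopTrack ? trackA : trackB;

    [destination insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                         ofTrack:[asset tracksWithMediaType:AVMediaTypeVideo].firstObject
                          atTime:cursor
                           error:nil];

    BOOL isLast = (i == videoAssetsArray.count - 1);
    if (!isLast) {
        // The next clip overlaps the end of this one by `transition`.
        CMTime overlapStart = CMTimeAdd(cursor, CMTimeSubtract(asset.duration, transition));
        CMTimeRange overlap = CMTimeRangeMake(overlapStart, transition);

        if (onTopTrack) {
            // Track A fades out to reveal the next clip playing in track B.
            [instructionA setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:overlap];
        } else {
            // Track A fades back in over the clip playing in track B.
            [instructionA setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0 timeRange:overlap];
        }
        cursor = overlapStart;
    } else {
        cursor = CMTimeAdd(cursor, asset.duration);
    }
}

AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, cursor);
mainInstruction.layerInstructions = @[instructionA, instructionB]; // A on top, B underneath

Because only two video tracks ever exist, the number of AVMutableCompositionTracks no longer grows with the clip count, which is the condition the question associates with the -11820 failure.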

Regarding ios - AVAssetExportSession intermittent error -11820 "Cannot Complete Export" Suggestion=Try exporting again, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/35170799/
