
ios - The video could not be composed when using AVAssetExportPresetHighestQuality

Reposted — Author: 行者123 · Updated: 2023-12-01 16:08:07

I am trying to create an app that stitches multiple videos together. The problem seems to be that when I combine instructions with AVAssetExportPresetHighestQuality, I get an error stating:

Export failed -> Reason: The video could not be composed., User Info: { NSLocalizedDescription = "Operation Stopped"; NSLocalizedFailureReason = "The video could not be composed."; NSUnderlyingError = "Error Domain=NSOSStatusErrorDomain Code=-17390 \"(null)\""; }



If I change it to AVAssetExportPresetPassthrough it works fine, but the instructions are ignored. Does anyone know what the problem might be with the following code? I'm almost there, but this issue is holding me back.
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime insertTime = kCMTimeZero;

NSMutableArray *arrayInstructions = [[NSMutableArray alloc] init];

int i = 0;

for (NSMutableDictionary * dict in self.arraySelectedAssets) {

AVAsset *asset = [dict objectForKey:@"avasset"];

//[self orientationForTrack:asset];

AVAssetTrack* videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack* audioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoAssetTrack atTime:insertTime error:nil];

[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audioAssetTrack atTime:insertTime error:nil];

AVMutableVideoCompositionInstruction *firstVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Set the time range of the first instruction to span the duration of the first video track.
firstVideoCompositionInstruction.timeRange = CMTimeRangeMake(insertTime, videoAssetTrack.timeRange.duration);


AVMutableVideoCompositionLayerInstruction* firstVideoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]];
CGAffineTransform translateToCenter = CGAffineTransformMakeTranslation( 0,-1334);
CGAffineTransform rotateBy90Degrees = CGAffineTransformMakeRotation( M_PI_2);
CGAffineTransform shrinkWidth = CGAffineTransformMakeScale(0.1, 0.1); // needed because Apple does a "stretch" by default - really, we should find and undo apple's stretch - I suspect it'll be a CALayer defaultTransform, or UIView property causing this
CGAffineTransform finalTransform = CGAffineTransformConcat( shrinkWidth, CGAffineTransformConcat(translateToCenter, rotateBy90Degrees) );
[firstVideoLayerInstruction setTransform:finalTransform atTime:kCMTimeZero];

firstVideoCompositionInstruction.layerInstructions = @[firstVideoLayerInstruction];

[arrayInstructions addObject:firstVideoCompositionInstruction];

insertTime = CMTimeAdd(insertTime, videoAssetTrack.timeRange.duration);

i = i + 1;

}

AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.instructions = arrayInstructions;
mutableVideoComposition.renderSize = CGSizeMake(1334, 750);
mutableVideoComposition.frameDuration = CMTimeMake(1, 30);


// 4 - Get path
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:@"mergeVideo-%d.mov",arc4random() % 1000]];
self.combinedVideoURL = [NSURL fileURLWithPath:myPathDocs];

// 5 - Start a timer to poll export progress

self.timerExporter = [NSTimer scheduledTimerWithTimeInterval:0.01f
target:self
selector:@selector(exporterProgress)
userInfo:nil
repeats:YES];

// 6 - Create exporter
self.exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
self.exporter.outputURL = self.combinedVideoURL;
self.exporter.outputFileType = AVFileTypeQuickTimeMovie;
self.exporter.shouldOptimizeForNetworkUse = YES;
self.exporter.videoComposition = mutableVideoComposition;
[self.exporter exportAsynchronouslyWithCompletionHandler:^{

[self.timerExporter invalidate];

switch (self.exporter.status) {
case AVAssetExportSessionStatusFailed:
NSLog(@"Export failed -> Reason: %@, User Info: %@",
self.exporter.error.localizedFailureReason,
self.exporter.error.userInfo.description);
[self showError:self.exporter.error.localizedFailureReason];
break;
case AVAssetExportSessionStatusCancelled:
NSLog(@"Export cancelled");
break;

case AVAssetExportSessionStatusCompleted:
NSLog(@"Export finished");

dispatch_async(dispatch_get_main_queue(), ^{
self.labelProgressText.text = [NSString stringWithFormat:@"%@ (100%%)", NSLocalizedString(@"Combining The Videos", nil)];
[self applyTheFilter];
});

break;

}

}];

Best Answer

I'm afraid this isn't the answer you're looking for. I ran into the same issue when transforming and exporting a single video - AVAssetExportPresetHighestQuality would work for some assets but not for others.

My guess at the time was that the assets that didn't work weren't of a large enough size, frame rate, or quality to be rendered with AVAssetExportPresetHighestQuality.
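
If you want to test that theory programmatically, AVFoundation can tell you up front whether a preset is able to export a given asset. Below is a minimal sketch, assuming the mixComposition from your code; note that this check only considers the asset, preset, and output file type, not the videoComposition, so it is a rough signal rather than a guarantee.

[AVAssetExportSession determineCompatibilityOfExportPreset:AVAssetExportPresetHighestQuality
                                                  withAsset:mixComposition
                                             outputFileType:AVFileTypeQuickTimeMovie
                                          completionHandler:^(BOOL compatible) {
    // The handler may run on a background queue; hop to main before touching UI or starting the export.
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"HighestQuality compatible with this composition: %@", compatible ? @"YES" : @"NO");
    });
}];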

As you did, I ended up using AVAssetExportPresetPassthrough. In your case, the end result will likely be that all the assets you stitch together are rendered in their original format.
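
A pragmatic way to wire that up is to try the highest-quality preset first and fall back to pass-through only when the export fails. The sketch below is a hypothetical helper, not your original code; it assumes self.combinedVideoURL from your question is already set, and it deliberately drops the videoComposition on the pass-through retry because that preset ignores instructions anyway.

- (void)exportComposition:(AVMutableComposition *)mixComposition
         videoComposition:(AVMutableVideoComposition *)videoComposition
               withPreset:(NSString *)presetName {
    AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:presetName];
    session.outputURL = self.combinedVideoURL;
    session.outputFileType = AVFileTypeQuickTimeMovie;
    session.shouldOptimizeForNetworkUse = YES;
    if (![presetName isEqualToString:AVAssetExportPresetPassthrough]) {
        session.videoComposition = videoComposition; // pass-through ignores instructions, so only set it here
    }
    [session exportAsynchronouslyWithCompletionHandler:^{
        if (session.status == AVAssetExportSessionStatusFailed &&
            ![presetName isEqualToString:AVAssetExportPresetPassthrough]) {
            // Retry once with pass-through, accepting that the clips keep their original format.
            [[NSFileManager defaultManager] removeItemAtURL:self.combinedVideoURL error:nil];
            [self exportComposition:mixComposition videoComposition:nil withPreset:AVAssetExportPresetPassthrough];
        }
    }];
}

Called, for example, as [self exportComposition:mixComposition videoComposition:mutableVideoComposition withPreset:AVAssetExportPresetHighestQuality];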

Regarding "ios - The video could not be composed when using AVAssetExportPresetHighestQuality", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52093511/
