I am using the code below to add an image overlay to a video and then export the newly generated video to the Documents directory. Strangely, the exported video comes out rotated by 90 degrees.
- (void)buildTransitionComposition:(AVMutableComposition *)composition andVideoComposition:(AVMutableVideoComposition *)videoComposition
{
    CMTime nextClipStartTime = kCMTimeZero;
    NSInteger i;

    // Make transitionDuration no greater than half the shortest clip duration.
    CMTime transitionDuration = self.transitionDuration;
    for (i = 0; i < [_clips count]; i++) {
        NSValue *clipTimeRange = [_clipTimeRanges objectAtIndex:i];
        if (clipTimeRange) {
            CMTime halfClipDuration = [clipTimeRange CMTimeRangeValue].duration;
            halfClipDuration.timescale *= 2; // You can halve a rational by doubling its denominator.
            transitionDuration = CMTimeMinimum(transitionDuration, halfClipDuration);
        }
    }

    // Add two video tracks and two audio tracks.
    AVMutableCompositionTrack *compositionVideoTracks[2];
    AVMutableCompositionTrack *compositionAudioTracks[2];
    compositionVideoTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionVideoTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTimeRange *passThroughTimeRanges = alloca(sizeof(CMTimeRange) * [_clips count]);
    CMTimeRange *transitionTimeRanges = alloca(sizeof(CMTimeRange) * [_clips count]);

    // Place clips into alternating video & audio tracks in composition, overlapped by transitionDuration.
    for (i = 0; i < [_clips count]; i++) {
        NSInteger alternatingIndex = i % 2; // alternating targets: 0, 1, 0, 1, ...
        AVURLAsset *asset = [_clips objectAtIndex:i];
        NSValue *clipTimeRange = [_clipTimeRanges objectAtIndex:i];
        CMTimeRange timeRangeInAsset;
        if (clipTimeRange)
            timeRangeInAsset = [clipTimeRange CMTimeRangeValue];
        else
            timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [asset duration]);

        AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [compositionVideoTracks[alternatingIndex] insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];
        /*
        CGAffineTransform t = clipVideoTrack.preferredTransform;
        NSLog(@"Transform1 : %@",t);
        */
        AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [compositionAudioTracks[alternatingIndex] insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];

        // Remember the time range in which this clip should pass through.
        // Every clip after the first begins with a transition.
        // Every clip before the last ends with a transition.
        // Exclude those transitions from the pass through time ranges.
        passThroughTimeRanges[i] = CMTimeRangeMake(nextClipStartTime, timeRangeInAsset.duration);
        if (i > 0) {
            passThroughTimeRanges[i].start = CMTimeAdd(passThroughTimeRanges[i].start, transitionDuration);
            passThroughTimeRanges[i].duration = CMTimeSubtract(passThroughTimeRanges[i].duration, transitionDuration);
        }
        if (i+1 < [_clips count]) {
            passThroughTimeRanges[i].duration = CMTimeSubtract(passThroughTimeRanges[i].duration, transitionDuration);
        }

        // The end of this clip will overlap the start of the next by transitionDuration.
        // (Note: this arithmetic falls apart if timeRangeInAsset.duration < 2 * transitionDuration.)
        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
        nextClipStartTime = CMTimeSubtract(nextClipStartTime, transitionDuration);

        // Remember the time range for the transition to the next item.
        transitionTimeRanges[i] = CMTimeRangeMake(nextClipStartTime, transitionDuration);
    }

    // Set up the video composition if we are to perform crossfade or push transitions between clips.
    NSMutableArray *instructions = [NSMutableArray array];

    // Cycle between "pass through A", "transition from A to B", "pass through B", "transition from B to A".
    for (i = 0; i < [_clips count]; i++) {
        NSInteger alternatingIndex = i % 2; // alternating targets

        // Pass through clip i.
        AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        passThroughInstruction.timeRange = passThroughTimeRanges[i];
        AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTracks[alternatingIndex]];
        /*
        CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
        CGAffineTransform rotateTranslate = CGAffineTransformTranslate(rotationTransform,320,0);
        [passThroughLayer setTransform:rotateTranslate atTime:kCMTimeZero];
        */
        passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer];
        [instructions addObject:passThroughInstruction];

        if (i+1 < [_clips count]) {
            // Add transition from clip i to clip i+1.
            AVMutableVideoCompositionInstruction *transitionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
            transitionInstruction.timeRange = transitionTimeRanges[i];
            AVMutableVideoCompositionLayerInstruction *fromLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTracks[alternatingIndex]];
            AVMutableVideoCompositionLayerInstruction *toLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTracks[1-alternatingIndex]];

            if (self.transitionType == SimpleEditorTransitionTypeCrossFade) {
                // Fade out the fromLayer by setting a ramp from 1.0 to 0.0.
                [fromLayer setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:transitionTimeRanges[i]];
            }
            else if (self.transitionType == SimpleEditorTransitionTypePush) {
                // Set a transform ramp on fromLayer from identity to all the way left of the screen.
                [fromLayer setTransformRampFromStartTransform:CGAffineTransformIdentity toEndTransform:CGAffineTransformMakeTranslation(-composition.naturalSize.width, 0.0) timeRange:transitionTimeRanges[i]];
                // Set a transform ramp on toLayer from all the way right of the screen to identity.
                [toLayer setTransformRampFromStartTransform:CGAffineTransformMakeTranslation(+composition.naturalSize.width, 0.0) toEndTransform:CGAffineTransformIdentity timeRange:transitionTimeRanges[i]];
            }

            transitionInstruction.layerInstructions = [NSArray arrayWithObjects:fromLayer, toLayer, nil];
            [instructions addObject:transitionInstruction];
        }
    }

    videoComposition.instructions = instructions;
}
Please help, as I cannot get a portrait video to export in the correct orientation. Any help is appreciated. Thanks.
Best Answer
By default, when you export a video with AVAssetExportSession using your own video composition, the track's preferredTransform (which records the capture orientation) is not applied for you, so a clip shot in portrait ends up rotated away from its original orientation. You have to apply that transform yourself in the layer instruction to get the exact orientation you want. You can try the code below to do this.
- (AVMutableVideoCompositionLayerInstruction *)layerInstructionAfterFixingOrientationForAsset:(AVAsset *)inAsset
                                                                                      forTrack:(AVMutableCompositionTrack *)inTrack
                                                                                        atTime:(CMTime)inTime
{
    // FIXING ORIENTATION //
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:inTrack];
    AVAssetTrack *videoAssetTrack = [[inAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_ = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;

    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationLeft;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ = UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }

    CGFloat FirstAssetScaleToFitRatio = 320.0 / videoAssetTrack.naturalSize.width;
    if (isVideoAssetPortrait_) {
        FirstAssetScaleToFitRatio = 320.0 / videoAssetTrack.naturalSize.height;
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
        [videolayerInstruction setTransform:CGAffineTransformConcat(videoAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
    } else {
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
        [videolayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(videoAssetTrack.preferredTransform, FirstAssetScaleFactor), CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
    }

    [videolayerInstruction setOpacity:0.0 atTime:inTime];

    return videolayerInstruction;
}
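Note that the transform set in the layer instruction only shows up correctly if the video composition itself is given a matching portrait canvas. As a minimal sketch (the 320 x 480 size is an assumption consistent with the 320-point scale factor used in the helper above, not something stated in the original answer):

    // Assumed output canvas; 320 x 480 matches the 320-point scaling used above.
    videoComposition.renderSize = CGSizeMake(320.0, 480.0);
    // A custom AVMutableVideoComposition also needs an explicit frame duration; 30 fps is a common choice.
    videoComposition.frameDuration = CMTimeMake(1, 30);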
Hope this helps. The code below shows how to call the method:
AVAssetTrack *assetTrack = [[inAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableCompositionTrack *mutableTrack = [mergeComposition mutableTrackCompatibleWithTrack:assetTrack];
AVMutableVideoCompositionLayerInstruction *assetInstruction = [self layerInstructionAfterFixingOrientationForAsset:inAsset forTrack:mutableTrack atTime:videoTotalDuration];
In this call, inAsset is your video asset, videoTotalDuration is the total duration of your video as a CMTime, and mergeComposition is an object of the AVMutableComposition class.
Edit: This is not a callback method or an event; you have to call it yourself, as intended, with the required parameters described above.
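For completeness, here is a sketch of the export step that ties this together. The names mergeComposition, videoComposition and outputURL are assumed to come from your own setup code as above; they are illustrative rather than part of the original answer:

    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:mergeComposition
                                         presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;                 // e.g. a file URL in the Documents directory
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.videoComposition = videoComposition;   // without this, the layer instructions are never applied
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export finished: %@", outputURL);
        } else {
            NSLog(@"Export failed: %@", exporter.error);
        }
    }];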
Regarding "iphone - AVMutableVideoComposition rotates video captured in portrait mode", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/42899735/