
ios - Using AVMutableComposition on iPhone


I am using the following code to play two videos one after the other, but it does not show any video in the Simulator; the screen is completely blank.

Also, how do I seek through the two videos? For example, if one video is 2 minutes long and the second is 3 minutes, I need the total duration of both and to seek across them: when I drag the slider to the 4-minute mark, the second video should start playing from its 2-minute mark.

Is this possible?

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    NSURL *url1 = [NSURL URLWithString:@"http://www.tools4movies.com/dvd_catalyst_profile_samples/Harold%20Kumar%203%20Christmas%20bionic.mp4"];
    NSURL *url2 = [NSURL URLWithString:@"http://www.tools4movies.com/dvd_catalyst_profile_samples/Harold%20Kumar%203%20Christmas%20tablet.mp4"];

    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];

    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    asset1 = [[AVURLAsset alloc] initWithURL:url1 options:options];
    AVURLAsset *asset2 = [[AVURLAsset alloc] initWithURL:url2 options:options];

    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;
    composition = [AVMutableComposition composition];

    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset1.duration)
                              ofAsset:asset1
                               atTime:insertionPoint
                                error:&error])
    {
        NSLog(@"error: %@", error);
    }

    insertionPoint = CMTimeAdd(insertionPoint, asset1.duration);

    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset2.duration)
                              ofAsset:asset2
                               atTime:insertionPoint
                                error:&error])
    {
        NSLog(@"error: %@", error);
    }

    AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:composition];
    player = [AVPlayer playerWithPlayerItem:item];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];

    [layer setFrame:CGRectMake(0, 0, 320, 480)];
    [[[self view] layer] addSublayer:layer];
    [player play];
}

Can anyone tell me what is wrong in my code?

Best Answer

The Simulator cannot display video. The built-in UIImagePickerController and any other video controller will not work there; video playback is simply not implemented and mostly shows up black or red in the iOS Simulator. You have to debug on an iOS device. Sometimes the debugger itself will not work properly either; use NSLog() instead. That always works (i.e. even when you compile a Release build without debug information).
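As an aside (not part of the original answer), one practical way to combine this NSLog() advice with AVFoundation is to observe the player item's status via key-value observing and log any error on the device. A minimal sketch, assuming self observes the AVPlayerItem named item from the question:

// register somewhere after creating the item, e.g. in viewDidLoad
[item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    AVPlayerItem *observedItem = (AVPlayerItem *)object;
    if (observedItem.status == AVPlayerItemStatusFailed) {
        // NSLog works even when the debugger does not (e.g. Release builds)
        NSLog(@"player item failed: %@", observedItem.error);
    } else if (observedItem.status == AVPlayerItemStatusReadyToPlay) {
        NSLog(@"ready to play, duration: %f s", CMTimeGetSeconds(observedItem.duration));
    }
}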

You can seek using the player:

If mp is your media player:

[mp pause];
CMTime position = mp.currentTime;

// optionally swap in a new player item (e.g. a rebuilt composition)
[mp replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithAsset:self.composition]];

// jump back to the saved position and resume
[mp seekToTime:position];
[mp play];

To summarize:
Editing: use the composition and the player item.
Seeking: use the player.
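For the slider part of the question: because both clips live in a single composition, the player item's duration is already the combined length, so one seekToTime: call covers both videos. A minimal sketch (an addition, not from the original answer), assuming a UISlider whose value runs from 0.0 to 1.0 is wired to this action and mp is the player from above:

- (IBAction)seekSliderChanged:(UISlider *)sender
{
    // total length of the composition (clip 1 + clip 2)
    CMTime total = mp.currentItem.duration;
    if (CMTIME_IS_INDEFINITE(total)) {
        return; // duration is not known yet
    }

    // map the slider position onto the combined timeline; e.g. minute 4
    // of a 2 min + 3 min composition lands 2 minutes into the second clip
    Float64 targetSeconds = sender.value * CMTimeGetSeconds(total);
    [mp seekToTime:CMTimeMakeWithSeconds(targetSeconds, 600)];
}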

Here is a short, more formal example of how to do this (and it is already thread-safe):

AVMutableComposition *_composition = [AVMutableComposition composition];

// iterate through all files
// and build the mutable composition
for (int i = 0; i < filesCount; i++) {

    NSURL *movieURL = [NSURL fileURLWithPath:[paths objectAtIndex:i]];
    AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];

    // append the whole asset at the current end of the composition
    CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), sourceAsset.duration);

    NSError *editError = nil;
    BOOL result = [_composition insertTimeRange:editRange
                                        ofAsset:sourceAsset
                                         atTime:_composition.duration
                                          error:&editError];
    if (!result) {
        NSLog(@"could not insert %@: %@", movieURL, editError);
    }

    dispatch_sync(dispatch_get_main_queue(), ^{

        // maybe you need a progress bar while loading
        self.loaderBar.progress = (float)i / filesCount;
        [self.loaderBar setNeedsDisplay];
    });
}

// make the composition threadsafe if you need it later
self.composition = [[_composition copy] autorelease];

// the player wants the main thread
dispatch_sync(dispatch_get_main_queue(), ^{

    mp = [AVPlayer playerWithPlayerItem:[[[AVPlayerItem alloc] initWithAsset:self.composition] autorelease]];

    self.observer = [mp addPeriodicTimeObserverForInterval:CMTimeMake(60, 600) queue:nil usingBlock:^(CMTime time) {

        // this is our callback block to update the progress bar
        if (mp.status == AVPlayerStatusReadyToPlay) {

            float actualTime = (float)time.value / time.timescale;

            // avoid division by zero
            if (time.value > 0) {

                CMTime length = mp.currentItem.asset.duration;
                float lengthTime = (float)length.value / length.timescale;

                if (lengthTime) {
                    self.progressBar.value = actualTime / lengthTime;
                } else {
                    self.progressBar.value = 0.0f;
                }
            }
        }
    }];
});

// the last task must be on the main thread again
dispatch_sync(dispatch_get_main_queue(), ^{

    // create our playerLayer
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.playerLayer.frame = [self view].layer.bounds;

    // insert it into our view (make it visible)
    [[self view].layer insertSublayer:self.playerLayer atIndex:0];

    // and now do the playback; maybe mp is global (self.mp),
    // this depends on your needs
    [mp play];
});
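One detail the example does not show (an addition on my part): the periodic time observer keeps firing until it is removed, so it should be torn down together with the player, for example in viewWillDisappear: or dealloc:

if (self.observer) {
    [mp removeTimeObserver:self.observer];
    self.observer = nil;
}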

Hope this helps.

Regarding ios - Using AVMutableComposition on iPhone, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/10383493/
