
ios - Displaying asynchronous stillImageOutput from an AVVideoCapture session using AVFoundation

Reposted · Author: 行者123 · Updated: 2023-11-28 23:37:13

I am capturing video in preview mode and want to display a still image taken by the camera.

I currently store the image and the capture output in ivars declared in the interface:

UIImage *snapshot;
AVCaptureStillImageOutput *stillImageOutput;

The video displays fine. However, when I try to capture and display a still image, nothing appears; in fact, the debugger shows that stillImageOutput and the image are nil. I suspect it is a timing issue with the asynchronous capture and that I need to use a completion handler, but I am not comfortable writing completion handlers.

What is the correct way to display the still image immediately after it is captured, without blocking the UI?

Code that captures the still image:

- (void)takeSnapshot {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            snapshot = [UIImage imageWithData:imageData];
        }
    }];
}

Code that displays the still image. Note the lack of a completion handler, which is probably the problem, but I'm not sure how to write one...

[self takeSnapshot];
self.imageView.image = snapshot;

Best Answer

I would change the takeSnapshot method to accept a completion block, and then call that completion block from inside the completion handler of the other asynchronous method, captureStillImageAsynchronouslyFromConnection:completionHandler:.
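Applied to your case, the new signature could look something like this (a sketch only; the takeSnapshotWithCompletion: name and the UIImage parameter are assumptions of mine, not part of your existing code):

// hypothetical signature: the captured image is delivered through the block
// instead of being written to the snapshot ivar
- (void)takeSnapshotWithCompletion:(void (^)(UIImage *image))completion;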

Below is a generic example of a method that takes a completion block and then calls it back from inside the completion block of the method it calls internally:

// this correlates to your takeSnapshot method
// you want to add a completion portion to this method
- (void)doSomethingAsynchronouslyWithCompletion:(void (^)(NSData *completionData))completion {
    // call your other async method
    [self anotherAsyncMethodWithItsOwnCompletion:^(NSData *completionDataFromSecondMethod) {
        if (completionDataFromSecondMethod.length > 0) {
            // this is where you would receive the CMSampleBufferRef from the completion handler of captureStillImageAsynchronouslyFromConnection:completionHandler:
            // and convert it over to data
            // make sure the completion block isn't nil if it's nullable
            if (completion) {
                // you would want to pass back the NSData imageData in the completion block here
                completion(completionDataFromSecondMethod);
            }
        }
    }];
}

// this method simulates the captureStillImageAsynchronouslyFromConnection:completionHandler: method
- (void)anotherAsyncMethodWithItsOwnCompletion:(void (^)(NSData *completionDataFromSecondMethod))anotherCompletion {
    // this is just to simulate some time waiting for the async task to complete
    // never call sleep in your own code
    sleep(3);
    if (anotherCompletion) {
        // this simulates the fake CMSampleBufferRef passed back by the captureStillImage...
        NSData *fakeCompletionData = [@"FakeCompletionString" dataUsingEncoding:NSUTF8StringEncoding];
        anotherCompletion(fakeCompletionData);
    }
}

And an example of how to call it:

[self doSomethingAsynchronouslyWithCompletion:^(NSData *completionData) {
    if (completionData.length > 0) {
        // come back on the main queue to modify any UI elements
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            // this is where you want to set your self.imageView.image
            // self.imageView.image = [UIImage imageWithData:{{dataFromCompletion}}]
            NSLog(@"The completionString result = %@", [[NSString alloc] initWithData:completionData encoding:NSUTF8StringEncoding]);
        }];
    }
}];
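Putting this together with your original code, a minimal sketch might look like the following. It assumes your existing stillImageOutput ivar and an imageView outlet; the takeSnapshotWithCompletion: name is my own:

- (void)takeSnapshotWithCompletion:(void (^)(UIImage *image))completion {
    // find the video connection exactly as in your current takeSnapshot code
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        UIImage *image = nil;
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            image = [UIImage imageWithData:imageData];
        }
        // hand the image (or nil on failure) back to the caller
        if (completion) {
            completion(image);
        }
    }];
}

And the call site, replacing the two lines that read the snapshot ivar before the capture has finished:

[self takeSnapshotWithCompletion:^(UIImage *image) {
    // come back to the main queue before touching UIKit
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        self.imageView.image = image;
    }];
}];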

This link may help you get started with block syntax: http://goshdarnblocksyntax.com

Regarding "ios - Displaying asynchronous stillImageOutput from an AVVideoCapture session using AVFoundation", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/54601663/
