
ios - Confused about what object actually contains the captured image when using AVFoundation


I have a photo-taking app that uses AVFoundation. So far everything works fine.

However, the one thing that really confuses me is: what object actually contains the captured image?

I have been NSLogging all of the objects and some of their properties, and I still can't figure out where the captured image lives.

Here is my code for setting up the capture session:

    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:AVCaptureSessionPresetPhoto];

    self.inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error;
    self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.inputDevice error:&error];

    if ([self.session canAddInput:self.deviceInput])
        [self.session addInput:self.deviceInput];

    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];

    self.rootLayer = [[self view] layer];
    [self.rootLayer setMasksToBounds:YES];

    [self.previewLayer setFrame:CGRectMake(0, 0, self.rootLayer.bounds.size.width, self.rootLayer.bounds.size.height)];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [self.rootLayer insertSublayer:self.previewLayer atIndex:0];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [self.session addOutput:self.stillImageOutput];

    [self.session startRunning];
}

And here is my code for capturing a still image when the user presses the capture button:

-(IBAction)stillImageCapture {

    AVCaptureConnection *videoConnection = nil;

    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    // Set the orientation after a connection has actually been found
    // (assigning it while videoConnection is still nil has no effect).
    videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        [self.session stopRunning];

    }];
}

When the user presses the capture button and the code above runs, the captured image shows up on the iPhone screen just fine, but I can't figure out which object actually holds the captured image.

Thanks for your help.

Best Answer

The CMSampleBuffer is what actually contains the image.

In your captureStillImageAsynchronouslyFromConnection completion handler, you need something like:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage* capturedImage = [[UIImage alloc] initWithData:imageData];
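
To actually see the result on screen, that UIImage can then be handed to a view on the main thread. A minimal sketch, assuming a hypothetical self.capturedImageView UIImageView outlet that is not part of the original code:

    // Still inside the completion handler, after capturedImage has been created.
    // self.capturedImageView is an assumed UIImageView outlet, not from the original post.
    if (capturedImage) {
        dispatch_async(dispatch_get_main_queue(), ^{
            self.capturedImageView.image = capturedImage;
        });
    }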

My working implementation:

- (void)captureStillImage
{
    @try {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in _stillImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) {
                break;
            }
        }

        NSLog(@"About to request a capture from: %@", [self stillImageOutput]);
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

            // This is here for when we need to implement Exif stuff.
            //CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);

            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

            // Create a UIImage from the sample buffer data
            _capturedImage = [[UIImage alloc] initWithData:imageData];

            BOOL autoSave = YES;
            if (autoSave)
            {
                UIImageWriteToSavedPhotosAlbum(_capturedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            }

        }];
    }
    @catch (NSException *exception) {
        NSLog(@"ERROR: Unable to capture still image from AVFoundation camera: %@", exception);
    }
}
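
Note that UIImageWriteToSavedPhotosAlbum is given the image:didFinishSavingWithError:contextInfo: selector, so the same class also has to implement that callback. A minimal sketch (the logging body is an assumption, not part of the original answer):

    // Completion callback required by UIImageWriteToSavedPhotosAlbum above.
    // The logging here is only illustrative.
    - (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
    {
        if (error) {
            NSLog(@"Error saving captured image: %@", error);
        } else {
            NSLog(@"Captured image saved to the photo album.");
        }
    }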

Regarding "ios - Confused about what object actually contains the captured image when using AVFoundation", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/22700530/
