
iPhone SDK 4 AVFoundation - How to use captureStillImageAsynchronouslyFromConnection correctly?

Reposted. Author: 行者123. Updated: 2023-12-03 18:19:02

I am trying to capture a still photo on the iPhone using the new AVFoundation framework.

This method is called when a button is pressed. I can hear the shutter sound, but I never see the log output. If I call this method several times, the camera preview freezes.

Is there a tutorial anywhere on how to use captureStillImageAsynchronouslyFromConnection?

[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:
        [[[self stillImageOutput] connections] objectAtIndex:0]
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        NSLog(@"inside");
    }];
- (void)initCapture {
    AVCaptureDeviceInput *captureInput =
        [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                              error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetLow;
    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    [self.prevLayer setOrientation:AVCaptureVideoOrientationLandscapeLeft];
    self.prevLayer.frame = CGRectMake(0.0, 0.0, 480.0, 320.0);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];

    // Setup the default file outputs
    AVCaptureStillImageOutput *_stillImageOutput = [[[AVCaptureStillImageOutput alloc] init] autorelease];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey,
                                    nil];
    [_stillImageOutput setOutputSettings:outputSettings];
    [outputSettings release];
    [self setStillImageOutput:_stillImageOutput];

    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }
    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];
}

Best Answer

After a lot of trial and error, I figured out how to do this.

Hint: Apple's official docs are just plain wrong. The code they give you doesn't actually work.

I wrote it up here with step-by-step instructions:

http://red-glasses.com/index.php/tutorials/ios4-take-photos-with-live-video-preview-using-avfoundation/

There's a lot of code at that link, but in summary:

-(void) viewDidAppear:(BOOL)animated
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];

    [session addOutput:stillImageOutput];

    [session startRunning];
}
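Note that this is pre-ARC code, and as written the session and preview layer are never released. A minimal cleanup sketch under manual reference counting, assuming you store the session in an ivar (e.g. `captureSession`, which is not shown in the snippet above):

```objc
// Hedged sketch, manual reference counting. `captureSession` is a
// hypothetical ivar holding the session created in -viewDidAppear:;
// `stillImageOutput` is the same ivar used above.
- (void)dealloc
{
    [captureSession stopRunning];
    [stillImageOutput release];
    [captureSession release];
    [super dealloc];
}
```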

-(IBAction) captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
        {
            NSLog(@"no attachments");
        }

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];

        self.vImage.image = image;
    }];
}
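If you also want to persist the captured photo rather than only display it, the JPEG `NSData` from the completion handler can be saved directly. The following is a sketch (not part of the original answer); it assumes the `image` and `imageData` variables from the completion handler above:

```objc
// Hedged sketch: persisting the captured JPEG, inside the same
// completion handler. Both options use standard UIKit/Foundation calls.

// Option 1: save to the user's photo library.
UIImageWriteToSavedPhotosAlbum(image, nil, nil, NULL);

// Option 2: write the raw JPEG bytes to the app's Documents directory.
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docsDir stringByAppendingPathComponent:@"photo.jpg"];
[imageData writeToFile:path atomically:YES];
```

`UIImageWriteToSavedPhotosAlbum` also accepts a target/selector pair if you need a completion callback; passing `nil` for both simply fires and forgets.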

Regarding "iPhone SDK 4 AVFoundation - How to use captureStillImageAsynchronouslyFromConnection correctly?", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/3847140/
