
IOS/objective-C : Display live camera preview on load without image picker controller

Reposted · Author: 行者123 · Updated: 2023-11-28 20:56:18

Instead of showing a static placeholder image on the screen that asks for a profile photo, I want the screen to open directly into a live camera view. After that, I don't mind using UIImagePickerController to grab the photo. But I want the user to see something live immediately rather than a static image.

Do I need to use AVFoundation, as in this answer for Swift? Or what is the simplest way to do this in Objective-C?

Here is some code that uses AVFoundation in Swift, from an SO question. However, I'm not good at Swift and want to implement this in Objective-C:

extension SelfieViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func setupAVCapture() {
        session.sessionPreset = AVCaptureSessionPreset640x480

        let devices = AVCaptureDevice.devices()
        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the front camera
                if (device.position == AVCaptureDevicePosition.Front) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        beginSession()
                        break
                    }
                }
            }
        }
    }

    func beginSession() {
        var err: NSError? = nil
        let deviceInput = AVCaptureDeviceInput(device: captureDevice, error: &err)
        if err != nil {
            println("error: \(err?.localizedDescription)")
        }
        if self.session.canAddInput(deviceInput) {
            self.session.addInput(deviceInput)
        }

        self.videoDataOutput = AVCaptureVideoDataOutput()
        // The pixel-format constant is the *value*; kCVPixelBufferPixelFormatTypeKey is the *key*
        let rgbOutputSettings = [kCVPixelBufferPixelFormatTypeKey as NSString: NSNumber(integer: kCMPixelFormat_32BGRA)]
        self.videoDataOutput.videoSettings = rgbOutputSettings
        self.videoDataOutput.alwaysDiscardsLateVideoFrames = true
        self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL)
        self.videoDataOutput.setSampleBufferDelegate(self, queue: self.videoDataOutputQueue)
        if session.canAddOutput(self.videoDataOutput) {
            session.addOutput(self.videoDataOutput)
        }
        self.videoDataOutput.connectionWithMediaType(AVMediaTypeVideo).enabled = true

        self.previewLayer = AVCaptureVideoPreviewLayer(session: self.session)
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill

        let rootLayer: CALayer = self.cameraView.layer
        rootLayer.masksToBounds = true
        self.previewLayer.frame = rootLayer.bounds
        rootLayer.addSublayer(self.previewLayer)
        session.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // do stuff here
    }

    // Clean up AVCapture
    func stopCamera() {
        session.stopRunning()
    }
}

Thanks in advance for any suggestions.

Best Answer

I just translated the sample code into Objective-C. If you want to learn more, you could take a look at my project FaceDetectionDemo. Hope it helps.

- (void)setupAVCapture {
    NSError *error = nil;

    // Select device
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    } else {
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    }

    AVCaptureDevice *device = [self findFrontCamera];
    if (nil == device) {
        self.isUsingFrontFacingCamera = NO;
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    // Get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                              error:&error];
    if (error) {
        session = nil;
        [self teardownAVCapture];
        if ([_delegate respondsToSelector:@selector(FaceDetectionComponentError:error:)]) {
            __weak typeof(self) weakSelf = self;
            dispatch_async(dispatch_get_main_queue(), ^{
                [weakSelf.delegate FaceDetectionComponentError:weakSelf error:error];
            });
        }
        return;
    }

    // Add the input to the session
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }

    // Make a video data output
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

    // Ask for 32BGRA frames; both Core Graphics and OpenGL work well with BGRA
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                          [NSNumber numberWithInt:kCMPixelFormat_32BGRA]
                                                                  forKey:
                                          (id)kCVPixelBufferPixelFormatTypeKey];
    [self.videoDataOutput setVideoSettings:rgbOutputSettings];
    [self.videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked

    self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue",
                                                      DISPATCH_QUEUE_SERIAL);
    [self.videoDataOutput setSampleBufferDelegate:self
                                            queue:self.videoDataOutputQueue];

    if ([session canAddOutput:self.videoDataOutput]) {
        [session addOutput:self.videoDataOutput];
    }

    [[self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];

    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    self.previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

    CALayer *rootLayer = [self.previewView layer];
    [rootLayer setMasksToBounds:YES];
    [self.previewLayer setFrame:[rootLayer bounds]];
    [rootLayer addSublayer:self.previewLayer];
    [session startRunning];
}

- (AVCaptureDevice *)findFrontCamera {
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            self.isUsingFrontFacingCamera = YES;
            return d;
        }
    }
    return nil;
}

// AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Process each video frame here (e.g. face detection)
}

Regarding "IOS/objective-C : Display live camera preview on load without image picker controller", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52086086/
