
ios - Get the used exposure duration and ISO values from AVCapturePhotoOutput after the capture is complete


Background

I use AVCaptureSession and AVCapturePhotoOutput to save captures as JPEG images.

let captureSession = AVCaptureSession()
let stillImageOutput = AVCapturePhotoOutput()
var captureDevice : AVCaptureDevice?

...

func setupCamera() {

    captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInWideAngleCamera, for: AVMediaType.video, position: .back)

    if let captureDevice = captureDevice {

        // Wrap the throwing AVCaptureDeviceInput initializer so the error is handled
        do {
            captureSession.addInput(try AVCaptureDeviceInput(device: captureDevice))
        } catch {
            // Errors handled here...
        }

        if captureSession.canAddOutput(stillImageOutput) {
            captureSession.addOutput(stillImageOutput)
        }

    }

}

The AVCaptureDevice is set to adjust its exposure settings continuously and automatically:

func configureCamera() {

    do {

        try captureDevice?.lockForConfiguration()

        captureDevice?.exposureMode = AVCaptureDevice.ExposureMode.continuousAutoExposure

        captureDevice?.unlockForConfiguration()

    } catch let error as NSError {
        // Errors handled here...
    }

}

The capture is started with:

func capture() {

    // Get an instance of AVCapturePhotoSettings class
    let photoSettings = AVCapturePhotoSettings()

    // Set photo settings
    photoSettings.isAutoStillImageStabilizationEnabled = true
    photoSettings.flashMode = .off

    // Call capturePhoto method by passing photo settings and a
    // delegate implementing AVCapturePhotoCaptureDelegate
    stillImageOutput.capturePhoto(with: photoSettings, delegate: self)

}

The parent class is set as the AVCapturePhotoCaptureDelegate and handles photoOutput:

// Delegate
func photoOutput(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?,
                 previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {

    // Make sure there is a photo sample buffer
    guard error == nil,
          let photoSampleBuffer = photoSampleBuffer else {
        // Errors handled here
        return
    }

    // Convert the photo sample buffer to JPEG image data using AVCapturePhotoOutput
    guard let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
        forJPEGSampleBuffer: photoSampleBuffer,
        previewPhotoSampleBuffer: previewPhotoSampleBuffer) else {
        return
    }

    let capturedImage = UIImage(data: imageData, scale: 1.0)

    if let image = capturedImage {
        // Save photo ...
    }

}

So far, everything works as it should...

The problem

I need to know the exposure duration and ISO value used for each capture. These values vary, because the camera is set to adjust the exposure automatically, and it has to stay that way.

I know the capture's metadata holds these values, but I don't know how to access them.
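For reference, the JPEG data produced in the delegate above does carry these values in its EXIF dictionary, and they can be read back with Image I/O. A minimal sketch, assuming the imageData from the delegate; the readExposureFromEXIF helper name is only illustrative:

import ImageIO

// Reads the exposure time (in seconds) and ISO from the EXIF dictionary of JPEG data
func readExposureFromEXIF(_ jpegData: Data) -> (duration: Double, iso: Int)? {
    guard let source = CGImageSourceCreateWithData(jpegData as CFData, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any],
          let duration = exif[kCGImagePropertyExifExposureTime] as? Double,
          let isoValues = exif[kCGImagePropertyExifISOSpeedRatings] as? [Int],
          let iso = isoValues.first else {
        return nil
    }
    return (duration, iso)
}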

The exposure duration and ISO values are needed to fine-tune the exposure for the best result. After fine-tuning, the capture is started with these manual exposure values:

captureDevice?.setExposureModeCustom(duration: customTime, iso: customISO, completionHandler: nil)
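Note that setExposureModeCustom expects the duration and ISO to fall within the range supported by the device's active format, so it helps to clamp the values first. A minimal sketch, assuming the device has already been configured; the applyCustomExposure helper is hypothetical:

import AVFoundation

func applyCustomExposure(to device: AVCaptureDevice, duration: CMTime, iso: Float) {
    let format = device.activeFormat

    // Clamp the requested values to the range the current format supports
    let clampedDuration = CMTimeClampToRange(
        duration,
        range: CMTimeRange(start: format.minExposureDuration, end: format.maxExposureDuration))
    let clampedISO = min(max(iso, format.minISO), format.maxISO)

    do {
        try device.lockForConfiguration()
        device.setExposureModeCustom(duration: clampedDuration, iso: clampedISO, completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        // Errors handled here...
    }
}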

Best answer

Instead of getting the used ISO and exposure duration from the capture metadata, I read those values right before capturing the photo. When doing this, it is important to check that the exposure has finished adjusting.

Before calling capture:

Check that auto exposure is not adjusting:

while ((captureDevice?.isAdjustingExposure)!) {
    usleep(100000) // wait 100 msec
}
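Polling with usleep blocks the calling thread. Since isAdjustingExposure is key-value observable, a non-blocking alternative is to wait for it to become false before capturing. A minimal sketch; the exposureObservation property and captureWhenExposureSettles name are only illustrative:

var exposureObservation: NSKeyValueObservation?

func captureWhenExposureSettles() {
    exposureObservation = captureDevice?.observe(\.isAdjustingExposure, options: [.new]) { [weak self] device, _ in
        guard let self = self, !device.isAdjustingExposure else { return }
        self.exposureObservation = nil
        // Exposure has settled; read exposureDuration / iso and call capturePhoto here
    }
}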

Read the current exposure parameters:

let current_exposure_duration : CMTime = (captureDevice?.exposureDuration)!
let current_exposure_ISO : Float = (captureDevice?.iso)!

Then take the photo:

stillImageOutput.capturePhoto(with: photoSettings, delegate: self)
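Putting the three steps together, the approach above could be wrapped in a single helper, roughly like this (a sketch; the captureWithKnownExposure name is only illustrative):

func captureWithKnownExposure() {
    guard let device = captureDevice else { return }

    // 1. Wait until continuous auto exposure has settled
    while device.isAdjustingExposure {
        usleep(100000) // wait 100 msec
    }

    // 2. Record the exposure values that will be used for this shot
    let usedExposureDuration: CMTime = device.exposureDuration
    let usedISO: Float = device.iso
    print("Capturing with \(CMTimeGetSeconds(usedExposureDuration)) s at ISO \(usedISO)")

    // 3. Trigger the capture with the same settings as before
    let photoSettings = AVCapturePhotoSettings()
    photoSettings.isAutoStillImageStabilizationEnabled = true
    photoSettings.flashMode = .off
    stillImageOutput.capturePhoto(with: photoSettings, delegate: self)
}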

Regarding ios - Get the used exposure duration and ISO values from AVCapturePhotoOutput after the capture is complete, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48702881/
