
ios - Most efficient/real-time way to get pixel values from the iOS camera feed in Swift

There is some existing discussion of similar questions, like this, but it seems out of date, so I thought I would ask here.

I would like to get near-real-time RGB pixel values, or better still a full-image RGB histogram, from the camera feed in Swift 2.0. I want this to be as fast and as up-to-date as possible (ideally ~30 fps or better).

Can I get this directly from the AVCaptureVideoPreviewLayer, or do I need to capture each frame (asynchronously, I assume, if the process takes significant time) and then extract the pixel values from a jpeg/png render?

Some example code, taken from jquave but modified for Swift 2.0:

import UIKit
import AVFoundation

class ViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var previewLayer : AVCaptureVideoPreviewLayer?

    var captureDevice : AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Do any additional setup after loading the view, typically from a nib.
        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if (device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        print("Capture device found")
                        beginSession()
                    }
                }
            }
        }
    }

    func focusTo(value : Float) {
        if let device = captureDevice {
            do {
                try device.lockForConfiguration()
                device.setFocusModeLockedWithLensPosition(value, completionHandler: { (time) -> Void in
                })
                device.unlockForConfiguration()
            } catch {
                // error message
                print("Can't change focus of capture device")
            }
        }
    }

    func configureDevice() {
        if let device = captureDevice {
            do {
                try device.lockForConfiguration()
                device.focusMode = .Locked
                device.unlockForConfiguration()
            } catch {
                // error message etc.
                print("Capture device not configurable")
            }
        }
    }

    func beginSession() {

        configureDevice()
        do {
            //try captureSession.addInput(input: captureDevice)
            try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
            updateDeviceSettings(0.0, isoValue: 0.0)
        } catch {
            // error message etc.
            print("Capture device not initialisable")
        }
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.view.layer.addSublayer(previewLayer!)
        previewLayer?.frame = self.view.layer.frame
        captureSession.startRunning()
    }

    func updateDeviceSettings(focusValue : Float, isoValue : Float) {
        if let device = captureDevice {
            do {
                try device.lockForConfiguration()
                device.setFocusModeLockedWithLensPosition(focusValue, completionHandler: { (time) -> Void in
                    //
                })

                // Adjust the iso to clamp between minIso and maxIso based on the active format
                let minISO = device.activeFormat.minISO
                let maxISO = device.activeFormat.maxISO
                let clampedISO = isoValue * (maxISO - minISO) + minISO

                device.setExposureModeCustomWithDuration(AVCaptureExposureDurationCurrent, ISO: clampedISO, completionHandler: { (time) -> Void in
                    //
                })

                device.unlockForConfiguration()
            } catch {
                // error message etc.
                print("Can't update device settings")
            }
        }
    }
}

Best Answer

You don't need an AVCaptureVideoPreviewLayer - that's what you want if you want to display the video. Instead, you want a different output: AVCaptureVideoDataOutput:

https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureVideoDataOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureVideoDataOutput

This gives you direct access to the stream of sample buffers, from which you can get into pixel space.
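
For concreteness, here is a minimal sketch of that approach in the same Swift 2 era style as the question's code (untested; the FrameGrabber class and attachToSession method names are my own): attach an AVCaptureVideoDataOutput to the capture session, ask for 32BGRA output, and read pixel bytes from each CVPixelBuffer in the delegate callback.

import AVFoundation

class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    func attachToSession(session: AVCaptureSession) {
        let videoOutput = AVCaptureVideoDataOutput()
        // Ask for BGRA so each pixel is four directly readable 8-bit bytes.
        videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA)]
        // Drop frames instead of queueing them if processing falls behind.
        videoOutput.alwaysDiscardsLateVideoFrames = true
        // nil attributes -> a serial dispatch queue for the callbacks
        videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer queue", nil))
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }
    }

    // Called once per frame on the queue passed above.
    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(pixelBuffer, 0)
        let base = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddress(pixelBuffer))
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        // Example: the BGRA components of the centre pixel.
        let offset = (height / 2) * bytesPerRow + (width / 2) * 4
        let blue = base[offset]
        let green = base[offset + 1]
        let red = base[offset + 2]
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
        print("Centre pixel RGB: \(red), \(green), \(blue)")
    }
}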

A caveat: I don't know what the throughput is on current devices, but I couldn't get a live stream at the highest quality out of an iPhone 4S because the GPU<-->CPU pipeline was too slow.
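
Since the question also asks about a full-image RGB histogram: how close you get to 30 fps depends mostly on how much work you do per frame inside that callback. As a rough illustration (a hypothetical helper, assuming the 32BGRA buffer from the sketch above), a per-channel histogram is a single pass over the buffer:

import CoreVideo

// Accumulate 256-bin histograms for R, G and B from a locked-down 32BGRA buffer.
func histogramForPixelBuffer(pixelBuffer: CVPixelBuffer) -> (r: [Int], g: [Int], b: [Int]) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    let base = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddress(pixelBuffer))
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    var r = [Int](count: 256, repeatedValue: 0)
    var g = [Int](count: 256, repeatedValue: 0)
    var b = [Int](count: 256, repeatedValue: 0)

    for y in 0..<height {
        for x in 0..<width {
            let p = y * bytesPerRow + x * 4   // BGRA layout: B, G, R, A
            b[Int(base[p])] += 1
            g[Int(base[p + 1])] += 1
            r[Int(base[p + 2])] += 1
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
    return (r, g, b)
}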

Regarding "ios - Most efficient/real-time way to get pixel values from the iOS camera feed in Swift", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/32642262/
