
iOS Swift - AVCaptureSession - capturing frames at a specific frame rate


I'm trying to build an app that captures frames from the camera and processes them with OpenCV before saving them to the device, but at a specific frame rate.

What I'm currently stuck on is that AVCaptureVideoDataOutputSampleBufferDelegate doesn't seem to respect the AVCaptureDevice.activeVideoMinFrameDuration and AVCaptureDevice.activeVideoMaxFrameDuration settings.

captureOutput runs far faster than the 2 frames per second that those settings specify.

Do you happen to know how to achieve this, with or without the delegate?

View Controller:

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(animated: Bool) {
        setupCaptureSession()
    }

    func setupCaptureSession() {
        let session: AVCaptureSession = AVCaptureSession()
        session.sessionPreset = AVCaptureSessionPreset1280x720

        let videoDevices: [AVCaptureDevice] = AVCaptureDevice.devices() as! [AVCaptureDevice]

        for device in videoDevices {
            if device.position == AVCaptureDevicePosition.Back {
                let captureDevice: AVCaptureDevice = device

                do {
                    // Request 2 fps: a frame duration of 1/2 second.
                    try captureDevice.lockForConfiguration()
                    captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
                    captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
                    captureDevice.unlockForConfiguration()

                    let input: AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

                    if session.canAddInput(input) {
                        session.addInput(input) // addInput does not throw
                    }

                    let output: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

                    let dispatch_queue: dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
                    output.setSampleBufferDelegate(self, queue: dispatch_queue)

                    session.addOutput(output)

                    session.startRunning()

                    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
                    previewLayer.connection.videoOrientation = .LandscapeRight

                    let previewBounds: CGRect = CGRectMake(0, 0, self.view.frame.width/2, self.view.frame.height+20)
                    previewLayer.backgroundColor = UIColor.blackColor().CGColor
                    previewLayer.frame = previewBounds
                    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                    self.imageView.layer.addSublayer(previewLayer)

                    self.previewMat.frame = CGRectMake(previewBounds.width, 0, previewBounds.width, previewBounds.height)

                } catch _ {
                    // Configuration or input creation failed; nothing is added to the session.
                }
                break
            }
        }
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        self.wrapper.processBuffer(self.getUiImageFromBuffer(sampleBuffer), self.previewMat)
    }
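
As an aside, one workaround that doesn't depend on the device honoring the requested durations is to drop frames manually inside the delegate by comparing presentation timestamps. This is a sketch, not from the original post; `lastFrameTime` and `minFrameInterval` are illustrative names:

    var lastFrameTime = kCMTimeZero
    let minFrameInterval = CMTimeMake(1, 2) // target: at most 2 fps

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        let ts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // Skip this frame if less than minFrameInterval has elapsed since the last one kept.
        if CMTimeCompare(CMTimeSubtract(ts, lastFrameTime), minFrameInterval) < 0 {
            return
        }
        lastFrameTime = ts
        // ... process the frame here ...
    }

The camera still delivers frames at its native rate, but expensive processing only runs at the throttled rate.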

Best Answer

So I found the problem.

In the comment block in AVCaptureDevice.h above the activeVideoMinFrameDuration property, it states:

On iOS, the receiver's activeVideoMinFrameDuration resets to its default value under the following conditions:

  • The receiver's activeFormat changes
  • The receiver's AVCaptureDeviceInput's session's sessionPreset changes
  • The receiver's AVCaptureDeviceInput is added to a session

The last bullet point was causing my problem, so doing the following (configuring the frame durations only after adding the input to the session) solved it for me:

    do {
        let input: AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

        if session.canAddInput(input) {
            session.addInput(input) // addInput does not throw
        }

        // Configure the frame rate AFTER the input has been added to the session;
        // adding the input resets activeVideoMinFrameDuration to its default.
        try captureDevice.lockForConfiguration()
        captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
        captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
        captureDevice.unlockForConfiguration()

        let output: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

        let dispatch_queue: dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
        output.setSampleBufferDelegate(self, queue: dispatch_queue)

        session.addOutput(output)
Regarding "iOS Swift - AVCaptureSession - capturing frames at a specific frame rate", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/34718833/
