
ios - Sample Buffer Delegate in Swift 2 for a real-time video filter

Reposted — Author: 行者123 — Updated: 2023-11-28 08:41:53

I'm trying to use the camera on an iPhone to build a light-intensity reader in Swift. The idea is that it takes the intensity component of every pixel and averages them to give me a single value. I don't need a preview from the camera. I've been piecing together several tutorials to get this working, and so far have come up with the code below. camDeviceSetup() runs in viewDidLoad, and cameraSetup() runs when a button is pressed.

I'm getting an error on the line beginning "videoDeviceOutput!.setSampleBufferDelegate", saying it cannot convert a value of type FirstViewController (the view controller) to the expected argument type.

let captureSession = AVCaptureSession()
// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?
var videoDeviceOutput: AVCaptureVideoDataOutput?
// AVCaptureVideoPreviewLayer is a subclass of CALayer that you use to display video as it is being captured by an input device.
var previewLayer = AVCaptureVideoPreviewLayer()

func camDeviceSetup() {
    captureSession.sessionPreset = AVCaptureSessionPreset640x480
    let devices = AVCaptureDevice.devices()
    for device in devices {
        // Make sure this particular device supports video
        if device.hasMediaType(AVMediaTypeVideo) {
            // Finally check the position and confirm we've got the back camera
            if device.position == AVCaptureDevicePosition.Back {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if let captureDevice = captureDevice {
        do {
            let input = try AVCaptureDeviceInput(device: captureDevice)
            captureSession.addInput(input)
        } catch let err as NSError {
            print("error: \(err.localizedDescription)")
        }
    }
}

func cameraSetup() {
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)

    videoDeviceOutput = AVCaptureVideoDataOutput()
    videoDeviceOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
    videoDeviceOutput!.alwaysDiscardsLateVideoFrames = true

    // This is the line that gets stuck and not sure why
    videoDeviceOutput!.setSampleBufferDelegate(self, queue: dispatch_queue_create("VideoBuffer", DISPATCH_QUEUE_SERIAL))

    if captureSession.canAddOutput(videoDeviceOutput) {
        captureSession.addOutput(videoDeviceOutput)
    }

    captureSession.startRunning()
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Think once the delegate is correctly set my algorithm for finding light intensity goes here
}
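For the averaging itself, the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format requested above stores luminance as a separate Y plane (plane 0), so the mean intensity can be computed directly from that plane. The following is only a sketch in Swift 2 syntax of one way the delegate body could look, not the asker's final algorithm:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)

    // Plane 0 of 420YpCbCr8BiPlanarFullRange is the luma (Y) plane: one byte per pixel
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let base = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0))

    // Sum every luma byte; rows may be padded, so index by bytesPerRow, not width
    var total: UInt64 = 0
    for row in 0..<height {
        for col in 0..<width {
            total += UInt64(base[row * bytesPerRow + col])
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)

    // Single average intensity value for the frame, in 0...255
    let averageLuma = Double(total) / Double(width * height)
    print("average luma: \(averageLuma)")
}

Note that this callback runs on the "VideoBuffer" queue created in cameraSetup(), so any UI update with the result would need to be dispatched back to the main queue.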

Best answer

The problem with that line was that I hadn't declared AVCaptureVideoDataOutputSampleBufferDelegate in the class declaration at the top of my ViewController.
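Concretely, the fix is to add the protocol conformance to the class declaration — a minimal sketch, where the class name FirstViewController is taken from the error message in the question:

class FirstViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // properties and the camDeviceSetup()/cameraSetup()/captureOutput methods from the question go here
}

Once the class conforms to AVCaptureVideoDataOutputSampleBufferDelegate, passing self to setSampleBufferDelegate(_:queue:) type-checks and the delegate callback starts receiving frames.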

Regarding "ios - Sample Buffer Delegate in Swift 2 for a real-time video filter", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36313943/
