
ios - Swift - captureOutput is not being executed

Reposted · Author: 行者123 · Updated: 2023-11-28 07:28:03

I am currently trying to implement a live camera feed in my app. I have set everything up, but somehow it does not work as expected. As far as I understand, captureOutput should be executed every time a frame is recognized, and the print message should appear in the console, but somehow it doesn't - the console never shows the print output.

Does anyone see a possible error in the code?

I don't know whether this is related to my problem, but when the app launches, the console shows the following:

[BoringSSL] nw_protocol_boringssl_get_output_frames(1301) [C1.1:2][0x106b24530] get output frames failed, state 8196

import UIKit
import AVKit
import Vision

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()

        let captureSession = AVCaptureSession()

        guard let captureDevice = AVCaptureDevice.default(for: .video) else { return }
        guard let input = try? AVCaptureDeviceInput(device: captureDevice) else { return }
        captureSession.addInput(input)

        captureSession.startRunning()

        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        view.layer.addSublayer(previewLayer)
        previewLayer.frame = view.frame

        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
        captureSession.addOutput(dataOutput)

        // let request = VNCoreMLRequest
        // VNImageRequestHandler(cgImage: <#T##CGImage#>, options: [:]).perform(request)
    }

    func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("Es hat funktioniert")
    }

}

Best Answer

You need to implement captureOutput(_:didOutput:from:) instead of captureOutput(_:didDrop:from:). The didDrop variant is only called when a frame is discarded, which is why your print statement never runs.

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    print("Es hat funktioniert")
}
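
For completeness, here is a minimal sketch of the controller with the corrected delegate method. It assumes the app's Info.plist contains an NSCameraUsageDescription entry and that camera access has been granted; the session is also kept as a property (rather than a local variable, as in the question) so it is not deallocated when viewDidLoad returns.

import UIKit
import AVFoundation

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    // Strong reference so the session outlives viewDidLoad.
    private let captureSession = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let captureDevice = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: captureDevice),
              captureSession.canAddInput(input) else { return }
        captureSession.addInput(input)

        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
        if captureSession.canAddOutput(dataOutput) {
            captureSession.addOutput(dataOutput)
        }

        captureSession.startRunning()
    }

    // didOutput is called for every delivered frame; didDrop only fires
    // when a frame is discarded, which is why the original print never ran.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("Es hat funktioniert")
    }
}

With the didOutput variant in place, the print statement fires for every frame delivered on the "videoQueue" dispatch queue.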

Regarding ios - Swift - captureOutput is not being executed, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55835264/
