
ios - AVCaptureVideoDataOutputSampleBufferDelegate.CaptureOutput not called

Reposted. Author: 行者123. Updated: 2023-11-30 13:38:26

I currently have an in-house framework (MySDK), and an iOS application (MyApp) that uses MySDK.

Inside MySDK, there is a class (Scanner) that processes images from the video output of the device camera.

Here is a sample of my code:

Scanner.swift

class Scanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    var captureDevice : AVCaptureDevice?
    var captureOutput : AVCaptureVideoDataOutput?
    var previewLayer : AVCaptureVideoPreviewLayer?
    var captureSession : AVCaptureSession?

    var rootViewController : UIViewController?

    func scanImage (viewController: UIViewController)
    {
        NSLog("%@", "scanning begins!")

        if (captureSession == nil) { captureSession = AVCaptureSession() }

        rootViewController = viewController;

        captureSession!.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        for device in devices {
            if (device.hasMediaType(AVMediaTypeVideo)) {
                if (device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                }
            }
        }

        if (captureDevice != nil) {
            NSLog("%@", "beginning session!")

            beginSession()
        }
    }

    func beginSession()
    {
        if (captureSession == nil) { captureSession = AVCaptureSession() }
        if (captureOutput == nil) { captureOutput = AVCaptureVideoDataOutput() }
        if (previewLayer == nil) { previewLayer = AVCaptureVideoPreviewLayer() }

        let queue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL);

        captureOutput!.setSampleBufferDelegate(self, queue: queue)
        captureOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString: Int(kCVPixelFormatType_32BGRA)]

        captureSession!.addInput(try! AVCaptureDeviceInput(device: captureDevice))
        captureSession!.addOutput(captureOutput)

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer!.frame = rootViewController!.view.layer.frame

        rootViewController!.view.layer.addSublayer(previewLayer!)

        captureSession!.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!)
    {
        NSLog("%@", "captured!")
    }
}

In MyApp, I have a ViewController that implements an IBAction, in which the Scanner class is initialized and the scanImage function is triggered.

MyApp.m:

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    Scanner * scanner = [[Scanner alloc] init];

    [scanner scanImage:self];
}

The camera view appears inside the application, but the captureOutput function is never fired, and the console contains only these two lines:

2016-03-07 11:11:45.860 myapp[1236:337377] scanning begins!
2016-03-07 11:11:45.984 myapp[1236:337377] beginning session!

Creating a standalone application and embedding the code from Scanner.swift directly in the ViewController works fine; the captureOutput function fires correctly.

Does anyone know what I am doing wrong here?

Best Answer

After much trial and error, I finally found the solution to the problem.

It turns out I was creating the Scanner object only as a local variable inside the IBAction, not as an instance variable. Because AVCaptureVideoDataOutput does not retain its sample buffer delegate, the locally scoped Scanner was deallocated as soon as the IBAction returned, before any frames could be delivered.

Once the Scanner object was stored in an instance variable, the captureOutput delegate method fired correctly.
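The corrected IBAction can be sketched as follows. The class extension and the property name `scanner` are illustrative (the original post does not show the view controller's interface); the point is simply that the view controller holds a strong reference to the Scanner:

```objc
// Keep the Scanner in a strong property so it is not deallocated
// when the IBAction returns; AVCaptureVideoDataOutput does not
// retain its sample buffer delegate.
@interface MyViewController ()
@property (nonatomic, strong) Scanner *scanner; // illustrative name
@end

@implementation MyViewController

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    self.scanner = [[Scanner alloc] init];
    [self.scanner scanImage:self];
}

@end
```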

Regarding ios - AVCaptureVideoDataOutputSampleBufferDelegate.CaptureOutput not called, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/35835920/
