
ios - How to use AVCapturePhotoOutput


I have been working on a custom camera and recently upgraded to Xcode 8 beta and Swift 3. I originally had this:

var stillImageOutput: AVCaptureStillImageOutput?

However, I now get the warning:

'AVCaptureStillImageOutput' was deprecated in iOS 10.0: Use AVCapturePhotoOutput instead

Since this is fairly new, I haven't seen much information on it. Here is my current code:

    var captureSession: AVCaptureSession?
    var stillImageOutput: AVCaptureStillImageOutput?
    var previewLayer: AVCaptureVideoPreviewLayer?

    func clickPicture() {
        if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {
            videoConnection.videoOrientation = .portrait
            stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
                if sampleBuffer != nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    let dataProvider = CGDataProvider(data: imageData!)
                    let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
                    let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)
                }
            })
        }
    }

I tried looking at AVCapturePhotoCaptureDelegate, but I'm not quite sure how to use it. Does anyone know how? Thanks.

Best Answer

Updated for Swift 4. Hi, it's really easy to use AVCapturePhotoOutput.

You need an AVCapturePhotoCaptureDelegate, which returns a CMSampleBuffer.

You can also get a preview image if you specify a previewFormat in the AVCapturePhotoSettings.
    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {
            let settings = AVCapturePhotoSettings()
            let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160]
            settings.previewPhotoFormat = previewFormat
            self.cameraOutput.capturePhoto(with: settings, delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
                print("image: \(UIImage(data: dataImage)?.size)") // Your Image
            }
        }
    }

For more information, see https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput
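As a side note: on iOS 11 and later, the CMSampleBuffer-based delegate method shown above was itself deprecated in favor of photoOutput(_:didFinishProcessingPhoto:error:), which hands you an AVCapturePhoto. A minimal sketch of that variant (the delegate method and fileDataRepresentation() are Apple's AVFoundation API; the class name here is illustrative):

```swift
import AVFoundation
import UIKit

class ModernCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {

    // iOS 11+ callback: the result arrives as an AVCapturePhoto
    // instead of raw sample buffers.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let error = error {
            print(error.localizedDescription)
            return
        }
        // fileDataRepresentation() returns the encoded image data (JPEG/HEIF).
        if let data = photo.fileDataRepresentation(),
           let image = UIImage(data: data) {
            print("image: \(image.size)")
        }
    }
}
```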

Note: you must add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like: session.addOutput(output), and then: output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations
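To make that note concrete, a minimal session setup might look like the sketch below. This is an assumption-laden outline, not the answer author's code: it assumes a default back camera is available (device lookup fails in the Simulator), and the setupSession name is illustrative.

```swift
import AVFoundation

let session = AVCaptureSession()
let output = AVCapturePhotoOutput()

func setupSession() throws {
    session.beginConfiguration()
    // Attach the camera as input; AVCaptureDevice.default(for:) may return nil
    // (e.g. in the Simulator), so bail out gracefully.
    guard let camera = AVCaptureDevice.default(for: .video) else { return }
    let input = try AVCaptureDeviceInput(device: camera)
    if session.canAddInput(input) { session.addInput(input) }
    // The output must be added here, before calling capturePhoto(with:delegate:).
    if session.canAddOutput(output) { session.addOutput(output) }
    session.commitConfiguration()
    session.startRunning()
}
```

Only after setupSession() has run does output.capturePhoto(with: settings, delegate: self) deliver results to your delegate.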

Regarding "ios - How to use AVCapturePhotoOutput", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/37869963/
