
streaming - How can I use a live camera stream in SwiftUI?

Reposted. Author: 行者123. Updated: 2023-12-04 12:20:58

I built a StreamingView like this:

struct StreamingView: UIViewRepresentable {

    func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<StreamingView>) {
        //
    }

    func makeUIView(context: UIViewRepresentableContext<StreamingView>) -> UIView {
        let view = UIView()

        let captureSession = AVCaptureSession()
        captureSession.sessionPreset = .photo

        guard let captureDevice = AVCaptureDevice.default(for: .video) else { return view }
        guard let input = try? AVCaptureDeviceInput(device: captureDevice) else { return view }
        captureSession.addInput(input)

        captureSession.startRunning()

        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        view.layer.addSublayer(previewLayer)
        previewLayer.frame = view.frame

        return view
    }
}

But it doesn't work. How can I build a pure SwiftUI View for streaming?

Best Answer

Try the demo code below.

Note: make sure you have done all the preparation, such as enabling the camera in your app's capabilities and adding NSCameraUsageDescription to Info.plist... Also, the camera can only be tested on a real device.
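For reference, the Info.plist entry mentioned above typically looks like this (the description string is just an example; use wording appropriate for your app):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to show a live video preview.</string>
```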

import SwiftUI
import UIKit
import AVFoundation

class PreviewView: UIView {
    private var captureSession: AVCaptureSession?

    init() {
        super.init(frame: .zero)

        var allowedAccess = false
        let blocker = DispatchGroup()
        blocker.enter()
        AVCaptureDevice.requestAccess(for: .video) { flag in
            allowedAccess = flag
            blocker.leave()
        }
        blocker.wait()

        if !allowedAccess {
            print("!!! NO ACCESS TO CAMERA")
            return
        }

        // setup session
        let session = AVCaptureSession()
        session.beginConfiguration()

        let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video, position: .unspecified) // alternate: AVCaptureDevice.default(for: .video)
        guard videoDevice != nil, let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice!), session.canAddInput(videoDeviceInput) else {
            print("!!! NO CAMERA DETECTED")
            return
        }
        session.addInput(videoDeviceInput)
        session.commitConfiguration()
        self.captureSession = session
    }

    override class var layerClass: AnyClass {
        AVCaptureVideoPreviewLayer.self
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }

    override func didMoveToSuperview() {
        super.didMoveToSuperview()

        if nil != self.superview {
            self.videoPreviewLayer.session = self.captureSession
            self.videoPreviewLayer.videoGravity = .resizeAspect
            self.captureSession?.startRunning()
        } else {
            self.captureSession?.stopRunning()
        }
    }
}

struct PreviewHolder: UIViewRepresentable {
    func makeUIView(context: UIViewRepresentableContext<PreviewHolder>) -> PreviewView {
        PreviewView()
    }

    func updateUIView(_ uiView: PreviewView, context: UIViewRepresentableContext<PreviewHolder>) {
    }

    typealias UIViewType = PreviewView
}

struct DemoVideoStreaming: View {
    var body: some View {
        VStack {
            PreviewHolder()
        }.frame(minWidth: 0, maxWidth: .infinity, minHeight: 0, maxHeight: .infinity, alignment: .center)
    }
}
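A minimal sketch of hosting the demo view, assuming the SwiftUI App lifecycle (iOS 14+); the `DemoApp` name is hypothetical and not part of the original answer:

```swift
import SwiftUI

// Hypothetical app entry point that presents the demo view full screen.
@main
struct DemoApp: App {
    var body: some Scene {
        WindowGroup {
            DemoVideoStreaming()
        }
    }
}
```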

Regarding "streaming - How can I use a live camera stream in SwiftUI?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/59062886/
