ios - Swift - Accessing decoded frames with AVPlayerItemVideoOutput: outputMediaDataWillChange is not called

I have an app that plays back a video selected from the user's library. What I ultimately want the app to do is render an overlay onto the video (while it plays) and then write the result out to a new media file. To do that, I need to capture the decoded frames so that, once playback finishes, I can render this overlay and write the output to a file.

This is my first app using AVFoundation, and I've spent a day or two trying to work out how to achieve this via Google and the Apple documentation. I think AVPlayerItemVideoOutput has what I need, but the delegate callback never executes.

I found that the AVPlayerItemVideoOutput has to be created after the AVPlayerItem reaches the readyToPlay state, so in my PlayerUIView's initializer I add an observer to the AVPlayerItem to watch its status.

init(frame: CGRect, url: Binding<URL?>) {
    _url = url
    // Setup the player
    player = AVPlayer(url: url.wrappedValue!)
    super.init(frame: frame)

    playerLayer.player = player
    playerLayer.videoGravity = .resizeAspect
    layer.addSublayer(playerLayer)

    //displayLink = CADisplayLink()

    // Setup looping
    player.actionAtItemEnd = .none
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(playerItemDidReachEnd(notification:)),
                                           name: .AVPlayerItemDidPlayToEndTime,
                                           object: player.currentItem)

    player.currentItem?.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)

    // Start the movie
    player.play()
}

In the middle I create a CADisplayLink (commented out), because I've seen that it can somehow be used for this purpose, but I'm not entirely sure how, or what it should do. Also, judging by the name, it grabs frames from the displayed video rather than from the actual decoded video frames, which are what I want.

When the status is first set to readyToPlay, I create and add the AVPlayerItemVideoOutput:

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if let item = object as? AVPlayerItem {
        if item.status == AVPlayerItem.Status.readyToPlay && item.outputs.count == 0 {
            let settings = [ String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_24RGB ]
            let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)

            output.setDelegate(PlayerOutput(output: output), queue: DispatchQueue(label: ""))

            player.currentItem?.add(output)
        }
    }
}

On the delegate, PlayerOutput, I expect to be notified whenever a new frame becomes available; at that point I would go back to the AVPlayerItemVideoOutput object to access the pixel buffer.

class PlayerOutput : NSObject, AVPlayerItemOutputPullDelegate {

    func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
        let videoOutput = sender as! AVPlayerItemVideoOutput
        let newPixelBuff = videoOutput.hasNewPixelBuffer(forItemTime: CMTime(seconds: 1, preferredTimescale: 10))
    }
}

However, this callback never happens. I've set breakpoints in the code that are never hit. Based on the naming, and on similar code elsewhere in AVFoundation, I assumed it would be hit for every new frame so that I could access the frame from the buffer, but I'm not seeing that happen at all. Am I missing something or doing something wrong?

I have a feeling I'm not quite using/understanding these classes and what they're for, but the nomenclature is similar to classes like AVCaptureVideoDataOutput, which I have implemented successfully elsewhere in the app, and they don't seem to work quite the same way. It's been hard to find any examples of what I'm trying to do with AVPlayer.

Edit: a working example of the current code:

import SwiftUI
import AVFoundation

struct CustomCameraPhotoView: View {

@State private var image: Image?
@State private var showingCustomCamera = false
@State private var showImagePicker = false
@State private var inputImage: UIImage?
@State private var url: URL?

var body: some View {
ZStack {
if url != nil
{
PlayerView(url: $url)
}
else
{
Button(action: {
self.showImagePicker = true
}) {
Text("Select a Video").foregroundColor(.white).font(.headline)
}

}
}.edgesIgnoringSafeArea(.all)
.sheet(isPresented: $showImagePicker,
onDismiss: loadImage) {
PhotoCaptureView(showImagePicker: self.$showImagePicker, image: self.$image, url: self.$url)
}.edgesIgnoringSafeArea(.leading).edgesIgnoringSafeArea(.trailing)
}
func loadImage() {
guard let inputImage = inputImage else { return }
image = Image(uiImage: inputImage)
}
}

struct PlayerView: UIViewControllerRepresentable {

@Binding var url: URL?

func updateUIViewController(_ uiView: UIViewController, context: UIViewControllerRepresentableContext<PlayerView>) {
}

func makeCoordinator() -> PlayerCoordinator{
//Make Coordinator which will communicate with the ImagePickerViewController
PlayerCoordinator()
}

func makeUIViewController(context: Context) -> UIViewController {
let view = PlayerUIView(frame: .zero, url: $url)
let controller = PlayerController()
controller.view = view

return controller
}
}

class PlayerCoordinator : NSObject, UINavigationControllerDelegate {

}

class PlayerController: UIViewController {
override var shouldAutorotate: Bool {
return false
}

override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
return .all
}
}

class PlayerUIView: UIView {
private let playerLayer = AVPlayerLayer()
private var playerOutput = PlayerOutput()
private let _myVideoOutputQueue = DispatchQueue(label: "VideoFrames", qos: .background, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)

var displayLink: CADisplayLink?
var player: AVPlayer

@Binding var url: URL?

required init?(coder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}

init(frame: CGRect, url: Binding<URL?>) {
_url = url
// Setup the player
player = AVPlayer(url: url.wrappedValue!)
super.init(frame: frame)

playerLayer.player = player
playerLayer.videoGravity = .resizeAspect
layer.addSublayer(playerLayer)

let settings = [ String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA ]
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)

output.setDelegate(self.playerOutput, queue: self._myVideoOutputQueue)

player.currentItem?.add(output)

//displayLink = CADisplayLink()

// Setup looping
player.actionAtItemEnd = .none
NotificationCenter.default.addObserver(self,
selector: #selector(playerItemDidReachEnd(notification:)),
name: .AVPlayerItemDidPlayToEndTime,
object: player.currentItem)

player.currentItem?.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)

// Start the movie
player.play()
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
if let item = object as? AVPlayerItem {
if item.status == AVPlayerItem.Status.readyToPlay && item.outputs.count == 0 {

}
}
}

@objc
func playerItemDidReachEnd(notification: Notification) {
self.url = nil
}

override func layoutSubviews() {
super.layoutSubviews()
playerLayer.frame = bounds
}

class PlayerOutput : NSObject, AVPlayerItemOutputPullDelegate {

func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
let videoOutput = sender as! AVPlayerItemVideoOutput
let newPixelBuff = videoOutput.hasNewPixelBuffer(forItemTime: CMTime(seconds: 1, preferredTimescale: 10))
}
}
}

struct ImagePicker : UIViewControllerRepresentable {
@Binding var isShown : Bool
@Binding var image : Image?
@Binding var url : URL?

func updateUIViewController(_ uiViewController: UIImagePickerController, context: UIViewControllerRepresentableContext<ImagePicker>)
{
//Update UIViewController method
}
func makeCoordinator() -> ImagePickerCoordinator{
//Make Coordinator which will communicate with the ImagePickerViewController
ImagePickerCoordinator(isShown: $isShown, image: $image, url: $url)
}
func makeUIViewController(context: UIViewControllerRepresentableContext<ImagePicker>) -> UIImagePickerController
{
let picker = UIImagePickerController()
picker.sourceType = .photoLibrary
picker.delegate = context.coordinator
picker.mediaTypes = ["public.movie"]
picker.videoQuality = .typeHigh
return picker
}
}

class ImagePickerCoordinator : NSObject, UINavigationControllerDelegate, UIImagePickerControllerDelegate{
@Binding var isShown : Bool
@Binding var image : Image?
@Binding var url: URL?
init(isShown : Binding<Bool>, image: Binding<Image?>, url: Binding<URL?>) {
_isShown = isShown
_image = image
_url = url
}
//Selected Image
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
let mediaURL = info[UIImagePickerController.InfoKey.mediaURL] as! URL
url = mediaURL
isShown = false
}
//Image selection got cancel
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
isShown = false
}
}

struct PhotoCaptureView: View {
@Binding var showImagePicker : Bool
@Binding var image : Image?
@Binding var url : URL?

var body: some View {
ImagePicker(isShown: $showImagePicker, image: $image, url: $url)
}
}

struct ContentView_Previews: PreviewProvider {
static var previews: some View {
CustomCameraPhotoView()
}
}

Best answer

Edit: You are right to suspect that outputMediaDataWillChange and AVCaptureVideoDataOutput are completely different beasts. It is not called per frame; instead it signals some change in player state (I'm a little unclear on exactly which). There are (at least) two options for reading and modifying the video output.
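As a side note on the pull-delegate path (my own understanding, not part of the original answer): outputMediaDataWillChange only appears to fire after you explicitly ask the output for a notification via requestNotificationOfMediaDataChange(withAdvanceInterval:), which is intended for pausing/resuming a polling loop rather than per-frame delivery. A minimal sketch, assuming a hypothetical OutputObserver that owns the polling display link:

```swift
import AVFoundation

final class OutputObserver: NSObject, AVPlayerItemOutputPullDelegate {
    // Called e.g. when polling has found no new frames for a while.
    func suspendPolling(output: AVPlayerItemVideoOutput, displayLink: CADisplayLink) {
        // Stop polling while no new frames are expected...
        displayLink.isPaused = true
        // ...and ask the output to call the delegate back shortly
        // before media data becomes available again.
        output.requestNotificationOfMediaDataChange(withAdvanceInterval: 0.1)
    }

    func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
        // NOT a per-frame callback: it signals that frames will be
        // available again, so resume the display link here
        // (on the main thread in real code).
    }
}
```

This matches the pattern of suspending a CADisplayLink when idle; without the explicit request, the delegate is never called, which would explain the breakpoints never being hit.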

  1. You can add a videoComposition to the current playerItem, in which you can use CoreImage tools to adjust the video (e.g. add an overlay). In this example I apply a simple blur filter to the output, but CoreImage lets you do far more complex things.
let blurFilter = CIFilter(name: "CIGaussianBlur")
if let playerItem = player.currentItem {
    let asset = playerItem.asset

    playerItem.videoComposition = AVMutableVideoComposition(asset: asset) { (filteringRequest) in
        let source = filteringRequest.sourceImage
        blurFilter?.setValue(source, forKey: kCIInputImageKey)

        filteringRequest.finish(with: blurFilter?.outputImage ?? source, context: nil)
    }
}
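Since the asker's end goal is writing the overlaid video to a new file, it's worth noting that the same kind of AVVideoComposition can (to the best of my knowledge) be handed to an AVAssetExportSession for offline rendering. A hedged sketch; the function name and output URL are placeholders, not part of the original answer:

```swift
import AVFoundation

// Sketch: export an asset with the same composition that was used for playback.
func exportWithComposition(asset: AVAsset,
                           composition: AVVideoComposition,
                           to outputURL: URL) {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else {
        return
    }
    session.videoComposition = composition   // same filter/overlay as playback
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously {
        // Inspect session.status / session.error here.
    }
}
```

This avoids having to capture decoded frames manually at all when the overlay can be expressed as a CoreImage operation.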
  2. If you need the actual pixel buffers, then CADisplayLink is the way to go. A display link lets you synchronize an action with the display's refresh rate. You can grab frames from the video output like this:

lazy var displayLink: CADisplayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidRefresh(link:)))

init(frame: CGRect, url: Binding<URL?>) {
    ...
    // activate the displayLink
    displayLink.add(to: .main, forMode: .common)
    ...
}

@objc func displayLinkDidRefresh(link: CADisplayLink) {
    guard let videoOutput = self.videoOutput else { return }

    let itemTime = player.currentTime()
    if videoOutput.hasNewPixelBuffer(forItemTime: itemTime) {
        var presentationItemTime: CMTime = .zero
        if let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: &presentationItemTime) {

            // process the pixelbuffer here
        }
    }
}
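For the "process the pixelbuffer here" step: as far as I know, the buffer's base address must be locked before reading its bytes on the CPU. A minimal sketch of inspecting the copied buffer (the function name is my own, not from the answer):

```swift
import CoreVideo

// Sketch: read basic pixel data from a CVPixelBuffer on the CPU.
func inspect(_ pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    // With the kCVPixelFormatType_32BGRA attributes used above,
    // baseAddress points at the first row of BGRA pixels.
    let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
    print(width, height, bytesPerRow, baseAddress as Any)
}
```

For rendering an overlay and writing to a file, each locked buffer could then be fed to an AVAssetWriterInputPixelBufferAdaptor, though the videoComposition route above is simpler when it fits.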

    Full minimal example:

    import SwiftUI
    import AVFoundation

    struct CustomCameraPhotoView: View {

    @State private var image: Image?
    @State private var showingCustomCamera = false
    @State private var showImagePicker = false
    @State private var inputImage: UIImage?
    @State private var url: URL?

    var body: some View {
    ZStack {
    if url != nil {
    PlayerView(url: $url)
    } else {
    Button(action: {
    self.showImagePicker = true
    }) {
    Text("Select a Video")
    .font(.headline)
    }
    }
    } .edgesIgnoringSafeArea(.all)
    .sheet(isPresented: $showImagePicker,
    onDismiss: loadImage) {
    PhotoCaptureView(showImagePicker: self.$showImagePicker, image: self.$image, url: self.$url)
    }
    .edgesIgnoringSafeArea(.leading).edgesIgnoringSafeArea(.trailing)
    }

    func loadImage() {
    guard let inputImage = inputImage else { return }
    image = Image(uiImage: inputImage)
    }
    }

    struct PlayerView: UIViewControllerRepresentable {

    @Binding var url: URL?

    func updateUIViewController(_ uiView: UIViewController, context: UIViewControllerRepresentableContext<PlayerView>) {
    }

    func makeUIViewController(context: Context) -> UIViewController {
    let view = PlayerUIView(frame: .zero, url: $url)
    let controller = PlayerController()
    controller.view = view

    return controller
    }
    }

    class PlayerController: UIViewController {
    override var shouldAutorotate: Bool { false }
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask { .all }
    }

    class PlayerUIView: UIView {
    private let playerLayer = AVPlayerLayer()
    private let _myVideoOutputQueue = DispatchQueue(label: "VideoFrames", qos: .background, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)

    lazy var displayLink: CADisplayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidRefresh(link:)))
    var player: AVPlayer
    var videoOutput: AVPlayerItemVideoOutput

    @Binding var url: URL?

    required init?(coder: NSCoder) {
    fatalError("init(coder:) has not been implemented")
    }

    init(frame: CGRect, url: Binding<URL?>) {
    _url = url
    // Setup the player
    player = AVPlayer(url: url.wrappedValue!)

    let settings = [ String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA ]
    let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)
    self.videoOutput = output

    super.init(frame: frame)

    playerLayer.player = player
    playerLayer.videoGravity = .resizeAspect
    layer.addSublayer(playerLayer)

    attachVideoComposition()

    player.currentItem?.add(output)
    displayLink.add(to: .main, forMode: .common)

    // Start the movie
    player.play()
    }

    private func attachVideoComposition() {
    let blurFilter = CIFilter(name: "CIGaussianBlur")
    if let playerItem = player.currentItem {
    let asset = playerItem.asset

    playerItem.videoComposition = AVMutableVideoComposition(asset: asset) { (filteringRequest) in
    let source = filteringRequest.sourceImage
    blurFilter?.setValue(source, forKey: kCIInputImageKey)

    // Apply CoreImage processing here

    filteringRequest.finish(with: blurFilter?.outputImage ?? source, context: nil)
    }
    }
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if let item = object as? AVPlayerItem {
    if item.status == AVPlayerItem.Status.readyToPlay && item.outputs.count == 0 {

    }
    }
    }

    override func layoutSubviews() {
    super.layoutSubviews()
    playerLayer.frame = bounds
    }

    @objc func displayLinkDidRefresh(link: CADisplayLink) {
    let itemTime = player.currentTime()
    if videoOutput.hasNewPixelBuffer(forItemTime: itemTime) {
    var presentationItemTime: CMTime = .zero
    if let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: &presentationItemTime) {

    // process the pixelbuffer here
    print(pixelBuffer)
    }
    }
    }
    }

    struct ImagePicker : UIViewControllerRepresentable {
    @Binding var isShown : Bool
    @Binding var image : Image?
    @Binding var url : URL?

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: UIViewControllerRepresentableContext<ImagePicker>) {
    //Update UIViewController method
    }

    func makeCoordinator() -> ImagePickerCoordinator{
    ImagePickerCoordinator(isShown: $isShown, image: $image, url: $url)
    }

    func makeUIViewController(context: UIViewControllerRepresentableContext<ImagePicker>) -> UIImagePickerController {

    let picker = UIImagePickerController()
    picker.sourceType = .photoLibrary
    picker.delegate = context.coordinator
    picker.mediaTypes = ["public.movie"]
    picker.videoQuality = .typeHigh
    return picker
    }
    }

    class ImagePickerCoordinator : NSObject, UINavigationControllerDelegate, UIImagePickerControllerDelegate {

    @Binding var isShown : Bool
    @Binding var image : Image?
    @Binding var url: URL?

    init(isShown : Binding<Bool>, image: Binding<Image?>, url: Binding<URL?>) {
    _isShown = isShown
    _image = image
    _url = url
    }

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {

    url = info[UIImagePickerController.InfoKey.mediaURL] as? URL

    isShown = false
    }

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
    isShown = false
    }
    }

    struct PhotoCaptureView: View {
    @Binding var showImagePicker : Bool
    @Binding var image : Image?
    @Binding var url : URL?

    var body: some View {
    ImagePicker(isShown: $showImagePicker, image: $image, url: $url)
    }
    }

    struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
    CustomCameraPhotoView()
    }
    }

    Original answer: Delegates are always(?) defined as weak members — see the documentation. Your PlayerOutput object goes out of scope before the delegate is ever called. Keep the PlayerOutput as a member of some object that stays alive during playback, and your code should work as is.

    For ios - Swift - Accessing decoded frames with AVPlayerItemVideoOutput: outputMediaDataWillChange is not called, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62537172/
