I have an app that plays a video selected from the user's library. What I eventually want the app to do is render an overlay onto the video (while it plays) and then output the result to a new media file. To do that, I need to capture the decoded frames so that, once playback has finished, I can render this overlay and write the result out to a file.
This is my first app using AVFoundation, and I've spent a day or two trying to work out how to achieve this via Google and the Apple documentation. I think I'm onto something with the AVPlayerItemVideoOutput object; however, the delegate callback never fires.
I read that the AVPlayerItemVideoOutput has to be created after the AVPlayerItem has reached the readyToPlay state, so in the initializer of my PlayerUIView I add an observer to the AVPlayerItem to watch its status.
init(frame: CGRect, url: Binding<URL?>) {
    _url = url

    // Setup the player
    player = AVPlayer(url: url.wrappedValue!)
    super.init(frame: frame)
    playerLayer.player = player
    playerLayer.videoGravity = .resizeAspect
    layer.addSublayer(playerLayer)

    //displayLink = CADisplayLink()

    // Setup looping
    player.actionAtItemEnd = .none
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(playerItemDidReachEnd(notification:)),
                                           name: .AVPlayerItemDidPlayToEndTime,
                                           object: player.currentItem)
    player.currentItem?.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)

    // Start the movie
    player.play()
}
In the middle I create a CADisplayLink (commented out), because I've seen it can somehow be used for this purpose, but I'm not entirely sure how it should be used or what it does. Also, judging by the name, it grabs frames from the video being displayed rather than from the actually decoded video frames, which are what I want.
When the status is first set to readyToPlay, I create and add the AVPlayerItemVideoOutput.
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if let item = object as? AVPlayerItem {
        if item.status == AVPlayerItem.Status.readyToPlay && item.outputs.count == 0 {
            let settings = [ String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_24RGB ]
            let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)
            output.setDelegate(PlayerOutput(output: output), queue: DispatchQueue(label: ""))
            player.currentItem?.add(output)
        }
    }
}
On the delegate, PlayerOutput, I expect to be notified whenever a new frame becomes available; at that point I would use the AVPlayerItemVideoOutput object to access the pixel buffer.
class PlayerOutput : NSObject, AVPlayerItemOutputPullDelegate {
    func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
        let videoOutput = sender as! AVPlayerItemVideoOutput
        let newPixelBuff = videoOutput.hasNewPixelBuffer(forItemTime: CMTime(seconds: 1, preferredTimescale: 10))
    }
}
However, this callback never happens. I set breakpoints in the code and they are never hit. Based on the naming and on similar code elsewhere in AVFoundation, I assumed it would fire for every new frame so that I could access that frame in the buffer, but I never see it happen. Am I missing something or doing something wrong?
I have the feeling I'm not quite using or understanding these classes and what they are for. They are similar in naming to classes like AVCaptureVideoDataOutput, which I have implemented successfully elsewhere in the app, yet they don't seem to work the same way. It's hard to find any examples of what I'm trying to do with AVPlayer.
Edit: a working example of the current code:
import SwiftUI
import AVFoundation
struct CustomCameraPhotoView: View {
@State private var image: Image?
@State private var showingCustomCamera = false
@State private var showImagePicker = false
@State private var inputImage: UIImage?
@State private var url: URL?
var body: some View {
ZStack {
if url != nil
{
PlayerView(url: $url)
}
else
{
Button(action: {
self.showImagePicker = true
}) {
Text("Select a Video").foregroundColor(.white).font(.headline)
}
}
}.edgesIgnoringSafeArea(.all)
.sheet(isPresented: $showImagePicker,
onDismiss: loadImage) {
PhotoCaptureView(showImagePicker: self.$showImagePicker, image: self.$image, url: self.$url)
}.edgesIgnoringSafeArea(.leading).edgesIgnoringSafeArea(.trailing)
}
func loadImage() {
guard let inputImage = inputImage else { return }
image = Image(uiImage: inputImage)
}
}
struct PlayerView: UIViewControllerRepresentable {
@Binding var url: URL?
func updateUIViewController(_ uiView: UIViewController, context: UIViewControllerRepresentableContext<PlayerView>) {
}
func makeCoordinator() -> PlayerCoordinator{
//Make Coordinator which will communicate with the ImagePickerViewController
PlayerCoordinator()
}
func makeUIViewController(context: Context) -> UIViewController {
let view = PlayerUIView(frame: .zero, url: $url)
let controller = PlayerController()
controller.view = view
return controller
}
}
class PlayerCoordinator : NSObject, UINavigationControllerDelegate {
}
class PlayerController: UIViewController {
override var shouldAutorotate: Bool {
return false
}
override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
return .all
}
}
class PlayerUIView: UIView {
private let playerLayer = AVPlayerLayer()
private var playerOutput = PlayerOutput()
private let _myVideoOutputQueue = DispatchQueue(label: "VideoFrames", qos: .background, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)
var displayLink: CADisplayLink?
var player: AVPlayer
@Binding var url: URL?
required init?(coder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
init(frame: CGRect, url: Binding<URL?>) {
_url = url
// Setup the player
player = AVPlayer(url: url.wrappedValue!)
super.init(frame: frame)
playerLayer.player = player
playerLayer.videoGravity = .resizeAspect
layer.addSublayer(playerLayer)
let settings = [ String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA ]
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)
output.setDelegate(self.playerOutput, queue: self._myVideoOutputQueue)
player.currentItem?.add(output)
//displayLink = CADisplayLink()
// Setup looping
player.actionAtItemEnd = .none
NotificationCenter.default.addObserver(self,
selector: #selector(playerItemDidReachEnd(notification:)),
name: .AVPlayerItemDidPlayToEndTime,
object: player.currentItem)
player.currentItem?.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)
// Start the movie
player.play()
}
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
if let item = object as? AVPlayerItem {
if item.status == AVPlayerItem.Status.readyToPlay && item.outputs.count == 0 {
}
}
}
@objc
func playerItemDidReachEnd(notification: Notification) {
self.url = nil
}
override func layoutSubviews() {
super.layoutSubviews()
playerLayer.frame = bounds
}
class PlayerOutput : NSObject, AVPlayerItemOutputPullDelegate {
func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
let videoOutput = sender as! AVPlayerItemVideoOutput
let newPixelBuff = videoOutput.hasNewPixelBuffer(forItemTime: CMTime(seconds: 1, preferredTimescale: 10))
}
}
}
struct ImagePicker : UIViewControllerRepresentable {
@Binding var isShown : Bool
@Binding var image : Image?
@Binding var url : URL?
func updateUIViewController(_ uiViewController: UIImagePickerController, context: UIViewControllerRepresentableContext<ImagePicker>)
{
//Update UIViewController Method
}
func makeCoordinator() -> ImagePickerCoordinator{
//Make Coordinator which will communicate with the ImagePickerViewController
ImagePickerCoordinator(isShown: $isShown, image: $image, url: $url)
}
func makeUIViewController(context: UIViewControllerRepresentableContext<ImagePicker>) -> UIImagePickerController
{
let picker = UIImagePickerController()
picker.sourceType = .photoLibrary
picker.delegate = context.coordinator
picker.mediaTypes = ["public.movie"]
picker.videoQuality = .typeHigh
return picker
}
}
class ImagePickerCoordinator : NSObject, UINavigationControllerDelegate, UIImagePickerControllerDelegate{
@Binding var isShown : Bool
@Binding var image : Image?
@Binding var url: URL?
init(isShown : Binding<Bool>, image: Binding<Image?>, url: Binding<URL?>) {
_isShown = isShown
_image = image
_url = url
}
//Selected Image
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
let uiImage = info[UIImagePickerController.InfoKey.mediaURL] as! URL
url = uiImage
//image = Image(uiImage: uiImage)
isShown = false
}
//Image selection got cancel
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
isShown = false
}
}
struct PhotoCaptureView: View {
@Binding var showImagePicker : Bool
@Binding var image : Image?
@Binding var url : URL?
var body: some View {
ImagePicker(isShown: $showImagePicker, image: $image, url: $url)
}
}
struct ContentView_Previews: PreviewProvider {
static var previews: some View {
CustomCameraPhotoView()
}
}
Best answer
Edit: You are right to suspect that outputMediaDataWillChange is an entirely different beast from AVCaptureVideoDataOutput. It is not called once per frame; rather, it signals some change in the player's state (I'm a little unclear on exactly which). There are (at least) two options for reading and modifying the video output.
1. Add a videoComposition to the current playerItem, inside which you can use Core Image tools to adjust the video (for example, to add an overlay). In this example I apply a simple blur filter to the output, but Core Image lets you do much more complex things.
let blurFilter = CIFilter(name: "CIGaussianBlur")
if let playerItem = player.currentItem {
    let asset = playerItem.asset
    playerItem.videoComposition = AVMutableVideoComposition(asset: asset) { (filteringRequest) in
        let source = filteringRequest.sourceImage
        blurFilter?.setValue(source, forKey: kCIInputImageKey)
        filteringRequest.finish(with: blurFilter?.outputImage ?? source, context: nil)
    }
}
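Since the original goal is an overlay rather than a blur, the same videoComposition hook can composite a prepared CIImage over each frame. Here is a minimal sketch under that assumption; overlayImage is a hypothetical CIImage you would build yourself (for example from a rendered UIImage), and the compositing uses the standard composited(over:) / cropped(to:) Core Image calls:
// Hedged sketch: composite a prepared overlay (CIImage) over every video frame.
// `overlayImage` is a hypothetical CIImage supplied elsewhere in your code.
if let playerItem = player.currentItem {
    playerItem.videoComposition = AVMutableVideoComposition(asset: playerItem.asset) { request in
        let background = request.sourceImage
        let composite = overlayImage
            .composited(over: background)   // source-over compositing
            .cropped(to: background.extent) // keep the output the size of the frame
        request.finish(with: composite, context: nil)
    }
}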
2. The CADisplayLink is indeed the right choice here. A display link lets you synchronize your work with the display refresh rate. You can grab frames from the video output like this:
lazy var displayLink: CADisplayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidRefresh(link:)))

init(frame: CGRect, url: Binding<URL?>) {
    ...
    // activate the displayLink
    displayLink.add(to: .main, forMode: .common)
    ...
}

@objc func displayLinkDidRefresh(link: CADisplayLink) {
    guard let videoOutput = self.videoOutput else { return }
    let itemTime = player.currentTime()
    if videoOutput.hasNewPixelBuffer(forItemTime: itemTime) {
        var presentationItemTime: CMTime = .zero
        if let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: &presentationItemTime) {
            // process the pixelbuffer here
        }
    }
}
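As a hedged illustration of what "process the pixelbuffer here" could look like, the CVPixelBuffer can be wrapped in a CIImage for compositing or rendering; everything past the CIImage(cvPixelBuffer:) call is just one possible direction, not part of the original answer (this variant also assumes a non-optional videoOutput property, as in the full example below):
@objc func displayLinkDidRefresh(link: CADisplayLink) {
    let itemTime = player.currentTime()
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime) else { return }

    var presentationItemTime: CMTime = .zero
    guard let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                        itemTimeForDisplay: &presentationItemTime) else { return }

    // Wrap the decoded frame in a CIImage; from here you could draw an overlay
    // on top of it, or hand the buffer to an AVAssetWriterInputPixelBufferAdaptor.
    let frame = CIImage(cvPixelBuffer: pixelBuffer)
    print("frame at \(presentationItemTime.seconds)s, extent \(frame.extent)")
}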
A complete minimal example:
import SwiftUI
import AVFoundation
struct CustomCameraPhotoView: View {
@State private var image: Image?
@State private var showingCustomCamera = false
@State private var showImagePicker = false
@State private var inputImage: UIImage?
@State private var url: URL?
var body: some View {
ZStack {
if url != nil {
PlayerView(url: $url)
} else {
Button(action: {
self.showImagePicker = true
}) {
Text("Select a Video")
.font(.headline)
}
}
} .edgesIgnoringSafeArea(.all)
.sheet(isPresented: $showImagePicker,
onDismiss: loadImage) {
PhotoCaptureView(showImagePicker: self.$showImagePicker, image: self.$image, url: self.$url)
}
.edgesIgnoringSafeArea(.leading).edgesIgnoringSafeArea(.trailing)
}
func loadImage() {
guard let inputImage = inputImage else { return }
image = Image(uiImage: inputImage)
}
}
struct PlayerView: UIViewControllerRepresentable {
@Binding var url: URL?
func updateUIViewController(_ uiView: UIViewController, context: UIViewControllerRepresentableContext<PlayerView>) {
}
func makeUIViewController(context: Context) -> UIViewController {
let view = PlayerUIView(frame: .zero, url: $url)
let controller = PlayerController()
controller.view = view
return controller
}
}
class PlayerController: UIViewController {
override var shouldAutorotate: Bool { false }
override var supportedInterfaceOrientations: UIInterfaceOrientationMask { .all }
}
class PlayerUIView: UIView {
private let playerLayer = AVPlayerLayer()
private let _myVideoOutputQueue = DispatchQueue(label: "VideoFrames", qos: .background, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)
lazy var displayLink: CADisplayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidRefresh(link:)))
var player: AVPlayer
var videoOutput: AVPlayerItemVideoOutput
@Binding var url: URL?
required init?(coder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
init(frame: CGRect, url: Binding<URL?>) {
_url = url
// Setup the player
player = AVPlayer(url: url.wrappedValue!)
let settings = [ String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA ]
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)
self.videoOutput = output
super.init(frame: frame)
playerLayer.player = player
playerLayer.videoGravity = .resizeAspect
layer.addSublayer(playerLayer)
attachVideoComposition()
player.currentItem?.add(output)
displayLink.add(to: .main, forMode: .common)
// Start the movie
player.play()
}
private func attachVideoComposition() {
let blurFilter = CIFilter(name: "CIGaussianBlur")
if let playerItem = player.currentItem {
let asset = playerItem.asset
playerItem.videoComposition = AVMutableVideoComposition(asset: asset) { (filteringRequest) in
let source = filteringRequest.sourceImage
blurFilter?.setValue(source, forKey: kCIInputImageKey)
// Apply Core Image processing here
filteringRequest.finish(with: blurFilter?.outputImage ?? source, context: nil)
}
}
}
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
if let item = object as? AVPlayerItem {
if item.status == AVPlayerItem.Status.readyToPlay && item.outputs.count == 0 {
}
}
}
override func layoutSubviews() {
super.layoutSubviews()
playerLayer.frame = bounds
}
@objc func displayLinkDidRefresh(link: CADisplayLink) {
let itemTime = player.currentTime()
if videoOutput.hasNewPixelBuffer(forItemTime: itemTime) {
var presentationItemTime: CMTime = .zero
if let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: &presentationItemTime) {
// process the pixelbuffer here
print(pixelBuffer)
}
}
}
}
struct ImagePicker : UIViewControllerRepresentable {
@Binding var isShown : Bool
@Binding var image : Image?
@Binding var url : URL?
func updateUIViewController(_ uiViewController: UIImagePickerController, context: UIViewControllerRepresentableContext<ImagePicker>) {
//Update UIViewController Method
}
func makeCoordinator() -> ImagePickerCoordinator{
ImagePickerCoordinator(isShown: $isShown, image: $image, url: $url)
}
func makeUIViewController(context: UIViewControllerRepresentableContext<ImagePicker>) -> UIImagePickerController {
let picker = UIImagePickerController()
picker.sourceType = .photoLibrary
picker.delegate = context.coordinator
picker.mediaTypes = ["public.movie"]
picker.videoQuality = .typeHigh
return picker
}
}
class ImagePickerCoordinator : NSObject, UINavigationControllerDelegate, UIImagePickerControllerDelegate {
@Binding var isShown : Bool
@Binding var image : Image?
@Binding var url: URL?
init(isShown : Binding<Bool>, image: Binding<Image?>, url: Binding<URL?>) {
_isShown = isShown
_image = image
_url = url
}
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
url = info[UIImagePickerController.InfoKey.mediaURL] as? URL
isShown = false
}
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
isShown = false
}
}
struct PhotoCaptureView: View {
@Binding var showImagePicker : Bool
@Binding var image : Image?
@Binding var url : URL?
var body: some View {
ImagePicker(isShown: $showImagePicker, image: $image, url: $url)
}
}
struct ContentView_Previews: PreviewProvider {
static var previews: some View {
CustomCameraPhotoView()
}
}
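Finally, since the question also asks about writing the overlaid result to a new media file: the same video composition can be reused for export with AVAssetExportSession. A minimal sketch, assuming you already have an asset, a composition, and a destination outputURL (all names here are placeholders):
// Hedged sketch: export the filtered/overlaid video to a new file.
func export(asset: AVAsset, composition: AVVideoComposition, to outputURL: URL) {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else { return }
    session.videoComposition = composition
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously {
        // Inspect session.status / session.error here.
        print("Export finished with status \(session.status.rawValue)")
    }
}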
Original answer: delegates are always(?) defined as weak members; see the documentation. Your PlayerOutput object goes out of scope before the delegate is ever called. Keep the PlayerOutput alive as a member of some object that lives for the duration of playback, and your code should work as-is.
Regarding "ios - Swift - Accessing decoded frames using AVPlayerItemVideoOutput: outputMediaDataWillChange is not called", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62537172/