
ios - ChromaKey video in ARKit


I'm trying to chroma key a video in ARKit, and what I'm doing is very similar to what @Felix did here: GPUImageView inside SKScene as SKNode material - Playing transparent video on ARKit.

However, when the video is supposed to appear (in this case, when an AR reference image is detected), I get a [SceneKit] Error: Cannot get pixel buffer (CVPixelBufferRef) error and the video no longer plays. It played fine before I implemented the ChromaKeyMaterial. Here is my code, starting from the point where the AR reference image has been detected:

DispatchQueue.main.async {
    let filePath = Bundle.main.path(forResource: "wigz", ofType: "mp4")
    let videoURL = NSURL(fileURLWithPath: filePath!)
    let player = AVPlayer(url: videoURL as URL)

    let spriteKitScene = SKScene(size: CGSize(width: 640, height: 480))
    let videoSpriteKitNode = SKVideoNode(avPlayer: player)
    let videoNode = SCNNode()
    videoNode.geometry = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                                  height: imageAnchor.referenceImage.physicalSize.height)
    videoNode.eulerAngles = SCNVector3(-Float.pi/2, 0, 0)

    // Use spritekit with videonode inside
    spriteKitScene.scaleMode = .aspectFit
    videoSpriteKitNode.position = CGPoint(x: spriteKitScene.size.width / 2,
                                          y: spriteKitScene.size.height / 2)
    videoSpriteKitNode.size = spriteKitScene.size
    videoSpriteKitNode.yScale = -1.0
    videoSpriteKitNode.play()

    // Loop video
    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: .main) { _ in
        player.seek(to: kCMTimeZero)
        player.play()
    }

    spriteKitScene.addChild(videoSpriteKitNode)

    videoNode.geometry?.firstMaterial?.diffuse.contents = spriteKitScene
    videoNode.geometry?.firstMaterial?.isDoubleSided = true
    let chromaKeyMaterial = ChromaKeyMaterial()
    chromaKeyMaterial.diffuse.contents = player
    videoNode.geometry!.materials = [chromaKeyMaterial]

    node.addChildNode(videoNode)

    self.imageDetectView.scene.rootNode.addChildNode(node)
}

In the ChromaKeyMaterial.swift file, I changed these lines to:

float maskY = 0.0 * c_colorToReplace.r + 1.0 * c_colorToReplace.g + 0.0 * c_colorToReplace.b;
float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

float Y = 0.0 * textureColor.r + 1.0 * textureColor.g + 0.0 * textureColor.b;
float Cr = 0.7132 * (textureColor.r - Y);
float Cb = 0.5647 * (textureColor.b - Y);

in an attempt to key out pure green, but I'm not sure whether this is the right approach.
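
For reference, the unmodified GPUImage-based shader computes luma and chroma with the standard Rec. 601-style weights, and only c_colorToReplace is meant to change when keying a different color:

Y = 0.2989 * R + 0.5866 * G + 0.1145 * B
Cr = 0.7132 * (R - Y)
Cb = 0.5647 * (B - Y)

Zeroing the red and blue weights, as above, reduces Y to the green channel alone and changes the Cr/Cb space the shader's distance test works in; the answer below keeps the standard weights and keys out green by setting c_colorToReplace to pure green instead.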

Any help would be greatly appreciated!

Best Answer

Figured it out. I had set the wrong color to be keyed out (and in the wrong place, facepalm), and there seems to be a bug that prevents the video from playing unless you delay it slightly. That bug was supposedly fixed, but apparently it isn't.

In case anyone is interested, here is my corrected and cleaned-up code (edited to include a tip from @mnuages):

// Get Video URL and create AV Player
let filePath = Bundle.main.path(forResource: "VIDEO_FILE_NAME", ofType: "VIDEO_FILE_EXTENSION")
let videoURL = NSURL(fileURLWithPath: filePath!)
let player = AVPlayer(url: videoURL as URL)

// Create SceneKit videoNode to hold the spritekit scene.
let videoNode = SCNNode()

// Set geometry of the SceneKit node to be a plane, and rotate it to be flat with the image
videoNode.geometry = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                              height: imageAnchor.referenceImage.physicalSize.height)
videoNode.eulerAngles = SCNVector3(-Float.pi/2, 0, 0)

//Set the video AVPlayer as the contents of the video node's material.
videoNode.geometry?.firstMaterial?.diffuse.contents = player
videoNode.geometry?.firstMaterial?.isDoubleSided = true

// Alpha transparency stuff
let chromaKeyMaterial = ChromaKeyMaterial()
chromaKeyMaterial.diffuse.contents = player
videoNode.geometry!.materials = [chromaKeyMaterial]

// The video does not start without delaying the player slightly.
// Playing it any earlier just results in: [SceneKit] Error: Cannot get pixel buffer (CVPixelBufferRef)
DispatchQueue.main.asyncAfter(deadline: .now() + 0.001) {
    player.seek(to: CMTimeMakeWithSeconds(1, 1000))
    player.play()
}
// Loop video
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: .main) { _ in
    player.seek(to: kCMTimeZero)
    player.play()
}

// Add videoNode to ARAnchor
node.addChildNode(videoNode)

// Add ARAnchor node to the root node of the scene
self.imageDetectView.scene.rootNode.addChildNode(node)
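
For context, here is a minimal sketch of the delegate callback this snippet is assumed to run inside (the ViewController name is hypothetical; imageDetectView is the ARSCNView outlet used above, and the dispatch to the main queue is carried over from the question):

import ARKit
import SceneKit

extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // Only react to detected reference images (assumes detection images
        // are configured on the running ARSession).
        guard let imageAnchor = anchor as? ARImageAnchor else { return }

        DispatchQueue.main.async {
            // ... the video and chroma-key setup shown above goes here, using
            // imageAnchor for the plane size and adding videoNode under `node` ...
        }
    }
}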

And here is the chroma key material:

import SceneKit

public class ChromaKeyMaterial: SCNMaterial {

    public var backgroundColor: UIColor {
        didSet { didSetBackgroundColor() }
    }

    public var thresholdSensitivity: Float {
        didSet { didSetThresholdSensitivity() }
    }

    public var smoothing: Float {
        didSet { didSetSmoothing() }
    }

    public init(backgroundColor: UIColor = .green, thresholdSensitivity: Float = 0.50, smoothing: Float = 0.001) {

        self.backgroundColor = backgroundColor
        self.thresholdSensitivity = thresholdSensitivity
        self.smoothing = smoothing

        super.init()

        didSetBackgroundColor()
        didSetThresholdSensitivity()
        didSetSmoothing()

        // chroma key shader is based on GPUImage
        // https://github.com/BradLarson/GPUImage/blob/master/framework/Source/GPUImageChromaKeyFilter.m

        let surfaceShader =
        """
        uniform vec3 c_colorToReplace;
        uniform float c_thresholdSensitivity;
        uniform float c_smoothing;

        #pragma transparent
        #pragma body

        vec3 textureColor = _surface.diffuse.rgb;

        float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
        float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
        float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

        float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
        float Cr = 0.7132 * (textureColor.r - Y);
        float Cb = 0.5647 * (textureColor.b - Y);

        float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

        float a = blendValue;
        _surface.transparent.a = a;
        """

        //_surface.transparent.a = a;

        shaderModifiers = [
            .surface: surfaceShader,
        ]
    }

    required public init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // setting background color to be keyed out
    private func didSetBackgroundColor() {
        // getting pixel from background color
        //let rgb = backgroundColor.cgColor.components!.map{Float($0)}
        //let vector = SCNVector3(x: rgb[0], y: rgb[1], z: rgb[2])
        let vector = SCNVector3(x: 0.0, y: 1.0, z: 0.0)
        setValue(vector, forKey: "c_colorToReplace")
    }

    private func didSetSmoothing() {
        setValue(smoothing, forKey: "c_smoothing")
    }

    private func didSetThresholdSensitivity() {
        setValue(thresholdSensitivity, forKey: "c_thresholdSensitivity")
    }
}
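
A usage note: didSetBackgroundColor() above hardcodes pure green, so the backgroundColor passed to the initializer is effectively ignored. Below is a minimal sketch (assuming the player and videoNode from the answer's snippet are in scope), plus one way the commented-out lines could be restored to key an arbitrary color, assuming the UIColor's cgColor exposes RGB components:

// Keying out green, as in the answer above:
let chromaKeyMaterial = ChromaKeyMaterial(backgroundColor: .green,
                                          thresholdSensitivity: 0.50,
                                          smoothing: 0.001)
chromaKeyMaterial.diffuse.contents = player
videoNode.geometry!.materials = [chromaKeyMaterial]

// To key a different color, didSetBackgroundColor() would need to use the
// commented-out lines instead of the hardcoded green vector, e.g.:
//
//     let rgb = backgroundColor.cgColor.components!.map { Float($0) }
//     let vector = SCNVector3(x: rgb[0], y: rgb[1], z: rgb[2])
//     setValue(vector, forKey: "c_colorToReplace")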

This post on ChromaKey video in ARKit is based on a similar question found on Stack Overflow: https://stackoverflow.com/questions/49960262/
