ios - ARKit/SpriteKit - Setting pixelBufferAttributes on an SKVideoNode, or another way to make pixels in a video transparent (chroma-key effect)

Reposted · Author: IT王子 · Updated: 2023-10-29 05:52:16

My goal is to present a 2D animated character in a real-world environment using ARKit. The animated character is part of a video, shown in the following snapshot of it:

Snapshot from the video

Displaying the video itself is no problem, using this code:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

The result of this code is displayed, as expected, in the app screenshot below:

App screenshot #1

But as you can see, the character's background isn't great, so I need to make it disappear in order to create the illusion that the character is actually standing on the horizontal plane. I tried to achieve this by applying a chroma-key effect to the video.

  • For those unfamiliar with chroma keying: it is the name of the "green screen effect" sometimes seen on TV, which makes a chosen color transparent.

My approach to the chroma-key effect is to create a custom filter based on the "CIColorCube" CIFilter, and then apply the filter to the video using an AVVideoComposition.

First, the code for creating the filter:

func RGBtoHSV(r: Float, g: Float, b: Float) -> (h: Float, s: Float, v: Float) {
    var h: CGFloat = 0
    var s: CGFloat = 0
    var v: CGFloat = 0
    let col = UIColor(red: CGFloat(r), green: CGFloat(g), blue: CGFloat(b), alpha: 1.0)
    col.getHue(&h, saturation: &s, brightness: &v, alpha: nil)
    return (Float(h), Float(s), Float(v))
}

func colorCubeFilterForChromaKey(hueAngle: Float) -> CIFilter {

    let hueRange: Float = 20 // degrees of the hue circle ("pie slice") that we want to replace
    let minHueAngle: Float = (hueAngle - hueRange / 2.0) / 360
    let maxHueAngle: Float = (hueAngle + hueRange / 2.0) / 360

    let size = 64
    var cubeData = [Float](repeating: 0, count: size * size * size * 4)
    var rgb: [Float] = [0, 0, 0]
    var hsv: (h: Float, s: Float, v: Float)
    var offset = 0

    for z in 0 ..< size {
        rgb[2] = Float(z) / Float(size) // blue value
        for y in 0 ..< size {
            rgb[1] = Float(y) / Float(size) // green value
            for x in 0 ..< size {
                rgb[0] = Float(x) / Float(size) // red value
                hsv = RGBtoHSV(r: rgb[0], g: rgb[1], b: rgb[2])
                // TODO: Check if hsv.s > 0.5 is really necessary
                let alpha: Float = (hsv.h > minHueAngle && hsv.h < maxHueAngle && hsv.s > 0.5) ? 0 : 1.0

                // CIColorCube expects premultiplied alpha, so the RGB
                // components are multiplied by alpha as well
                cubeData[offset] = rgb[0] * alpha
                cubeData[offset + 1] = rgb[1] * alpha
                cubeData[offset + 2] = rgb[2] * alpha
                cubeData[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let data = cubeData.withUnsafeBufferPointer { Data(buffer: $0) } as NSData

    let colorCube = CIFilter(name: "CIColorCube", withInputParameters: [
        "inputCubeDimension": size,
        "inputCubeData": data
    ])
    return colorCube!
}
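To sanity-check which colors the cube will knock out, the hue-band test in the loop above can be exercised off-device. Since UIColor.getHue is UIKit-only, the sketch below uses a standalone RGB-to-HSV conversion (an assumption of mine, written to mirror what getHue returns, with h, s and v all in 0...1):

```swift
import Foundation

// Standalone RGB -> HSV conversion (a sketch that mirrors UIColor.getHue's
// 0...1 ranges, so the cube's hue math can be unit-tested without UIKit).
func rgbToHSV(r: Float, g: Float, b: Float) -> (h: Float, s: Float, v: Float) {
    let maxC = max(r, g, b)
    let minC = min(r, g, b)
    let delta = maxC - minC
    var h: Float = 0
    if delta > 0 {
        if maxC == r {
            h = (g - b) / delta
        } else if maxC == g {
            h = 2 + (b - r) / delta
        } else {
            h = 4 + (r - g) / delta
        }
        h /= 6
        if h < 0 { h += 1 }
    }
    let s = maxC == 0 ? 0 : delta / maxC
    return (h, s, maxC)
}

// The same alpha decision the cube loop makes, for hueAngle = 120 (green):
let hueAngle: Float = 120, hueRange: Float = 20
let minHue = (hueAngle - hueRange / 2) / 360 // ≈ 0.306
let maxHue = (hueAngle + hueRange / 2) / 360 // ≈ 0.361

let green = rgbToHSV(r: 0, g: 1, b: 0) // h ≈ 0.333, i.e. 120°
let red = rgbToHSV(r: 1, g: 0, b: 0)   // h = 0

let greenKeyed = green.h > minHue && green.h < maxHue && green.s > 0.5 // true: keyed out
let redKeyed = red.h > minHue && red.h < maxHue && red.s > 0.5         // false: kept
print(greenKeyed, redKeyed)
```

A pure green pixel falls inside the 20° band around 120° and gets alpha 0; a red one does not, which matches the behavior the filter is supposed to produce.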

And then the code for applying the filter to the video, by modifying the function func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? that I wrote earlier:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)

    let filter = colorCubeFilterForChromaKey(hueAngle: 38)
    let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        let source = request.sourceImage
        filter.setValue(source, forKey: kCIInputImageKey)
        let output = filter.outputImage

        request.finish(with: output!, context: nil)
    })

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

This code should replace every pixel of each video frame with alpha = 0.0 if the pixel's color matches the background's hue range. But instead of getting transparent pixels, I am getting black ones, as can be seen in this image:

App screenshot #2

Now, although this is not the desired effect, it does not surprise me, since I know this is how iOS displays videos with an alpha channel. But here is the real problem - when displaying a normal video in an AVPlayer, one can add an AVPlayerLayer to a view and set pixelBufferAttributes on it to tell the player layer that we are using a transparent pixel buffer, like this:

let playerLayer = AVPlayerLayer(player: player)
playerLayer.bounds = view.bounds
playerLayer.position = view.center
playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]
view.layer.addSublayer(playerLayer)

This code gives us a video with a transparent background (GOOD!) but a fixed size and position (NOT GOOD...), as you can see in this screenshot:

App screenshot #3

I want to achieve the same effect, but on an SKVideoNode rather than on an AVPlayerLayer. However, I cannot find any way to set pixelBufferAttributes on an SKVideoNode, and setting up a player layer does not achieve the desired effect with ARKit, since its position is fixed.

Is there any solution to my problem, or is there another technique that achieves the same desired effect?

Best Answer

The solution turned out to be simple! All that is needed is to add the video as a child of an SKEffectNode and apply the filter to the SKEffectNode instead of to the video itself (the AVVideoComposition is not needed). Here is the code I used:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    // Create and configure a node for the anchor added to the view's session.
    let bialikVideoNode = videoNodeWith(resourceName: "Tsina_05", ofType: "mp4")
    bialikVideoNode.size = CGSize(width: kDizengofVideoWidth, height: kDizengofVideoHeight)
    bialikVideoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    // Make the video background transparent by wrapping the node in an
    // SKEffectNode, since chroma-keying the video itself doesn't work
    let effectNode = SKEffectNode()
    effectNode.addChild(bialikVideoNode)
    effectNode.filter = colorCubeFilterForChromaKey(hueAngle: 120)

    return effectNode
}
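The helper videoNodeWith(resourceName:ofType:) is not shown in the answer; a minimal sketch of what it presumably does (load a bundled video file into an SKVideoNode and start playback), with error handling left as an assumption, might look like:

```swift
import SpriteKit
import AVFoundation

// Hypothetical reconstruction of the answer's helper: build an SKVideoNode
// from a video file bundled with the app.
func videoNodeWith(resourceName: String, ofType type: String) -> SKVideoNode {
    // Assumes the resource is present in the main bundle; adjust as needed.
    guard let url = Bundle.main.url(forResource: resourceName, withExtension: type) else {
        fatalError("Missing video resource: \(resourceName).\(type)")
    }
    let player = AVPlayer(url: url)
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.play() // an SKVideoNode does not start playback on its own
    return videoNode
}
```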

And here is the desired result: [result screenshot]

The original question and answer, "ios - ARKit/SpriteKit - Setting pixelBufferAttributes on an SKVideoNode, or another way to make pixels in a video transparent (chroma-key effect)", can be found on Stack Overflow: https://stackoverflow.com/questions/50139146/
