
ios - Capturing a Metal MTKView as a movie in real time?

Repost · Author: 技术小花猫 · Updated: 2023-10-29 11:11:58

What is the most efficient way to capture frames from an MTKView? If possible, I'd like to save a .mov file from the frames in real time. Is it possible to render into an AVPlayer frame or something like that?

I'm currently drawing with this code (based on @warrenm's PerformanceShaders project):

func draw(in view: MTKView) {
    _ = inflightSemaphore.wait(timeout: .distantFuture)
    updateBuffers()

    let commandBuffer = commandQueue.makeCommandBuffer()!

    commandBuffer.addCompletedHandler { [weak self] commandBuffer in
        if let strongSelf = self {
            strongSelf.inflightSemaphore.signal()
        }
    }

    // Dispatch the current kernel to perform the selected image filter
    selectedKernel.encode(commandBuffer: commandBuffer,
                          sourceTexture: kernelSourceTexture!,
                          destinationTexture: kernelDestTexture!)

    if let renderPassDescriptor = view.currentRenderPassDescriptor, let currentDrawable = view.currentDrawable {
        let clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
        renderPassDescriptor.colorAttachments[0].clearColor = clearColor

        let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)!
        renderEncoder.label = "Main pass"

        renderEncoder.pushDebugGroup("Draw textured square")
        renderEncoder.setFrontFacing(.counterClockwise)
        renderEncoder.setCullMode(.back)

        renderEncoder.setRenderPipelineState(pipelineState)
        renderEncoder.setVertexBuffer(vertexBuffer, offset: MBEVertexDataSize * bufferIndex, index: 0)
        renderEncoder.setVertexBuffer(uniformBuffer, offset: MBEUniformDataSize * bufferIndex, index: 1)
        renderEncoder.setFragmentTexture(kernelDestTexture, index: 0)
        renderEncoder.setFragmentSamplerState(sampler, index: 0)
        renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)

        renderEncoder.popDebugGroup()
        renderEncoder.endEncoding()

        commandBuffer.present(currentDrawable)
    }

    bufferIndex = (bufferIndex + 1) % MBEMaxInflightBuffers

    commandBuffer.commit()
}

Best Answer

Here is a small class that performs the basic work of writing out a movie file capturing the contents of a Metal view:

class MetalVideoRecorder {
    var isRecording = false
    var recordingStartTime = TimeInterval(0)

    private var assetWriter: AVAssetWriter
    private var assetWriterVideoInput: AVAssetWriterInput
    private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor

    init?(outputURL url: URL, size: CGSize) {
        do {
            assetWriter = try AVAssetWriter(outputURL: url, fileType: .m4v)
        } catch {
            return nil
        }

        let outputSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecType.h264,
                                             AVVideoWidthKey: size.width,
                                             AVVideoHeightKey: size.height]

        assetWriterVideoInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
        assetWriterVideoInput.expectsMediaDataInRealTime = true

        let sourcePixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: size.width,
            kCVPixelBufferHeightKey as String: size.height]

        assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: assetWriterVideoInput,
                                                                           sourcePixelBufferAttributes: sourcePixelBufferAttributes)

        assetWriter.add(assetWriterVideoInput)
    }

    func startRecording() {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: .zero)

        recordingStartTime = CACurrentMediaTime()
        isRecording = true
    }

    func endRecording(_ completionHandler: @escaping () -> ()) {
        isRecording = false

        assetWriterVideoInput.markAsFinished()
        assetWriter.finishWriting(completionHandler: completionHandler)
    }

    func writeFrame(forTexture texture: MTLTexture) {
        if !isRecording {
            return
        }

        // Spin until the input can accept another sample; with
        // expectsMediaDataInRealTime set, this should rarely block for long
        while !assetWriterVideoInput.isReadyForMoreMediaData {}

        guard let pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool else {
            print("Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
            return
        }

        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        if status != kCVReturnSuccess {
            print("Could not get pixel buffer from asset writer input; dropping frame...")
            return
        }

        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

        // Use the bytes-per-row value from the pixel buffer, since its stride
        // may be rounded up to be 16-byte aligned
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)

        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let frameTime = CACurrentMediaTime() - recordingStartTime
        let presentationTime = CMTimeMakeWithSeconds(frameTime, preferredTimescale: 240)
        assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    }
}

After initializing one of these and calling startRecording(), you can add a completed handler to the command buffer containing your rendering commands and call writeFrame (after you end encoding, but before presenting your drawable or committing the buffer):

let texture = currentDrawable.texture
commandBuffer.addCompletedHandler { commandBuffer in
    self.recorder.writeFrame(forTexture: texture)
}

When you're done recording, just call endRecording, and the movie file will be finalized and closed.
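The completion handler is a convenient place to do something with the finished file. A minimal sketch, assuming a `recorder` property and an `outputURL` of your choosing (both hypothetical names), that registers the movie with the photo library (this requires the NSPhotoLibraryAddUsageDescription key in Info.plist):

```swift
import Photos

// Hypothetical helper; `recorder` and `outputURL` come from your own setup.
func stopRecordingAndSave(outputURL: URL) {
    recorder.endRecording {
        PHPhotoLibrary.shared().performChanges({
            // Register the finished movie file with the user's photo library
            _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL)
        }, completionHandler: { success, error in
            print(success ? "Saved" : "Save failed: \(String(describing: error))")
        })
    }
}
```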

Caveats:

This class assumes the source texture is in the default format, .bgra8Unorm. If it isn't, you'll get crashes or corruption. If necessary, convert the texture with a compute or fragment shader, or use Accelerate.
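Relatedly, the drawable's texture can only be read back with getBytes if the view allows CPU access to it. A configuration sketch, assuming a `metalView` property (both properties below are standard on MTKView/CAMetalLayer):

```swift
// Match the pixel format the recorder assumes, and allow CPU readback:
// getBytes(...) cannot read a framebuffer-only drawable texture.
metalView.colorPixelFormat = .bgra8Unorm
metalView.framebufferOnly = false
```

Note that disabling framebufferOnly can cost some rendering performance, so only do it while you actually need readback.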

This class also assumes that the texture is the same size as the video frame. If that isn't the case (if the drawable size changes, or the screen auto-rotates), the output will be corrupted, and you may see crashes. Mitigate this by scaling or cropping the source texture as your application requires.
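One simple mitigation is to finish the current file and start a new recorder whenever the drawable size changes. A sketch using the MTKViewDelegate size-change callback, where `recorder` and `makeRecorder(size:)` are hypothetical names for your own recorder property and factory:

```swift
// Sketch: react to drawable size changes by rolling over to a new file,
// so the writer's frame size always matches the texture being captured.
func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
    guard recorder?.isRecording == true else { return }
    recorder?.endRecording { }              // finalize the current file
    recorder = makeRecorder(size: size)     // hypothetical factory for a new recorder
    recorder?.startRecording()
}
```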

Regarding "ios - Capturing a Metal MTKView as a movie in real time?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/43838089/
