
ios - CIContext + createCGImage + UIImage crash


What I'm doing:

I'm taking the CMSampleBuffer from AVFoundation's didOutputSampleBuffer, running a few filters on it, and outputting the result as a UIImage every time the delegate delivers a buffer.
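(For context, the data output is presumably wired up roughly like this; captureSession and the queue name are illustrative, not taken from the original code:)

let videoOutput = AVCaptureVideoDataOutput()
// Sample buffers arrive on this dedicated serial queue, not the main queue.
let videoQueue = dispatch_queue_create("video.output.queue", DISPATCH_QUEUE_SERIAL)
videoOutput.setSampleBufferDelegate(self, queue: videoQueue)
if captureSession.canAddOutput(videoOutput) {
    captureSession.addOutput(videoOutput)
}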

What's working:

All the filters work correctly and produce the output I want. On newer phones (iPhone 6/6s/7) everything runs fine; however, on an iPhone 5s it freezes after a few seconds.

Filters and UIImage output:

let inputImage = self.bufferImage!

// Pixellate the frame...
let filter = CIFilter(name: "CIPixellate")
let beginImage = inputImage
filter!.setValue(beginImage, forKey: kCIInputImageKey)

// ...tint it red with a monochrome filter...
let filter3 = CIFilter(name: "CIColorMonochrome")
filter3!.setValue(filter!.outputImage, forKey: kCIInputImageKey)
filter3!.setValue(CIColor(red: 1, green: 0, blue: 0), forKey: kCIInputColorKey)
filter3!.setValue(200.0, forKey: kCIInputIntensityKey)

// ...and multiply-blend the result back over the original frame.
let filter2 = CIFilter(name: "CIMultiplyBlendMode")
filter2!.setValue(filter3!.outputImage, forKey: kCIInputImageKey)
filter2!.setValue(inputImage, forKey: kCIInputBackgroundImageKey)
let output2 = filter2!.outputImage

// Render to a CGImage, then block on the main queue to update the UI.
let cgimg = self.context.createCGImage(output2!, fromRect: output2!.extent)
let newImage = UIImage(CGImage: cgimg!)
dispatch_sync(dispatch_get_main_queue()) {
    self.imageView?.image = newImage
}
self.context.clearCaches()

I create the CIContext like this:

let context = CIContext(options: nil)

I've also tried forcing the CIContext to render on the hardware, and vice versa.
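Switching between the two comes down to the kCIContextUseSoftwareRenderer option at context creation; a minimal sketch (true forces the CPU renderer, false lets Core Image use the GPU):

let gpuContext = CIContext(options: [kCIContextUseSoftwareRenderer: false]) // GPU renderer
let cpuContext = CIContext(options: [kCIContextUseSoftwareRenderer: true])  // software (CPU) renderer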

My feeling is that it's running out of memory, leaking, or something along those lines, although when it freezes there is no error in Xcode; the app is simply stuck. I added self.context.clearCaches() at the end, but it didn't really change the original problem.

This only happens on the slower device, the 5s in this case; on the 6/6s/7 it runs smoothly without any issues.

My full didOutputSampleBuffer, for reference:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    connection.videoOrientation = .Portrait
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)

    // Lock the pixel buffer and build a CGImage from its raw bytes.
    CVPixelBufferLockBaseAddress(imageBuffer!, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!)
    let width = CVPixelBufferGetWidth(imageBuffer!)
    let height = CVPixelBufferGetHeight(imageBuffer!)
    let colorSpace = CGColorSpaceCreateDeviceRGB()

    let bitmap = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
    let context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                        bytesPerRow, colorSpace, bitmap.rawValue)

    let quartzImage = CGBitmapContextCreateImage(context!)

    CVPixelBufferUnlockBaseAddress(imageBuffer!, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    // Wrap the CGImage as the CIImage the filter chain starts from.
    self.bufferImage = CIImage(CGImage: quartzImage!)

    let inputImage = self.bufferImage!

    // Pixellate the frame...
    let filter = CIFilter(name: "CIPixellate")
    let beginImage = inputImage
    filter!.setValue(beginImage, forKey: kCIInputImageKey)

    // ...tint it red with a monochrome filter...
    let filter3 = CIFilter(name: "CIColorMonochrome")
    filter3!.setValue(filter!.outputImage, forKey: kCIInputImageKey)
    filter3!.setValue(CIColor(red: 1, green: 0, blue: 0), forKey: kCIInputColorKey)
    filter3!.setValue(200.0, forKey: kCIInputIntensityKey)

    // ...and multiply-blend the result back over the original frame.
    let filter2 = CIFilter(name: "CIMultiplyBlendMode")
    filter2!.setValue(filter3!.outputImage, forKey: kCIInputImageKey)
    filter2!.setValue(inputImage, forKey: kCIInputBackgroundImageKey)
    let output2 = filter2!.outputImage

    // Render to a CGImage and push it to the image view on the main queue.
    let cgimg = self.context.createCGImage(output2!, fromRect: output2!.extent)
    let newImage = UIImage(CGImage: cgimg!)

    dispatch_async(dispatch_get_main_queue()) {
        self.imageView?.image = newImage
    }
    self.context.clearCaches()
}

Update

I was able to fix the freeze by changing how the pixel buffer is turned into a CIImage:

let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
self.bufferImage = CIImage(CVPixelBuffer: pixelBuffer)

and got rid of most of the code at the start of didOutputSampleBuffer.

However, CPU usage is now very high, and Xcode shows the "Energy Impact" as high!

Best answer

You are saying:

dispatch_sync(dispatch_get_main_queue()) {
    self.imageView?.image = newImage
}

There is no reason for you to wait for the result of this call. Use dispatch_async instead.

(Even better: find out whether you are already on the main thread. If you are, don't dispatch anything at all.)
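A minimal sketch of that suggestion against the code from the question (NSThread.isMainThread() is the check; the other names match the question's code):

if NSThread.isMainThread() {
    // Already on the main thread, so update the view directly.
    self.imageView?.image = newImage
} else {
    // Otherwise hand the UI update off without blocking the capture queue.
    dispatch_async(dispatch_get_main_queue()) {
        self.imageView?.image = newImage
    }
}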

Regarding ios - CIContext + createCGImage + UIImage crash, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/40332403/
