ios - Memory leak in CMSampleBufferGetImageBuffer

Reposted · Author: 塔克拉玛干 · Updated: 2023-11-02 09:39:57

Every N video frames I get a UIImage from a CMSampleBufferRef video buffer, e.g.:

- (void)imageFromVideoBuffer:(void(^)(UIImage *image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage fromRect:
                              CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
        __block UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CFRelease(sampleBuffer);

        if (completion) completion(image);

        return;
    }
    if (completion) completion(nil);
}

Xcode and Instruments detect a memory leak, but I can't get rid of it. I release the CGImageRef and the CMSampleBufferRef as usual:

CGImageRelease(cgImage);
CFRelease(sampleBuffer);
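The ownership rule those two calls rely on (every CFRetain or Create-function must be balanced by exactly one release) can be sketched in plain Swift using `Unmanaged`, which makes the retain/release pair explicit. `Tracker` here is a hypothetical stand-in for a buffer object, not part of the question's code:

```swift
import Foundation

// Hypothetical object standing in for a retained buffer.
final class Tracker {
    static var liveCount = 0
    init() { Tracker.liveCount += 1 }
    deinit { Tracker.liveCount -= 1 }
}

do {
    // +1 retain, analogous to CFRetain(sampleBuffer)
    let unmanaged = Unmanaged.passRetained(Tracker())
    print(Tracker.liveCount) // 1: kept alive by the retain
    unmanaged.release()      // the balancing release, like CFRelease
}
print(Tracker.liveCount)     // 0: no leak once the pair is balanced
```

If the release side is skipped, `liveCount` stays at 1 and the object leaks, which is exactly what Instruments would flag.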

[UPDATE] This is the AVCapture output callback where I get the sampleBuffer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _videoOutput) {
        _lastVideoBuffer.sampleBuffer = sampleBuffer;
        id<CIImageRenderer> imageRenderer = _CIImageRenderer;

        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *ciImage = nil;
                ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                if (_context == nil) {
                    _context = [CIContext contextWithOptions:nil];
                }
                CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                             fromRect:[ciImage extent]];
                //UIImage *image = [UIImage imageWithCGImage:processedCGImage];
                CGImageRelease(processedCGImage);
                NSLog(@"Captured image %@", ciImage);
            }
        });
    }
}

The leak comes from the createCGImage:fromRect: call:

CGImageRef processedCGImage = [_context createCGImage:ciImage
                                             fromRect:[ciImage extent]];

even with an @autoreleasepool, a CGImageRelease on the CGImage reference, and the CIContext held as an instance property.

This looks like the same issue addressed here: Can't save CIImage to file on iOS without memory leaks

[UPDATE] The leak seems to be due to a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?

A sample project shows how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip

The final comments confirm:

It looks like they fixed this in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

In the test code (Objective-C in this case):

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    if (error) return;

    __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg", (int)[NSDate date].timeIntervalSince1970]];

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^
    {
        @autoreleasepool
        {
            CIImage *enhancedImage = [CIImage imageWithData:imageData];

            if (!enhancedImage) return;

            static CIContext *ctx = nil;
            if (!ctx) ctx = [CIContext contextWithOptions:nil];

            CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil];

            UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];

            [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil];

            CGImageRelease(imageRef);
        }
    });
}];

And the workaround for iOS 9.0 (in Swift) should be:

extension CIContext {
    func createCGImage_(image: CIImage, fromRect: CGRect) -> CGImage {
        let width = Int(fromRect.width)
        let height = Int(fromRect.height)

        let rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4)
        render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
        let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) { info, data, size in
            UnsafeMutablePointer<UInt8>(data).dealloc(size)
        }
        return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)!
    }
}

Best Answer

We ran into a similar problem in an app we built, where we process each frame for feature keypoints with OpenCV and send a frame off every couple of seconds. After running for a while we would end up with quite a few memory pressure messages.

We managed to fix this by running our processing code in its own autorelease pool (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with added cropping):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {

        if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {

            NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];

            if (imageData) {
                [self processImageData:imageData];
            }

            self.lastFrameSentAt = [NSDate date];

            imageData = nil;
        }
    }
}
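The answer's two ideas, throttling the heavy work by time and draining per-frame temporaries in a dedicated autorelease pool, can be sketched in Swift. `handleFrame` and the `Data` payload are hypothetical stand-ins for the capture callback and for `jpegDataFromSampleBufferAndCrop:`:

```swift
import Foundation

let kContinuousRateInSeconds: TimeInterval = 1.0
var lastFrameSentAt = Date.distantPast
var processedFrames = 0

func handleFrame(at now: Date) {
    // Own pool per frame: temporaries are freed when the pool exits,
    // instead of piling up until the run loop drains its pool.
    autoreleasepool {
        // Same throttle as the Objective-C answer:
        // [self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds
        guard now.timeIntervalSince(lastFrameSentAt) >= kContinuousRateInSeconds else { return }
        let imageData = Data(count: 1024) // stand-in for the JPEG data
        processedFrames += imageData.isEmpty ? 0 : 1
        lastFrameSentAt = now
    }
}

// Simulate 10 frames arriving at 0.5 s intervals: with a 1 s throttle,
// only every other frame is processed.
let start = Date()
for i in 0..<10 { handleFrame(at: start.addingTimeInterval(Double(i) * 0.5)) }
print(processedFrames) // 5
```

The key design point is that the pool wraps the entire per-frame body, so even frames that are skipped by the throttle release whatever they allocated before returning.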

Regarding ios - Memory leak in CMSampleBufferGetImageBuffer, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32685756/
