
image - How to get an NSImage from the CMSampleBuffer obtained from -captureStillImageAsynchronouslyFromConnection:completionHandler:?


I have a Cocoa application that is meant to capture still images from a USB microscope and then do some post-processing on them before saving them to an image file. At the moment, I'm stuck trying to get from the CMSampleBufferRef passed to my completionHandler block to an NSImage, or to some other representation I can manipulate and save with the familiar Cocoa APIs.
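For context, the still image is requested roughly like this (a minimal sketch; imageOutput is the AVCaptureStillImageOutput attached to my session, and the post-processing is elided):

AVCaptureConnection *connection = [imageOutput connectionWithMediaType:AVMediaTypeVideo];
[imageOutput captureStillImageAsynchronouslyFromConnection:connection
                                          completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    // sampleBuffer is the CMSampleBufferRef I need to turn into an NSImage.
}];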

I found the function imageFromSampleBuffer() in the AVFoundation documentation, which is intended to convert a CMSampleBuffer to a UIImage (sigh), and adapted it appropriately to return an NSImage instead. But it doesn't work in this case, because the call to CMSampleBufferGetImageBuffer() returns nil.
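For reference, the adapted helper has roughly this shape (a sketch based on Apple's imageFromSampleBuffer() example, not my exact code; it assumes the sample buffer wraps an uncompressed 32-bit BGRA pixel buffer, and it is the first call below that returns nil for me):

#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>

static NSImage *NSImageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    // Returns nil when the buffer holds compressed (e.g. JPEG) data rather than a pixel buffer.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL) return nil;

    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void  *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    NSImage *image = [[NSImage alloc] initWithCGImage:cgImage size:NSMakeSize(width, height)];
    CGImageRelease(cgImage);
    return image;
}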

Here is a log dump of the CMSampleBuffer passed to my completion block:

2012-01-21 19:38:36.293 LabCam[1402:cb0f] CMSampleBuffer 0x100335390 retainCount: 1 allocator: 0x7fff8c78620c
invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
buffer-level attachments:
com.apple.cmio.buffer_attachment.discontinuity_flags(P) = 0
com.apple.cmio.buffer_attachment.hosttime(P) = 79631546824089
com.apple.cmio.buffer_attachment.sequence_number(P) = 42
formatDescription = <CMVideoFormatDescription 0x100335220 [0x7fff782fff40]> {
mediaType:'vide'
mediaSubType:'jpeg'
mediaSpecific: {
codecType: 'jpeg' dimensions: 640 x 480
}
extensions: {<CFBasicHash 0x100335160 [0x7fff782fff40]>{type = immutable dict, count = 5,
entries =>
1 : <CFString 0x7fff773dff48 [0x7fff782fff40]>{contents = "Version"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
2 : <CFString 0x7fff773dff68 [0x7fff782fff40]>{contents = "RevisionLevel"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
3 : <CFString 0x7fff7781ab08 [0x7fff782fff40]>{contents = "CVFieldCount"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
4 : <CFString 0x7fff773dfdc8 [0x7fff782fff40]>{contents = "FormatName"} = <CFString 0x7fff76d35fb0 [0x7fff782fff40]>{contents = "Photo - JPEG"}
5 : <CFString 0x7fff773dff88 [0x7fff782fff40]>{contents = "Vendor"} = <CFString 0x7fff773dffa8 [0x7fff782fff40]>{contents = "appl"}
}
}
}
sbufToTrackReadiness = 0x0
numSamples = 1
sampleTimingArray[1] = {
{PTS = {2388943236/30000 = 79631.441, rounded}, DTS = {INVALID}, duration = {3698/30000 = 0.123}},
}
sampleSizeArray[1] = {
sampleSize = 55911,
}
dataBuffer = 0x100335300

It obviously contains JPEG data, but how do I get at it? (Preferably while keeping the associated metadata...)

Best Answer

I eventually got this working with the help of another code example. It turns out CMSampleBufferGetImageBuffer() only returns a valid result for the uncompressed, camera-native image formats. So to make my program work, I had to configure the AVCaptureStillImageOutput instance to use k32BGRAPixelFormat instead of its default (JPEG) compressed format.

session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
imageOutput = [[AVCaptureStillImageOutput alloc] init];
// Configure imageOutput for BGRA pixel format [#2].
NSNumber * pixelFormat = [NSNumber numberWithInt:k32BGRAPixelFormat];
[imageOutput setOutputSettings:[NSDictionary dictionaryWithObject:pixelFormat
                                                            forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[session addOutput:imageOutput];
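
With the output configured this way, the sample buffer handed to the completion handler wraps an uncompressed BGRA pixel buffer, so CMSampleBufferGetImageBuffer() returns a usable CVPixelBuffer and a conversion helper like the one sketched in the question works. Roughly (NSImageFromSampleBuffer being that hypothetical helper, not an AVFoundation API):

AVCaptureConnection *connection = [imageOutput connectionWithMediaType:AVMediaTypeVideo];
[imageOutput captureStillImageAsynchronouslyFromConnection:connection
                                          completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (sampleBuffer == NULL) return;                        // capture failed; inspect error
    NSImage *image = NSImageFromSampleBuffer(sampleBuffer);  // succeeds now that the data is BGRA
    // ... post-process and save the image ...
}];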

Regarding image - How to get an NSImage from the CMSampleBuffer obtained from -captureStillImageAsynchronouslyFromConnection:completionHandler:?, see the original question on Stack Overflow: https://stackoverflow.com/questions/8958869/
