
ios - Setting grayscale output for AVCaptureDevice in iOS

Reposted · Author: IT王子 · Updated: 2023-10-29 05:37:10

I want to implement a custom camera in my app, so I am creating it with AVCaptureDevice.

Now I want my custom camera to show only grayscale output, so I am trying to achieve it with setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:, following Apple's AVCamManual sample (Extending AVCam to Use Manual Capture).

- (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains
{
    NSError *error = nil;

    if ( [videoDevice lockForConfiguration:&error] ) {
        // Conversion can yield out-of-bound values; cap them to the device limits.
        AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains];
        [videoDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
        [videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}

But for this I have to pass RGB gain values between 1 and 4, so I created this method to clamp them to the MIN and MAX values.

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)gains
{
    AVCaptureWhiteBalanceGains g = gains;

    g.redGain = MAX( 1.0, g.redGain );
    g.greenGain = MAX( 1.0, g.greenGain );
    g.blueGain = MAX( 1.0, g.blueGain );

    g.redGain = MIN( videoDevice.maxWhiteBalanceGain, g.redGain );
    g.greenGain = MIN( videoDevice.maxWhiteBalanceGain, g.greenGain );
    g.blueGain = MIN( videoDevice.maxWhiteBalanceGain, g.blueGain );

    return g;
}
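The clamping above can be sketched as a standalone function (a sketch in current Swift syntax; `clampGain` is an illustrative name, not an AVFoundation API — on a real device the upper bound comes from `videoDevice.maxWhiteBalanceGain`):

```swift
// Clamp a white-balance gain into the valid range [1.0, maxGain],
// mirroring the MAX/MIN pair used in normalizedGains: above.
func clampGain(_ gain: Float, maxGain: Float) -> Float {
    return min(maxGain, max(1.0, gain))
}

// Many devices report a maxWhiteBalanceGain of about 4.0.
let clampedHigh = clampGain(5.2, maxGain: 4.0)  // capped to 4.0
let clampedLow = clampGain(0.3, maxGain: 4.0)   // raised to 1.0
let unchanged = clampGain(2.5, maxGain: 4.0)    // left at 2.5
```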

I am also trying to get different effects, for example by passing static RGB gain values.

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)gains
{
    AVCaptureWhiteBalanceGains g = gains;
    g.redGain = 3;
    g.greenGain = 2;
    g.blueGain = 1;
    return g;
}

Now I want to set this grayscale on my custom camera (formula: pixel = 0.30078125f * R + 0.5859375f * G + 0.11328125f * B). I have tried the following for this formula.

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)gains
{
    AVCaptureWhiteBalanceGains g = gains;

    g.redGain = g.redGain * 0.30078125;
    g.greenGain = g.greenGain * 0.5859375;
    g.blueGain = g.blueGain * 0.11328125;

    float grayScale = g.redGain + g.greenGain + g.blueGain;

    g.redGain = MAX( 1.0, grayScale );
    g.greenGain = MAX( 1.0, grayScale );
    g.blueGain = MAX( 1.0, grayScale );

    g.redGain = MIN( videoDevice.maxWhiteBalanceGain, g.redGain );
    g.greenGain = MIN( videoDevice.maxWhiteBalanceGain, g.greenGain );
    g.blueGain = MIN( videoDevice.maxWhiteBalanceGain, g.blueGain );

    return g;
}
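Note that the formula in the question is a per-pixel luminance weighting (the weights are binary-friendly approximations of the Rec. 601 coefficients and sum to exactly 1), so it operates on pixel values, not on white-balance gains. As a pure function it looks like this (a sketch in current Swift syntax, independent of AVFoundation):

```swift
// Per-pixel luminance using the weights from the question:
// pixel = 0.30078125 * R + 0.5859375 * G + 0.11328125 * B
// The weights sum to exactly 1.0 (77/256 + 150/256 + 29/256).
func luminance(r: Float, g: Float, b: Float) -> Float {
    return 0.30078125 * r + 0.5859375 * g + 0.11328125 * b
}

// Because the weights sum to 1, a neutral gray maps to itself:
let gray = luminance(r: 0.5, g: 0.5, b: 0.5)  // 0.5
// Pure green contributes the most to perceived brightness:
let green = luminance(r: 0.0, g: 1.0, b: 0.0)  // 0.5859375
```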

So how can I keep this value between 1 and 4?

Is there any method or scale for comparing these values?

Any help would be appreciated.

Best answer

Core Image provides a large number of filters for adjusting images on the GPU, and they can be used efficiently with video data, whether from a camera input or a video file.

There is an article on objc.io showing how to do this. The examples there are in Objective-C, but the explanation should be clear enough to follow.

The basic steps are:

  1. Create an EAGLContext configured to use OpenGL ES 2.
  2. Create a GLKView to display the rendered output, using the EAGLContext.
  3. Create a CIContext, using the same EAGLContext.
  4. Create a CIFilter using the CIColorMonochrome Core Image filter.
  5. Create an AVCaptureSession with an AVCaptureVideoDataOutput.
  6. In the AVCaptureVideoDataOutput delegate method, convert the CMSampleBuffer to a CIImage, apply the CIFilter to the image, and draw the filtered image with the CIContext.

This pipeline keeps the video pixel buffers on the GPU all the way from the camera to the display, avoiding any transfer of the data to the CPU and so preserving real-time performance.

To save the filtered video, implement an AVAssetWriter and append the sample buffers in the same AVCaptureVideoDataOutput delegate where the filtering is done.

Here is an example in Swift.

Example on GitHub.

import UIKit
import GLKit
import AVFoundation

private let rotationTransform = CGAffineTransformMakeRotation(CGFloat(-M_PI * 0.5))

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    private var context: CIContext!
    private var targetRect: CGRect!
    private var session: AVCaptureSession!
    private var filter: CIFilter!

    @IBOutlet var glView: GLKView!

    override func prefersStatusBarHidden() -> Bool {
        return true
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)

        let whiteColor = CIColor(
            red: 1.0,
            green: 1.0,
            blue: 1.0
        )

        filter = CIFilter(
            name: "CIColorMonochrome",
            withInputParameters: [
                "inputColor" : whiteColor,
                "inputIntensity" : 1.0
            ]
        )

        // GL context

        let glContext = EAGLContext(
            API: .OpenGLES2
        )

        glView.context = glContext
        glView.enableSetNeedsDisplay = false

        context = CIContext(
            EAGLContext: glContext,
            options: [
                kCIContextOutputColorSpace: NSNull(),
                kCIContextWorkingColorSpace: NSNull(),
            ]
        )

        let screenSize = UIScreen.mainScreen().bounds.size
        let screenScale = UIScreen.mainScreen().scale

        targetRect = CGRect(
            x: 0,
            y: 0,
            width: screenSize.width * screenScale,
            height: screenSize.height * screenScale
        )

        // Setup capture session.

        let cameraDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

        let videoInput = try? AVCaptureDeviceInput(
            device: cameraDevice
        )

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: dispatch_get_main_queue())

        session = AVCaptureSession()
        session.beginConfiguration()
        session.addInput(videoInput)
        session.addOutput(videoOutput)
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }

        let originalImage = CIImage(
            CVPixelBuffer: pixelBuffer,
            options: [
                kCIImageColorSpace: NSNull()
            ]
        )

        let rotatedImage = originalImage.imageByApplyingTransform(rotationTransform)

        filter.setValue(rotatedImage, forKey: kCIInputImageKey)

        guard let filteredImage = filter.outputImage else {
            return
        }

        context.drawImage(filteredImage, inRect: targetRect, fromRect: filteredImage.extent)

        glView.display()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        print("dropped sample buffer: \(seconds)")
    }
}

Regarding "ios - Setting grayscale output for AVCaptureDevice in iOS", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/38122040/
