
ios - CIContext drawImage causes EXC_BAD_ACCESS - iOS 6


I am trying to apply a simple Core Image filter to the live camera input. I think my code is OK, but calling drawImage:inRect:fromRect inside the captureOutput method causes an EXC_BAD_ACCESS, or a `[__NSCFNumber drawImage:inRect:fromRect:]: unrecognized selector` crash, which makes me think my context has been deallocated by the time I call drawImage on it. That makes no sense to me, since my CIContext is a class member.

The problem does not seem to come from OpenGL, because I tried a plain context (not created from an EAGLContext) and ran into the same issue.

I am testing this on an iPhone 5 running iOS 6, since the camera does not work in the simulator.

Can you help me? Thank you very much for your time.

Here is my .h file:

<!-- language: lang-objc -->

// CameraController.h

#import <UIKit/UIKit.h>
#import <OpenGLES/EAGL.h>
#import <AVFoundation/AVFoundation.h>
#import <GLKit/GLKit.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>
#import <CoreImage/CoreImage.h>
#import <ImageIO/ImageIO.h>

@interface CameraController : GLKViewController <AVCaptureVideoDataOutputSampleBufferDelegate> {
    AVCaptureSession *avCaptureSession;
    CIContext *coreImageContext;
    CIContext *ciTestContext;
    GLuint _renderBuffer;
    EAGLContext *glContext;
}

@end

And my .m file:

<!-- language: lang-objc -->

// CameraController.m

#import "CameraController.h"

@interface CameraController ()

@end

@implementation CameraController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
    }
    return self;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Initialize an OpenGL ES 2 context
    glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    if (!glContext) {
        NSLog(@"Failed to create ES context");
    }
    [EAGLContext setCurrentContext:nil];

    // Get the GLKView, set its depth format to 24 bits,
    // and attach the OpenGL context created above
    GLKView *view = (GLKView *)self.view;
    view.context = glContext;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;

    // Create a CIContext from the EAGLContext
    NSMutableDictionary *options = [[NSMutableDictionary alloc] init];
    [options setObject:[NSNull null] forKey:kCIContextWorkingColorSpace];
    coreImageContext = [CIContext contextWithEAGLContext:glContext options:options];

    glGenRenderbuffers(1, &_renderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);

    // Initialize the video capture device
    NSError *error;
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

    // Initialize the video data output object and set its output settings
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES];
    [dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

    // Deliver sample buffers to this object, which implements the
    // AVCaptureVideoDataOutputSampleBufferDelegate protocol via captureOutput:
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    // Initialize the capture session, add input and output, start running
    avCaptureSession = [[AVCaptureSession alloc] init];
    [avCaptureSession beginConfiguration];
    [avCaptureSession setSessionPreset:AVCaptureSessionPreset1280x720];
    [avCaptureSession addInput:input];
    [avCaptureSession addOutput:dataOutput];
    [avCaptureSession commitConfiguration];
    [avCaptureSession startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // Create a CIImage from the sample buffer of the camera frame
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Create the relevant filter
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:0.8f] forKey:kCIInputIntensityKey];

    // Get a reference to the filter's output image
    CIImage *result = [filter valueForKey:kCIOutputImageKey];

    // Draw to the context
    [coreImageContext drawImage:result inRect:[result extent] fromRect:[result extent]];

    [glContext presentRenderbuffer:GL_RENDERBUFFER];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}


@end

Best Answer

In your viewDidLoad method, you have:

coreImageContext = [CIContext contextWithEAGLContext:glContext options:options];

contextWithEAGLContext:options: returns an autoreleased object. If you want to use coreImageContext in the captureOutput: method, you need to retain it; otherwise it is deallocated at the end of the current run loop cycle, its memory is reused (here by an NSNumber), and the next drawImage: call is sent to a dead object, which is exactly the unrecognized-selector crash you are seeing.
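A minimal sketch of the fix, assuming the asker's project uses manual reference counting (MRC); under ARC the assignment to a plain instance variable is already a strong reference, so no change would be needed there:

<!-- language: lang-objc -->

    // MRC: contextWithEAGLContext:options: returns an autoreleased object,
    // so take ownership explicitly:
    coreImageContext = [[CIContext contextWithEAGLContext:glContext options:options] retain];

    // ...and balance it when the controller goes away:
    - (void)dealloc {
        [coreImageContext release];
        [super dealloc];
    }

With the retain in place, coreImageContext stays alive across run loop cycles and the drawImage:inRect:fromRect: call in captureOutput: hits a valid CIContext instead of recycled memory.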

Regarding "ios - CIContext drawImage causes EXC_BAD_ACCESS - iOS 6", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/16843093/
