
ios - Integrating the ZBar reader into a project: creating a custom view controller

Reposted. Author: 行者123. Updated: 2023-11-29 02:10:18

I have added the code below to my project. It works fine: from the current view it creates and presents an instance of ZBarReaderViewController.

However, I would like to define a custom region of the current view controller and display the ZBarReaderViewController inside that region, while still showing my "previous/other" view. The code below presents the view controller full-screen instead.

In Interface Builder I can only add a UIView inside the existing view controller, so I cannot associate a custom view region with a ZBarReaderViewController.

The only thing I can do is associate it with a ZBarReaderView instance. But since ZBarReaderViewController is closed source (I can only see the header files in the ZBar reader project I am using), I cannot modify that behavior.

How can I solve this?

- (IBAction)startScanning:(id)sender {

NSLog(@"Scanning..");
resultTextView.text = @"Scanning..";

ZBarReaderViewController *codeReader = [ZBarReaderViewController new];
codeReader.readerDelegate=self;
codeReader.supportedOrientationsMask = ZBarOrientationMaskAll;

ZBarImageScanner *scanner = codeReader.scanner;
[scanner setSymbology: ZBAR_I25 config: ZBAR_CFG_ENABLE to: 0]; // disable interleaved 2-of-5 to reduce false positives

[self presentViewController:codeReader animated:YES completion:nil];
}
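(As an aside: the ZBarReaderView class mentioned above can in fact be embedded directly in a subview, without going through the full-screen ZBarReaderViewController. A minimal, untested sketch, assuming a `previewContainer` UIView outlet already wired up in the view controller and that the controller adopts ZBarReaderViewDelegate:)

// Sketch: embed a ZBarReaderView inside a custom subview instead of
// presenting the full-screen ZBarReaderViewController.
// `previewContainer` is an assumed IBOutlet UIView.
ZBarReaderView *readerView = [[ZBarReaderView alloc] init];
readerView.frame = self.previewContainer.bounds;
readerView.readerDelegate = self;
[self.previewContainer addSubview:readerView];
[readerView start];

// The embedded reader reports results through the delegate method:
// - (void)readerView:(ZBarReaderView *)readerView
//     didReadSymbols:(ZBarSymbolSet *)symbols
//          fromImage:(UIImage *)image { ... }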

Best answer

Here is an example of a scanner view controller. Note that it uses the native AVFoundation metadata-scanning API rather than ZBar, which makes it easy to confine the camera preview to any subview. I used a storyboard to create the view, but you can also build it programmatically or with a regular nib.

First, create your view (in the storyboard, say) and place a UIView inside it where you want the scanner preview to appear.

Now, let's look at the view controller (note the comments throughout):

#import <AVFoundation/AVFoundation.h>
#import "ScannerViewController.h"

@interface ScannerViewController () <AVCaptureMetadataOutputObjectsDelegate>

// UI
@property (weak, nonatomic) IBOutlet UIView *viewPreview; // Connect it to the view you created in the storyboard, for the scanner preview

// Video
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoPreviewLayer;
@property (nonatomic, strong) AVAudioPlayer *audioPlayer;
@property (nonatomic, strong) AVCaptureSession *flashLightSession;
@property (nonatomic) BOOL isReading;

@end

@implementation ScannerViewController

- (void)viewDidLoad
{
[super viewDidLoad];

// Initially make the captureSession object nil.
_captureSession = nil;

// Set the initial value of the flag to NO.
_isReading = NO;
}
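The `_audioPlayer` property is played in the delegate callback below but is never loaded anywhere in the answer. One way to initialize it, as a sketch (the "beep.mp3" bundled-resource name is an assumption), would be to add this to viewDidLoad:

// Sketch: load the scan sound effect played by the delegate callback.
// Assumes a "beep.mp3" file is bundled with the app (filename is an assumption).
NSString *beepPath = [[NSBundle mainBundle] pathForResource:@"beep" ofType:@"mp3"];
if (beepPath) {
    NSURL *beepURL = [NSURL fileURLWithPath:beepPath];
    NSError *audioError = nil;
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:beepURL
                                                          error:&audioError];
    [_audioPlayer prepareToPlay]; // preload so playback starts without delay
}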

- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
}

- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];

[self startStopReading:nil];
}

- (IBAction)startStopReading:(id)sender
{
if (!_isReading) {
[self startReading];
}
else {
// In this case the app is currently reading a QR code and it should stop doing so.
[self stopReading];

}

// Toggle the flag.
_isReading = !_isReading;
}

#pragma mark - Private

- (BOOL)startReading
{
NSError *error;

// Get an instance of the AVCaptureDevice class to initialize a device object and provide the video
// as the media type parameter.
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Get an instance of the AVCaptureDeviceInput class using the previous device object.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];

if (!input) {
// If any error occurs, simply log the description of it and don't continue any more.
NSLog(@"%@", [error localizedDescription]);
return NO;
}

// Initialize the captureSession object.
_captureSession = [[AVCaptureSession alloc] init];
// Set the input device on the capture session.
[_captureSession addInput:input];

// Initialize a AVCaptureMetadataOutput object and set it as the output device to the capture session.
AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
[_captureSession addOutput:captureMetadataOutput];

// Create a new serial dispatch queue.
dispatch_queue_t dispatchQueue;
dispatchQueue = dispatch_queue_create("myQueue", NULL);
[captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
[captureMetadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]]; // Add all the types you need, currently it is just QR code

// Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
_videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
[_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
[_viewPreview.layer addSublayer:_videoPreviewLayer];

// Start video capture.
[_captureSession startRunning];

return YES;
}

- (void)stopReading
{
// Stop video capture and make the capture session object nil.
[_captureSession stopRunning];
_captureSession = nil;

// Remove the video preview layer from the viewPreview view's layer.
[_videoPreviewLayer removeFromSuperlayer];
}

#pragma mark - AVCaptureMetadataOutputObjectsDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
// Check if the metadataObjects array is not nil and it contains at least one object.
if (metadataObjects != nil && [metadataObjects count] > 0) {

[self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];

_isReading = NO;

// If the audio player is not nil, then play the sound effect.
if (_audioPlayer) {
[_audioPlayer play];
}

// This was my result, but you can search the metadataObjects array for what you need exactly
NSString *code = [(AVMetadataMachineReadableCodeObject *)[metadataObjects objectAtIndex:0] stringValue];

}

}

@end
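One detail worth emphasizing: the delegate method above is invoked on the serial dispatch queue created in startReading, not on the main thread (which is why stopReading is dispatched via performSelectorOnMainThread:). Any UI work with the scanned string must hop back to the main queue as well. A small sketch, reusing the `resultTextView` outlet name from the question's code (an assumption for this view controller):

// Sketch: hand the scanned string to the UI from the metadata delegate.
dispatch_async(dispatch_get_main_queue(), ^{
    self.resultTextView.text = code; // UIKit must be touched on the main thread
});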

Regarding "ios - Integrating the ZBar reader into a project: creating a custom view controller", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/29344947/
