I want to extract frames from the live feed of an AVCaptureSession, and I'm using Apple's AVCam sample as a test case. Here is the link to AVCam:
https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
I've found that captureOutput:didOutputSampleBuffer:fromConnection is never called, and I'd like to know why, or what I'm doing wrong.
Here's what I've done:
(1) I made AVCamViewController the delegate:
@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
(2) I created an AVCaptureVideoDataOutput object and added it to the session:
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}
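Note that a video data output only calls captureOutput:didOutputSampleBuffer:fromConnection once a delegate and queue have been registered via setSampleBufferDelegate:queue: (the full viewDidLoad listing below does this). A minimal sketch of that registration, with an illustrative queue name:

// Without this registration the data output has nowhere to deliver frames.
dispatch_queue_t videoDataQueue = dispatch_queue_create("video data queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoDataQueue];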
(3) I added the delegate method and tested it by logging a random string:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
NSLog(@"I am called");
}
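For reference, once the callback fires, one common way to get at the frame data is to pull the pixel buffer out of the sample buffer. This is an illustrative sketch (not part of the original test), and it assumes CoreImage is available:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // The pixel buffer backing this video frame.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Wrap it in a CIImage; a UIImage or CGImage can be rendered from it if needed.
    CIImage *frame = [CIImage imageWithCVPixelBuffer:imageBuffer];
    NSLog(@"Got a frame: %@", NSStringFromCGRect([frame extent]));
}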
The test app works, but captureOutput:didOutputSampleBuffer:fromConnection is never called.
(4) I read that the session variable in AVCaptureSession *session = [[AVCaptureSession alloc] init]; being local to viewDidLoad could be a possible reason the delegate isn't called, so I made it an instance variable of the AVCamViewController class, but it still isn't called.
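For reference, holding the session in a strong property (AVCam itself does this via the setSession: accessor used below) looks roughly like this:

@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
// A strong reference keeps the session (and the outputs it retains) alive after viewDidLoad returns.
@property (nonatomic, strong) AVCaptureSession *session;
@end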
Here is the viewDidLoad method I'm testing (taken from AVCam); I added the AVCaptureVideoDataOutput at the end of the method:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Setup the preview view
    [[self previewView] setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        NSError *error = nil;

        AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];

            dispatch_async(dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
                [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
            });
        }

        AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
                [connection setEnablesVideoStabilizationWhenAvailable:YES];
            [self setMovieFileOutput:movieFileOutput];
        }

        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ([session canAddOutput:stillImageOutput])
        {
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            [session addOutput:stillImageOutput];
            [self setStillImageOutput:stillImageOutput];
        }

        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
        if ([session canAddOutput:videoDataOutput])
        {
            NSLog(@"Yes I can add it");
            [session addOutput:videoDataOutput];
        }
    });
}
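One detail worth noting in the tail of this method (an observation, not part of the original question): the sample buffer delegate is registered on sessionQueue, the same serial queue used for blocking calls such as startRunning. A common alternative is to give the data output its own serial queue and let it drop late frames; a sketch, with an illustrative queue name:

AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
// Drop frames that arrive while the delegate is still processing the previous one.
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
// A dedicated serial queue keeps frame callbacks off the blocking session queue.
dispatch_queue_t frameQueue = dispatch_queue_create("frame queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}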
- (void)viewWillAppear:(BOOL)animated
{
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
        [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak AVCamViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            AVCamViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restarting the session since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
                [[strongSelf recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
            });
        }]];

        [[self session] startRunning];
    });
}
Can anyone tell me why this happens and suggest a fix?
Best Answer
I've experimented a lot with this, and I think I may have found the answer. I have similar but different code, written from scratch rather than copied from Apple's samples (which are a bit dated now).
I think the culprit is this section...
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput])
{
    [session addOutput:movieFileOutput];
    AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported])
        [connection setEnablesVideoStabilizationWhenAvailable:YES];
    [self setMovieFileOutput:movieFileOutput];
}
Based on my experiments, this is what's causing your problem. In my code, captureOutput:didOutputSampleBuffer:fromConnection is not called while this block is present. I believe the video system either hands you a series of sample buffers or records a compressed, optimized movie file to disk, but not both. (At least on iOS.) I suppose that makes sense and shouldn't be surprising, but I haven't seen it documented anywhere!
Also, at one point I seemed to get errors, and/or the buffer callback would stop firing, when I had the microphone on. Again undocumented; the errors were -11800 (unknown error). But I can't always reproduce that.
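In other words, if the goal is to receive sample buffers, the session apparently has to be configured without the movie file output. A minimal sketch of such a configuration, under this answer's assumption that the two outputs are mutually exclusive on iOS:

AVCaptureSession *session = [[AVCaptureSession alloc] init];

NSError *error = nil;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (videoInput && [session canAddInput:videoInput])
{
    [session addInput:videoInput];
}

// Video data output only; deliberately no AVCaptureMovieFileOutput alongside it.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t frameQueue = dispatch_queue_create("frame queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}

[session startRunning]; // delegate callbacks should begin once the session is running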
Regarding "iOS: captureOutput:didOutputSampleBuffer:fromConnection is not called", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/25110055/