ios - GNAudioSourceMic raw audio location

I'm currently developing an app that uses the Gracenote Mobile Client to create fingerprints and identify the music I'm listening to. I've integrated it into my project successfully, but a business requirement now forces me to reuse the audio recorded by Gracenote for other processing.

The problem: since GNAudioSourceMic encapsulates the entire microphone recording operation (startRecording/stopRecording), I have no access to the raw microphone audio.

Here is the code I'm using:

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setNeedsStatusBarAppearanceUpdate];
    [self setupUI];

    @try {
        self.config = [GNConfig init:GRACENOTE_CLIENTID];
    }
    @catch (NSException *e) {
        NSLog(@"%s clientId can't be nil or the empty string", __PRETTY_FUNCTION__);
        [self.view setUserInteractionEnabled:FALSE];
        return;
    }

    // Debug is disabled in the GUI by default
#ifdef DEBUG
    [self.config setProperty:@"debugEnabled" value:@"1"];
#else
    [self.config setProperty:@"debugEnabled" value:@"0"];
#endif
    [self.config setProperty:@"lookupmodelocalonly" value:@"0"];

    // -------------------------------------------------------------------------------
    // Init AudioSource to Start Recording.
    // -------------------------------------------------------------------------------

    self.recognizeFromPCM = [GNRecognizeStream gNRecognizeStream:self.config];
    self.audioConfig = [GNAudioConfig gNAudioConfigWithSampleRate:44100 bytesPerSample:2 numChannels:1];

    self.objAudioSource = [GNAudioSourceMic gNAudioSourceMic:self.audioConfig];
    self.objAudioSource.delegate = self;

    NSError *err;

    RecognizeStreamOperation *op = [RecognizeStreamOperation recognizeStreamOperation:self.config];
    op.viewControllerDelegate = self;
    err = [self.recognizeFromPCM startRecognizeSession:op audioConfig:self.audioConfig];

    if (err) {
        NSLog(@"ERROR: %@", [err localizedDescription]);
    }

    [self.objAudioSource startRecording];

    [self performSelectorInBackground:@selector(setUpRecognizePCMSession) withObject:nil];
}

- (void)startRecordMicrophone
{
#ifdef DEBUG
    NSLog(@"%s startRecording", __PRETTY_FUNCTION__);
#endif

    NSError *error;
    error = [self.recognizeFromPCM idNow];

    if (error) {
        NSLog(@"ERROR: %@", [error localizedDescription]);
    }
}

Has anyone faced the same requirement?

Thanks in advance.

Best Answer

After a lot of digging yesterday I came up with a solution. It isn't what I originally expected, but it works as well as I'd hoped: I decided to record from the iOS microphone myself and then call a method on the Gracenote SDK to identify what I've just recorded.

Here is what worked for me.

MicrophoneInput.h

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface MicrophoneInput : UIViewController {
    AVAudioPlayer *audioPlayer;
    AVAudioRecorder *audioRecorder;
    int recordEncoding;
    enum {
        ENC_AAC  = 1,
        ENC_ALAC = 2,
        ENC_IMA4 = 3,
        ENC_ILBC = 4,
        ENC_ULAW = 5,
        ENC_PCM  = 6,
    } encodingTypes;
}

- (IBAction)startRecording;
- (IBAction)stopRecording;

@end

MicrophoneInput.m
#import "MicrophoneInput.h"


@implementation MicrophoneInput

- (void)viewDidLoad
{
[super viewDidLoad];
recordEncoding = ENC_PCM;
}

-(IBAction) startRecording
{
NSLog(@"startRecording");
[audioRecorder release];
audioRecorder = nil;

// Init audio with record capability
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:nil];

NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] initWithCapacity:10];
recordSettings[AVFormatIDKey] = @(kAudioFormatLinearPCM);
recordSettings[AVSampleRateKey] = @8000.0f;
recordSettings[AVNumberOfChannelsKey] = @1;
recordSettings[AVLinearPCMBitDepthKey] = @16;
recordSettings[AVLinearPCMIsBigEndianKey] = @NO;
recordSettings[AVLinearPCMIsFloatKey] = @NO;

//set the export session's outputURL to <Documents>/output.caf
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = paths[0];
NSURL* outURL = [NSURL fileURLWithPath:[documentsDirectory stringByAppendingPathComponent:@"output.caf"]];
[[NSFileManager defaultManager] removeItemAtURL:outURL error:nil];
NSLog(@"url loc is %@", outURL);

NSError *error = nil;
audioRecorder = [[ AVAudioRecorder alloc] initWithURL:outURL settings:recordSettings error:&error];

if ([audioRecorder prepareToRecord] == YES){
[audioRecorder record];
}else {
int errorCode = CFSwapInt32HostToBig ([error code]);
NSLog(@"Error: %@ [%4.4s])" , [error localizedDescription], (char*)&errorCode);

}
NSLog(@"recording");
}

-(IBAction) stopRecording
{
NSLog(@"stopRecording");
[audioRecorder stop];
NSLog(@"stopped");
}


- (void)dealloc
{
[audioPlayer release];
[audioRecorder release];
[super dealloc];
}


@end
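
One practical note: on iOS 7 and later the system asks for microphone permission, and until the user grants it the recorder captures only silence. It may be worth requesting access explicitly before the first recording. A minimal sketch using the standard AVAudioSession API (my addition, not part of the original answer):

// Request microphone access up front (iOS 7+). Until the user grants it,
// AVAudioRecorder will run but record only silence.
AVAudioSession *session = [AVAudioSession sharedInstance];
[session requestRecordPermission:^(BOOL granted) {
    if (!granted) {
        NSLog(@"Microphone permission denied; recording will be silent");
    }
}];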

Note: if your project uses ARC, don't forget to add the -fno-objc-arc compiler flag for MicrophoneInput.m under the Compile Sources build phase. (The original answer included a screenshot of this build-phase setting.)
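As an alternative to the per-file compiler flag, you can make MicrophoneInput.m compile under both ARC and MRC by guarding the manual memory-management calls with the __has_feature check. A sketch of the idiom (my suggestion, not part of the original answer; MI_RELEASE is a name I made up):

// Compile-time guard: expands to [obj release] only when ARC is off.
#if !__has_feature(objc_arc)
  #define MI_RELEASE(obj) [obj release]
#else
  #define MI_RELEASE(obj)
#endif

// In -startRecording, replace [audioRecorder release] with:
MI_RELEASE(audioRecorder);
audioRecorder = nil;

// And keep -dealloc MRC-only, since calling [super dealloc] is illegal under ARC:
#if !__has_feature(objc_arc)
- (void)dealloc
{
    MI_RELEASE(audioPlayer);
    MI_RELEASE(audioRecorder);
    [super dealloc];
}
#endif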
YourViewController.h
//Libraries
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

//Echonest Codegen
#import "MicrophoneInput.h"

//GracenoteMusic
#import <GracenoteMusicID/GNRecognizeStream.h>
#import <GracenoteMusicID/GNAudioSourceMic.h>
#import <GracenoteMusicID/GNAudioConfig.h>
#import <GracenoteMusicID/GNCacheStatus.h>
#import <GracenoteMusicID/GNConfig.h>
#import <GracenoteMusicID/GNSampleBuffer.h>
#import <GracenoteMusicID/GNOperations.h>
#import <GracenoteMusicID/GNSearchResponse.h>

@interface YourViewController : UIViewController <GNSearchResultReady>

@end

YourViewController.m
#import "YourViewController.h"

@interface YourViewController ()
//Record
@property(strong,nonatomic) MicrophoneInput* recorder;
@property (strong,nonatomic) GNConfig *config;
@end

@implementation YourViewController


#pragma mark - UIViewController lifecycle

- (void)viewDidLoad
{
[super viewDidLoad];
self.recorder = [[MicrophoneInput alloc] init];

@try {
self.config = [GNConfig init:GRACENOTE_CLIENTID];
}
@catch (NSException * e) {
NSLog(@"%s clientId can't be nil or the empty string",__PRETTY_FUNCTION__);
[self.view setUserInteractionEnabled:FALSE];
return;
}

// Debug is disabled in the GUI by default
#ifdef DEBUG
[self.config setProperty:@"debugEnabled" value:@"1"];
#else
[self.config setProperty:@"debugEnabled" value:@"0"];
#endif
[self.config setProperty:@"lookupmodelocalonly" value:@"0"];
}

-(void)viewDidAppear:(BOOL)animated{
[self performSelectorInBackground:@selector(startRecordMicrophone) withObject:nil];
}

-(void) startRecordMicrophone{
#ifdef DEBUG
NSLog(@"%s startRecording",__PRETTY_FUNCTION__);
#endif
[self.recorder startRecording];
[self performSelectorOnMainThread:@selector(makeMyProgressBarMoving) withObject:nil waitUntilDone:NO];
}

-(void) stopRecordMicrophone{
#ifdef DEBUG
NSLog(@"%s stopRecording",__PRETTY_FUNCTION__);
#endif
[self.recorder stopRecording];

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = paths[0];
NSString *filePath =[documentsDirectory stringByAppendingPathComponent:@"output.caf"];

NSData* sampleData = [[NSData alloc] initWithContentsOfFile:filePath];
GNSampleBuffer *sampleBuffer = [GNSampleBuffer gNSampleBuffer:sampleData numChannels:1 sampleRate:8000];
[GNOperations recognizeMIDStreamFromPcm:self config:self.config sampleBuffer:sampleBuffer];
}

#pragma mark - UI methods

-(void)makeMyProgressBarMoving {

float actual = [self.progressBar progress];
if (actual < 1) {
[self.loadingAnimationView showNextLevel];
self.progressBar.progress = actual + 0.0125;
[NSTimer scheduledTimerWithTimeInterval:0.25f target:self selector:@selector(makeMyProgressBarMoving) userInfo:nil repeats:NO];
}
else{
self.progressBar.hidden = YES;
[self stopRecordMicrophone];
}

}

#pragma mark - GNSearchResultReady methods
- (void) GNResultReady:(GNSearchResult*)result{
NSLog(@"%s",__PRETTY_FUNCTION__);
}

@end
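
One detail worth flagging in stopRecordMicrophone: -initWithContentsOfFile: returns the entire .caf file, container header included, and that blob is handed to GNSampleBuffer as if it were bare PCM. It worked in practice here, but if you want to pass only the audio frames, Core Audio's ExtAudioFile API can decode the container. A sketch under the same assumptions as above (8 kHz, mono, 16-bit; the helper name ReadPCMData is mine, not part of either SDK):

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

// Reads only the PCM frames out of a CAF file, skipping the container header.
// Assumes 8 kHz, mono, 16-bit signed-integer samples, matching the recorder above.
static NSData *ReadPCMData(NSURL *url)
{
    ExtAudioFileRef file = NULL;
    if (ExtAudioFileOpenURL((__bridge CFURLRef)url, &file) != noErr) {
        return nil;
    }

    // Ask ExtAudioFile to deliver interleaved 16-bit integer PCM at 8 kHz.
    AudioStreamBasicDescription fmt = {0};
    fmt.mSampleRate       = 8000.0;
    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    fmt.mChannelsPerFrame = 1;
    fmt.mBitsPerChannel   = 16;
    fmt.mBytesPerFrame    = 2;
    fmt.mFramesPerPacket  = 1;
    fmt.mBytesPerPacket   = 2;
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat, sizeof(fmt), &fmt);

    NSMutableData *pcm = [NSMutableData data];
    int16_t samples[4096];

    for (;;) {
        AudioBufferList buffers;
        buffers.mNumberBuffers = 1;
        buffers.mBuffers[0].mNumberChannels = 1;
        buffers.mBuffers[0].mDataByteSize = sizeof(samples);
        buffers.mBuffers[0].mData = samples;

        UInt32 frames = sizeof(samples) / sizeof(samples[0]);
        if (ExtAudioFileRead(file, &frames, &buffers) != noErr || frames == 0) {
            break;
        }
        [pcm appendBytes:samples length:frames * sizeof(int16_t)];
    }
    ExtAudioFileDispose(file);
    return pcm;
}

The NSData it returns could then replace the raw file contents in the gNSampleBuffer:numChannels:sampleRate: call.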

For the MicrophoneInput solution, credit goes to Brian Whitman and the Echo Nest Library.

Hope this helps anyone facing the same situation.

Cheers

Regarding "ios - GNAudioSourceMic raw audio location", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/23898280/
