
c# - Unity with the Azure Speech SDK

Reposted. Author: 行者123. Updated: 2023-12-03 03:38:50

When I use the Azure Speech SDK in Unity and test it on my computer, it works fine: I can speak, it recognizes the speech and responds, everything works.

When I build for Android and iOS, it doesn't work. On both iOS and Android it reaches the recognition step but never attempts to recognize anything, and even a simple speech input to the SDK produces no result.

How can I fix this?

Here is the code, which works in the Unity editor and in a Windows build:


---------------------------------------------
void Start()
{
    anim = gameObject.GetComponent<Animator>();

    var config = SpeechConfig.FromSubscription("xxxxxxxxxxxx", "northeurope");

    _ = cred(config); // fire-and-forget: Start() cannot await
}

async Task cred(SpeechConfig config)
{
    texttest.GetComponent<Text>().text = config.ToString();

    var audioConfig = AudioConfig.FromDefaultMicrophoneInput();

    var synthesizer2 = new SpeechRecognizer(config, audioConfig);

    var result = await synthesizer2.RecognizeOnceAsync();

    var synthesizer = new SpeechSynthesizer(config);

    await SynthesizeAudioAsync(config, synthesizer2, result);
}

async Task SynthesizeAudioAsync(SpeechConfig config, SpeechRecognizer synthesizer2, SpeechRecognitionResult result)
{
    texttest.GetComponent<Text>().text = "syn1 " + result.Text;

    OutputSpeechRecognitionResult(result);
    if (result.Reason == ResultReason.RecognizedSpeech)
    {
        if (result.Text == "xx" || result.Text == "xx" || result.Text == "xx" || result.Text == "xx")
        {
            var synthesizer = new SpeechSynthesizer(config);

            anim.Play("helloAll", 0, 0);
            await synthesizer.SpeakTextAsync("Helloxx");
            chooseTopic(config, synthesizer, result.Text);
        }
    }
}

On iOS, it prints the following to the console:


--------------------------------------------------
CANCELED: Did you set the speech resource key and region values?
speakTest:OutputSpeechRecognitionResult(SpeechRecognitionResult)
<SynthesizeAudioAsync>d__10:MoveNext()


CANCELED: ErrorDetails=0x15 (SPXERR_MIC_ERROR)

[CALL STACK BEGIN]



3 UnityFramework 0x0000000109336810 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxMicrophonePumpBase9StartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 756

4 UnityFramework 0x000000010931c010 _ZN9Microsoft17CognitiveServices6Speech4Impl25ISpxDelegateAudioPumpImpl9StartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 84

5 UnityFramework 0x000000010932cc0c _ZN9Microsoft17CognitiveServices6Speech4Impl27CSpxAudioPumpDelegateHelperINS2_29CSpxDelegateToSharedPtrHelperINS2_13ISpxAudioPumpELb0EEEE17DelegateStartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 220

6 UnityFramework 0x0000000109325e1c _ZN9Microsoft17CognitiveServices6Speech4Impl41ISpxAudioSourceControlAdaptsAudioPumpImplINS2_32CSpxMicrophoneAudioSourceAdapterEE9StartPumpEv + 304

7 UnityFramework 0x0000000109325664 _ZN9Microsoft17CognitiveServices6Speech4Impl41ISpxAudioSourceControlAdaptsAudioPumpImplINS2_32CSpxMicrophoneAudioSourceAdapterEE10StartAudioENSt3__110shared_ptrINS2_12ISpxNotifyMeIJRKNS7_INS2_15ISpxAudioSourceEEERKNS7_INS2_14ISpxBufferDataEEEEEEEE + 184

8 UnityFramework 0x00000001093221d4 _ZN9Microsoft17CognitiveServices6Speech4Impl34ISpxAudioSourceControlDelegateImplINS2_29CSpxDelegateToSharedPtrHelperINS2_22ISpxAudioSourceControlELb0EEEE10StartAudioENSt3__110shared_ptrINS2_12ISpxNotifyMeIJRKNS9_INS2_15ISpxAudioSourceEEERKNS9_INS2_14ISpxBufferDataEEEEEEEE + 220

9 UnityFramework 0x00000001094a41f4 _ZN9Microsoft17CognitiveServices6Speech4Impl28CSpxSessionAudioSourceHelperINS2_20CSpxAudioSessionShimEE16StartAudioSourceERKNSt3__110shared_ptrINS2_15ISpxAudioSourceEEE + 504

10 UnityFramework 0x00000001094a0dbc _ZN9Microsoft17CognitiveServices6Speech4Impl28CSpxSessionAudioSourceHelperINS2_20CSpxAudioSessionShimEE22EnsureStartAudioSourceEv + 124

11 UnityFramework 0x0000000109408dcc _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession14StartAudioPumpENS3_15RecognitionKindENSt3__110shared_ptrINS2_12ISpxKwsModelEEE + 2300

12 UnityFramework 0x0000000109406760 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession16StartRecognizingENS3_15RecognitionKindENSt3__110shared_ptrINS2_12ISpxKwsModelEEE + 616

13 UnityFramework 0x0000000109406098 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession18RecognizeOnceAsyncERKNSt3__110shared_ptrINS3_9OperationEEENS5_INS2_12ISpxKwsModelEEE + 464

14 UnityFramework 0x0000000109424d4c _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession9OperationC2ENS3_15RecognitionKindE + 1040

15 UnityFramework 0x0000000109420af4 _ZN9Microsoft17CognitiveServices6Speech4Impl7SpxTermINS2_21ISpxAudioStreamReaderEEEvRKNSt3__110shared_ptrIT_EE + 2004

16 UnityFramework 0x0000000109354c28 _ZNSt3__113packaged_taskIFvvEEclEv + 96

17 UnityFramework 0x0000000109354bb4 _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService4Task3RunEv + 32

18 UnityFramework 0x00000001093566fc _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService6Thread7RunTaskINSt3__14pairINS6_10shared_ptrINS3_4TaskEEENS6_7promiseIbEEEEEEvRNS6_11unique_lockINS6_5mutexEEERNS6_5dequeIT_NS6_9allocatorISJ_EEEE + 332

19 UnityFramework 0x0000000109354d8c _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService6Thread10WorkerLoopENSt3__110shared_ptrIS4_EE + 216

[CALL STACK END]
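The `OutputSpeechRecognitionResult` method called in the code above is not shown, but the CANCELED lines in the log match the pattern from Microsoft's quickstart, which uses the SDK's `CancellationDetails` API to surface the exact error. A sketch along those lines (the `Debug.Log` output format is illustrative):

```
// Inspect why a recognition result was canceled (e.g. SPXERR_MIC_ERROR).
void OutputSpeechRecognitionResult(SpeechRecognitionResult result)
{
    switch (result.Reason)
    {
        case ResultReason.RecognizedSpeech:
            Debug.Log($"RECOGNIZED: {result.Text}");
            break;
        case ResultReason.NoMatch:
            Debug.Log("NOMATCH: speech could not be recognized.");
            break;
        case ResultReason.Canceled:
            var cancellation = CancellationDetails.FromResult(result);
            Debug.Log($"CANCELED: Reason={cancellation.Reason}");
            if (cancellation.Reason == CancellationReason.Error)
            {
                // On mobile builds, ErrorDetails often points to
                // microphone or permission problems.
                Debug.Log($"CANCELED: ErrorCode={cancellation.ErrorCode}");
                Debug.Log($"CANCELED: ErrorDetails={cancellation.ErrorDetails}");
                Debug.Log("CANCELED: Did you set the speech resource key and region values?");
            }
            break;
    }
}
```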

Best Answer

This problem is caused by configuration issues on iOS and Android. Check the configuration documentation for Android and for iOS.
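In particular, `SPXERR_MIC_ERROR` on device builds commonly means the app never obtained microphone permission at runtime. A minimal Unity sketch of requesting it (standard `UnityEngine` APIs; the class name is illustrative):

```
using System.Collections;
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class MicPermission : MonoBehaviour
{
    IEnumerator Start()
    {
#if UNITY_ANDROID
        // Android: RECORD_AUDIO must be granted at runtime (API 23+).
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            Permission.RequestUserPermission(Permission.Microphone);
        }
        yield return null;
#elif UNITY_IOS
        // iOS: the prompt only appears if NSMicrophoneUsageDescription
        // is present in Info.plist.
        yield return Application.RequestUserAuthorization(UserAuthorization.Microphone);
        if (!Application.HasUserAuthorization(UserAuthorization.Microphone))
        {
            Debug.Log("Microphone permission denied.");
        }
#else
        yield return null;
#endif
    }
}
```

Only after the permission is granted should the `SpeechRecognizer` be created from the default microphone input.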

There is a GitHub repository that works through a similar scenario; check it out:

https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart/csharp/unity/from-microphone

The iOS library would be developed in Objective-C and does not support any kind of JavaScript library. Similarly, on Android, Cordova would be used to handle JavaScript libraries. Neither platform supports the other's libraries. So check the configuration of the platform you are developing for, and the language of the supported libraries.

Cordova Platforms : android 7.1.4 ios 4.5.5
Ionic Framework : ionic-angular 3.9.2
iOS: Objective-C
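Whatever wrapper is used, a Unity build that captures from the microphone also needs the permission declared in the platform configuration. Typical entries (assumed; verify against your own project):

```
<!-- Android: Assets/Plugins/Android/AndroidManifest.xml -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />
```

```
<!-- iOS: Info.plist (in Unity, set "Microphone Usage Description"
     in Player Settings so this key is generated) -->
<key>NSMicrophoneUsageDescription</key>
<string>Speech recognition needs access to the microphone.</string>
```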

Regarding "c# - Unity with the Azure Speech SDK", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/72233781/
