
ios - How to handle a Protocol Delegate when converting Objective-C to Swift

Reposted. Author: 行者123. Updated: 2023-11-30 13:40:01

I am trying to convert speech recognition code to Swift. The protocol is defined in ViewController.h as follows:

@interface ViewController : UIViewController <SpeechRecognitionProtocol>
{
    NSMutableString* textOnScreen;
    DataRecognitionClient* dataClient;
    MicrophoneRecognitionClient* micClient;
    SpeechRecognitionMode recoMode;
    bool isMicrophoneReco;
    bool isIntent;
    int waitSeconds;
}

I am having trouble converting the following call in ViewController.h:

micClient = [SpeechRecognitionServiceFactory createMicrophoneClient:(recoMode)
                                                       withLanguage:(language)
                                                            withKey:(primaryOrSecondaryKey)
                                                       withProtocol:(self)];

This factory method is defined in SpeechSDK.framework as:

@interface SpeechRecognitionServiceFactory : NSObject
/*
 @param delegate The protocol used to perform the callbacks/events upon during speech recognition.
*/
+ (MicrophoneRecognitionClient*)createMicrophoneClient:(SpeechRecognitionMode)speechRecognitionMode
                                          withLanguage:(NSString*)language
                                               withKey:(NSString*)primaryOrSecondaryKey
                                          withProtocol:(id<SpeechRecognitionProtocol>)delegate;
@end

The protocol looks like this in my converted ViewController.swift:

import UIKit

protocol SpeechRecognitionProtocol {
    func onIntentReceived(result: IntentResult)
    func onPartialResponseReceived(response: String)
    func onFinalResponseReceived(response: RecognitionResult)
    func onError(errorMessage: String, withErrorCode errorCode: Int)
    func onMicrophoneStatus(recording: DarwinBoolean)
    func initializeRecoClient()
}

class ViewController: UIViewController, SpeechRecognitionProtocol {
    var myDelegate: SpeechRecognitionProtocol?

Finally, I call this function in ViewController.swift. I get the following error after withProtocol: Cannot convert value of type 'SpeechRecognitionProtocol.Protocol' to expected argument type 'SpeechRecognitionProtocol!':

func initializeRecoClient() {
    let language: String = "en-us"
    let path: String = NSBundle.mainBundle().pathForResource("settings", ofType: "plist")!
    let settings = NSDictionary(contentsOfFile: path)
    let primaryOrSecondaryKey = settings?.objectForKey("primaryKey") as! String

    micClient = SpeechRecognitionServiceFactory.createMicrophoneClient(recoMode!,
        withLanguage: language,
        withKey: primaryOrSecondaryKey,
        withProtocol: SpeechRecognitionProtocol)
}

Best Answer

You should not declare SpeechRecognitionProtocol yourself (I'm not sure whether you added it just for demonstration purposes or actually have it in your code). SpeechRecognitionProtocol is already declared in SpeechRecognitionService.h and is visible to Swift - that is the protocol you need to use.

The object that implements the protocol is the ViewController itself. Assuming initializeRecoClient is a method of that class, the call needs to look like this:

micClient = SpeechRecognitionServiceFactory.createMicrophoneClient(recoMode!,
    withLanguage: language,
    withKey: primaryOrSecondaryKey,
    withProtocol: self)

The SpeechSDK API did not pick a particularly good name for this factory method: despite what the label suggests, the withProtocol parameter does not take the protocol type itself, but an object that implements the protocol.
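The distinction can be boiled down to a small pure-Swift sketch (all names here are illustrative, not part of the SpeechSDK):

protocol GreeterDelegate {
    func didGreet(message: String)
}

class Listener: GreeterDelegate {
    func didGreet(message: String) { print(message) }
}

// A parameter declared in Objective-C as id<GreeterDelegate>
// surfaces in Swift as GreeterDelegate and expects an instance.
func register(delegate: GreeterDelegate) {
    delegate.didGreet(message: "hello")
}

register(delegate: Listener())       // OK: a conforming instance
// register(delegate: GreeterDelegate)  // error: cannot convert value of type
//                                      // 'GreeterDelegate.Protocol' to 'GreeterDelegate'

Passing the type name GreeterDelegate hands the compiler a metatype value (GreeterDelegate.Protocol), which is exactly the mismatch reported in the question.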

P.S.: I'm not sure which version of the Speech API you are using; I had to implement these Swift methods to make ViewController conform to SpeechRecognitionProtocol:

func onPartialResponseReceived(response: String!) {}
func onFinalResponseReceived(response: RecognitionResult) {}
func onError(errorMessage: String!, withErrorCode errorCode: Int32) {}
func onMicrophoneStatus(recording: Bool) {}
func onIntentReceived(result: IntentResult) {}
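Putting the pieces together, a minimal sketch of the conforming controller (assuming the Swift 2-era SpeechSDK signatures shown above; property names like micClient and recoMode follow the question's code):

import UIKit

class ViewController: UIViewController, SpeechRecognitionProtocol {
    var micClient: MicrophoneRecognitionClient?
    var recoMode: SpeechRecognitionMode?

    func initializeRecoClient() {
        let language = "en-us"
        let path = NSBundle.mainBundle().pathForResource("settings", ofType: "plist")!
        let settings = NSDictionary(contentsOfFile: path)
        let primaryOrSecondaryKey = settings?.objectForKey("primaryKey") as! String

        micClient = SpeechRecognitionServiceFactory.createMicrophoneClient(recoMode!,
            withLanguage: language,
            withKey: primaryOrSecondaryKey,
            withProtocol: self)   // pass the conforming instance, not the protocol type
    }

    // SpeechRecognitionProtocol callbacks
    func onPartialResponseReceived(response: String!) {}
    func onFinalResponseReceived(response: RecognitionResult) {}
    func onError(errorMessage: String!, withErrorCode errorCode: Int32) {}
    func onMicrophoneStatus(recording: Bool) {}
    func onIntentReceived(result: IntentResult) {}
}

Note that no separate myDelegate property is needed: the controller registers itself as the delegate at the point of the factory call.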

Regarding "ios - How to handle a Protocol Delegate when converting Objective-C to Swift", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/35683770/
