Early on in my journey of learning AudioKit and scaling it up inside a larger application, I took the standard advice that AudioKit should effectively be a global singleton. I managed to build a very complex prototype, and everything worked fine.

Once I started scaling up and getting closer to an actual release, we decided to adopt MVVM for our architecture and to avoid having a monstrous AudioKit singleton handling every aspect of the app's audio needs. In short, MVVM has been very elegant and has noticeably cleaned up our codebase.

As it relates directly to our AudioKit structure, it goes like this:

AudioKit and AKMixer reside in a singleton instance, which exposes functions that allow the various view models and our other audio models to attach and detach nodes (AKPlayer, AKSampler, etc...). In the minimal testing I have done, I can confirm this works when tried with my AKPlayer module, and it works great.

I have run into an issue where, for the life of me, I cannot get AKNodeOutputPlot to work, even though the actual code implementation is identical to my working prototype.

My concern is whether I was wrong in thinking I could modularize AudioKit and the various nodes and components that need to connect to it, or whether AKNodeOutputPlot has special requirements I am not aware of.

Below are the shortest code snippets I can provide without overwhelming the question:

The AudioKit singleton (initialized in the AppDelegate):
import Foundation
import AudioKit

class AudioKitConfigurator
{
    static let shared: AudioKitConfigurator = AudioKitConfigurator()
    private let mainMixer: AKMixer = AKMixer()

    private init()
    {
        makeMainMixer()
        configureAudioKitSettings()
        startAudioEngine()
    }

    deinit
    {
        stopAudioEngine()
    }

    private func makeMainMixer()
    {
        AudioKit.output = mainMixer
    }

    func mainMixer(add node: AKNode)
    {
        mainMixer.connect(input: node)
    }

    func mainMixer(remove node: AKNode)
    {
        node.detach()
    }

    private func configureAudioKitSettings()
    {
        AKAudioFile.cleanTempDirectory()
        AKSettings.defaultToSpeaker = true
        AKSettings.playbackWhileMuted = true
        AKSettings.bufferLength = .medium
        do
        {
            try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
        }
        catch
        {
            AKLog("Could not set session category.")
        }
    }

    private func startAudioEngine()
    {
        do
        {
            try AudioKit.start()
        }
        catch
        {
            AKLog("Fatal Error: AudioKit did not start!")
        }
    }

    private func stopAudioEngine()
    {
        do
        {
            try AudioKit.stop()
        }
        catch
        {
            AKLog("Fatal Error: AudioKit did not stop!")
        }
    }
}
The Microphone component:
import Foundation
import AudioKit
import AudioKitUI

enum MicErrorsToThrow: String, Error
{
    case recordingTooShort = "The recording was too short, just silently failing"
    case audioFileFailedToUnwrap = "The Audio File failed to Unwrap from the recorder"
    case recorderError = "The Recorder was unable to start recording."
    case recorderCantReset = "In attempt to reset the recorder, it was unable to"
}

class Microphone
{
    private var mic: AKMicrophone = AKMicrophone()
    private var micMixer: AKMixer = AKMixer()
    private var micBooster: AKBooster = AKBooster()
    private var recorder: AKNodeRecorder!
    private var recordingTimer: Timer

    init()
    {
        micMixer = AKMixer(mic)
        micBooster = AKBooster(micMixer)
        micBooster.gain = 0
        recorder = try? AKNodeRecorder(node: micMixer)
        //TODO: Need to finish the recording timer implementation, leaving blank for now
        recordingTimer = Timer(timeInterval: 120, repeats: false, block: { (timer) in
        })
        AudioKitConfigurator.shared.mainMixer(add: micBooster)
    }

    deinit {
        // removeComponent()
    }

    public func removeComponent()
    {
        AudioKitConfigurator.shared.mainMixer(remove: micBooster)
    }

    public func reset() throws
    {
        if recorder.isRecording
        {
            recorder.stop()
        }
        do
        {
            try recorder.reset()
        }
        catch
        {
            AKLog("Recorder can't reset!")
            throw MicErrorsToThrow.recorderCantReset
        }
    }

    public func setHeadphoneMonitoring()
    {
        // microphone will be monitored while recording
        // only if headphones are plugged
        if AKSettings.headPhonesPlugged {
            micBooster.gain = 1
        }
    }

    /// Start recording from mic, call this function when using in conjunction with a AKNodeOutputPlot so that it can display the waveform in realtime while recording
    ///
    /// - Parameter waveformPlot: AKNodeOutputPlot view object which displays waveform from recording
    /// - Throws: Only error to throw is from recorder property can't start recording, something wrong with microphone. Enum is MicErrorsToThrow.recorderError
    public func record(waveformPlot: AKNodeOutputPlot) throws
    {
        waveformPlot.node = mic
        do
        {
            try recorder.record()
            // self.recordingTimer.fire()
        }
        catch
        {
            print("Error recording!")
            throw MicErrorsToThrow.recorderError
        }
    }

    /// Stop the recorder, and get the recording as an AKAudioFile, necessary to call if you are using AKNodeOutputPlot
    ///
    /// - Parameter waveformPlot: AKNodeOutputPlot view object which displays waveform from recording
    /// - Returns: AKAudioFile
    /// - Throws: Two possible errors, recording was too short (right now is 0.0, but should probably be like 0.5 secs), or could not retrieve audio file from recorder, MicErrorsToThrow.audioFileFailedToUnwrap, MicErrorsToThrow.recordingTooShort
    public func stopRecording(waveformPlot: AKNodeOutputPlot) throws -> AKAudioFile
    {
        waveformPlot.pause()
        waveformPlot.node = nil
        recordingTimer.invalidate()
        if let tape = recorder.audioFile
        {
            if tape.duration > 0.0
            {
                recorder.stop()
                AKLog("Printing tape: CountOfFloatChannelData:\(tape.floatChannelData?.first?.count) | maxLevel:\(tape.maxLevel)")
                return tape
            }
            else
            {
                //TODO: This should be more gentle than an NSError, it's just that they managed to tap the button and tap again to record nothing, honestly duration should probably be like 0.5, or 1.0 even. But let's return some sort of "safe" error that doesn't require UI
                throw MicErrorsToThrow.recordingTooShort
            }
        }
        else
        {
            //TODO: need to return error here, could not recover audioFile from recorder
            AKLog("Can't retrieve or unwrap audioFile from recorder!")
            throw MicErrorsToThrow.audioFileFailedToUnwrap
        }
    }
}
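One small pattern worth noting in the Microphone component: MicErrorsToThrow uses String raw values as human-readable messages, so a catch site can surface error.rawValue directly instead of mapping error cases to strings. Here is a self-contained sketch of that pattern, with a renamed enum and a hypothetical startRecording stub so it runs without AudioKit:

```swift
// Sketch of the String-raw-value error pattern used by MicErrorsToThrow above.
// MicError and startRecording(recorderReady:) are illustrative stand-ins,
// not part of the original code.
enum MicError: String, Error {
    case recordingTooShort = "The recording was too short."
    case recorderError = "The recorder was unable to start recording."
}

func startRecording(recorderReady: Bool) throws {
    // Mimics Microphone.record(waveformPlot:) throwing when the recorder fails
    guard recorderReady else { throw MicError.recorderError }
}

do {
    try startRecording(recorderReady: false)
} catch let error as MicError {
    // rawValue doubles as the display message
    print(error.rawValue)
}
```

The trade-off is that the message is baked into the case, so this works best for developer-facing logs; localized UI strings would still need a separate mapping.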
Now, in my VC, the AKNodeOutputPlot is a view on the storyboard, connected via an IBOutlet. It renders on screen, is styled to my liking, and is definitely hooked up and working. The VC/VM also holds an instance property of my Microphone component. The idea is that while recording, we pass the nodeOutput object to the view model, which then calls the Microphone's record(waveformPlot: AKNodeOutputPlot) function, and waveformPlot.node = mic should then be enough to connect them. Sadly, this is not the case.
The view:
class ComposerVC: UIViewController, Storyboarded
{
    var coordinator: MainCoordinator?
    let viewModel: ComposerViewModel = ComposerViewModel()

    @IBOutlet weak var recordButton: RecordButton!
    @IBOutlet weak var waveformPlot: AKNodeOutputPlot! // Here is our waveformPlot object, again confirmed rendering and styled

    // MARK:- VC Lifecycle Methods
    override func viewDidLoad()
    {
        super.viewDidLoad()
        setupNavigationBar()
        setupConductorButton()
        setupRecordButton()
    }

    func setupWaveformPlot() {
        waveformPlot.plotType = .rolling
        waveformPlot.gain = 1.0
        waveformPlot.shouldFill = true
    }

    override func viewDidAppear(_ animated: Bool)
    {
        super.viewDidAppear(animated)
        setupWaveformPlot()
        self.didDismissComposerDetailToRootController()
    }

    // Upon touching the Record Button, it in turn will talk to ViewModel which will then call Microphone module to record and hookup waveformPlot.node = mic
    @IBAction func tappedRecordView(_ sender: Any)
    {
        self.recordButton.recording.toggle()
        self.recordButton.animateToggle()
        self.viewModel.tappedRecord(waveformPlot: waveformPlot)
        { (waveformViewModel, error) in
            if let waveformViewModel = waveformViewModel
            {
                self.segueToEditWaveForm()
                self.performSegue(withIdentifier: "composerToEditWaveForm", sender: waveformViewModel)
                //self.performSegue(withIdentifier: "composerToDetailSegue", sender: self)
            }
        }
    }
}
The view model:
import Foundation
import AudioKit
import AudioKitUI

class ComposerViewModel: ViewModelProtocol
{
    //MARK:- Instance Variables
    var recordingState: RecordingState
    var mic: Microphone = Microphone()

    init()
    {
        self.recordingState = .readyToRecord
    }

    func resetViewModel()
    {
        self.resetRecorder()
    }

    func resetRecorder()
    {
        do
        {
            try mic.reset()
        }
        catch let error as MicErrorsToThrow
        {
            switch error {
            case .audioFileFailedToUnwrap:
                print(error)
            case .recorderCantReset:
                print(error)
            case .recorderError:
                print(error)
            case .recordingTooShort:
                print(error)
            }
        }
        catch {
            print("Secondary catch in start recording?!")
        }
        recordingState = .readyToRecord
    }

    func tappedRecord(waveformPlot: AKNodeOutputPlot, completion: ((EditWaveFormViewModel?, Error?) -> ())? = nil)
    {
        switch recordingState
        {
        case .readyToRecord:
            self.startRecording(waveformPlot: waveformPlot)
        case .recording:
            self.stopRecording(waveformPlot: waveformPlot, completion: completion)
        case .finishedRecording: break
        }
    }

    func startRecording(waveformPlot: AKNodeOutputPlot)
    {
        recordingState = .recording
        mic.setHeadphoneMonitoring()
        do
        {
            try mic.record(waveformPlot: waveformPlot)
        }
        catch let error as MicErrorsToThrow
        {
            switch error {
            case .audioFileFailedToUnwrap:
                print(error)
            case .recorderCantReset:
                print(error)
            case .recorderError:
                print(error)
            case .recordingTooShort:
                print(error)
            }
        }
        catch {
            print("Secondary catch in start recording?!")
        }
    }
}
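One benefit of this MVVM split is that the view model's recording state machine can be exercised without any audio hardware. Below is a minimal, AudioKit-free sketch of the tappedRecord dispatch; RecordingState's definition is not shown in the question, so the enum here is an assumed stand-in, and the mic/plot side effects are stubbed out as comments:

```swift
// Hypothetical stand-in for the RecordingState enum referenced by the view model.
enum RecordingState {
    case readyToRecord, recording, finishedRecording
}

// Simplified mirror of ComposerViewModel.tappedRecord: it only advances state.
final class RecorderStateMachine {
    private(set) var state: RecordingState = .readyToRecord

    func tappedRecord() {
        switch state {
        case .readyToRecord:
            state = .recording         // startRecording(waveformPlot:) would run here
        case .recording:
            state = .finishedRecording // stopRecording(...) would run here
        case .finishedRecording:
            break                      // further taps are ignored, as in the view model
        }
    }
}
```

A tap while ready starts recording, the next tap stops it, and taps after finishing are ignored, matching the break in the view model's switch.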
I'm happy to provide more code, but I just don't want to waste anyone's time. The logic seems sound; I just feel like I'm missing something obvious, or have a complete misunderstanding of AudioKit + AKNodeOutputPlot + AKMicrophone.

Any ideas are welcome. Thanks!
Best answer
EDIT: AudioKit 4.6 fixed all of the issues! Highly encourage MVVM/modularization of AudioKit for your projects!
====
So, after a lot of experimentation, I came to a few conclusions:

In a separate project, I brought over my AudioKitConfigurator and Microphone classes, initialized them, hooked them up to an AKNodeOutputPlot, and it worked flawlessly.

In my much larger project, no matter what I did, I could not get the same classes to work.

For now, I am reverting to an older version of the app and will slowly add components back until it breaks again, updating the architecture piece by piece, since this problem is too complex and may be interacting with some other libraries. I have also downgraded from AudioKit 4.5.6 to AudioKit 4.5.3.

This is not a solution, but it is the only viable one for now. The good news is that it is entirely possible to structure AudioKit to work with an MVVM architecture.
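For anyone attempting the same modularization, the core of the approach is that the singleton only exposes attach/detach entry points while each component manages its own subgraph. Here is a library-free sketch of that contract; the protocol, class names, and string keys are illustrative, not AudioKit API:

```swift
// Hypothetical protocol capturing the mainMixer(add:)/mainMixer(remove:)
// contract that AudioKitConfigurator exposes.
protocol MixerHost: AnyObject {
    func attach(_ nodeName: String)
    func detach(_ nodeName: String)
}

// Stand-in for the singleton: tracks which component nodes are currently
// connected to the main mixer.
final class MixerRegistry: MixerHost {
    static let shared = MixerRegistry()
    private(set) var connected: Set<String> = []
    private init() {}

    func attach(_ nodeName: String) { connected.insert(nodeName) }
    func detach(_ nodeName: String) { connected.remove(nodeName) }
}

// A component attaches on creation and detaches when removed, mirroring
// how the Microphone class registers its micBooster.
final class MicComponent {
    private let host: MixerHost

    init(host: MixerHost = MixerRegistry.shared) {
        self.host = host
        host.attach("micBooster")
    }

    func removeComponent() { host.detach("micBooster") }
}
```

Injecting the host through the initializer (rather than reaching for the singleton inside every method) also makes components testable against a fake MixerHost.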
About "ios - AudioKit: AKNodeOutputPlot and AKMicrophone not working, possibly due to lifecycle or MVVM architecture decisions" — the original question can be found on Stack Overflow: https://stackoverflow.com/questions/54208134/