
swift - Thread 1: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)


My code is pasted below. I'm not sure what the problem is, but it produces the error in the title along with this fatal error:

unexpectedly found nil while unwrapping an Optional value

I've seen other answers for similar errors, but they all involve things like if let, which doesn't seem to be where my error occurs. The error message points to a line near the top of the file that reads "audioFile = try AVAudioFile(forReading: recordedAudioURL as URL)". (A small diagnostic sketch follows the code listing below.)

//
// PlaySoundsViewController+Audio.swift
// PitchPerfect
//
// Copyright © 2016 Udacity. All rights reserved.
//
import UIKit
import AVFoundation

extension PlaySoundsViewController: AVAudioPlayerDelegate {

    struct Alerts {
        static let DismissAlert = "Dismiss"
        static let RecordingDisabledTitle = "Recording Disabled"
        static let RecordingDisabledMessage = "You've disabled this app from recording your microphone. Check Settings."
        static let RecordingFailedTitle = "Recording Failed"
        static let RecordingFailedMessage = "Something went wrong with your recording."
        static let AudioRecorderError = "Audio Recorder Error"
        static let AudioSessionError = "Audio Session Error"
        static let AudioRecordingError = "Audio Recording Error"
        static let AudioFileError = "Audio File Error"
        static let AudioEngineError = "Audio Engine Error"
    }

    // raw values correspond to sender tags
    enum PlayingState { case Playing, NotPlaying }

    // MARK: Audio Functions

    func setupAudio() {
        // initialize (recording) audio file
        do {
            audioFile = try AVAudioFile(forReading: recordedAudioURL as URL)
        } catch {
            showAlert(title: Alerts.AudioFileError, message: String(describing: error))
        }
        print("Audio has been setup")
    }

    func playSound(rate: Float? = nil, pitch: Float? = nil, echo: Bool = false, reverb: Bool = false) {

        // initialize audio engine components
        audioEngine = AVAudioEngine()

        // node for playing audio
        audioPlayerNode = AVAudioPlayerNode()
        audioEngine.attach(audioPlayerNode)

        // node for adjusting rate/pitch
        let changeRatePitchNode = AVAudioUnitTimePitch()
        if let pitch = pitch {
            changeRatePitchNode.pitch = pitch
        }
        if let rate = rate {
            changeRatePitchNode.rate = rate
        }
        audioEngine.attach(changeRatePitchNode)

        // node for echo
        let echoNode = AVAudioUnitDistortion()
        echoNode.loadFactoryPreset(.multiEcho1)
        audioEngine.attach(echoNode)

        // node for reverb
        let reverbNode = AVAudioUnitReverb()
        reverbNode.loadFactoryPreset(.cathedral)
        reverbNode.wetDryMix = 50
        audioEngine.attach(reverbNode)

        // connect nodes
        if echo == true && reverb == true {
            connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, echoNode, reverbNode, audioEngine.outputNode)
        } else if echo == true {
            connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, echoNode, audioEngine.outputNode)
        } else if reverb == true {
            connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, reverbNode, audioEngine.outputNode)
        } else {
            connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, audioEngine.outputNode)
        }

        // schedule to play and start the engine!
        audioPlayerNode.stop()
        audioPlayerNode.scheduleFile(audioFile, at: nil) {

            var delayInSeconds: Double = 0

            if let lastRenderTime = self.audioPlayerNode.lastRenderTime, let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {

                if let rate = rate {
                    delayInSeconds = Double(self.audioFile.length - playerTime.sampleTime) / Double(self.audioFile.processingFormat.sampleRate) / Double(rate)
                } else {
                    delayInSeconds = Double(self.audioFile.length - playerTime.sampleTime) / Double(self.audioFile.processingFormat.sampleRate)
                }
            }

            // schedule a stop timer for when audio finishes playing
            self.stopTimer = Timer(timeInterval: delayInSeconds, target: self, selector: #selector(PlaySoundsViewController.stopAudio), userInfo: nil, repeats: false)
            RunLoop.main.add(self.stopTimer!, forMode: RunLoopMode.defaultRunLoopMode)
        }

        do {
            try audioEngine.start()
        } catch {
            showAlert(title: Alerts.AudioEngineError, message: String(describing: error))
            return
        }

        // play the recording!
        audioPlayerNode.play()
    }

    // MARK: Connect List of Audio Nodes

    func connectAudioNodes(nodes: AVAudioNode...) {
        for x in 0..<nodes.count-1 {
            audioEngine.connect(nodes[x], to: nodes[x+1], format: audioFile.processingFormat)
        }
    }

    func stopAudio() {

        if let stopTimer = stopTimer {
            stopTimer.invalidate()
        }

        configureUI(playState: .NotPlaying)

        if let audioPlayerNode = audioPlayerNode {
            audioPlayerNode.stop()
        }

        if let audioEngine = audioEngine {
            audioEngine.stop()
            audioEngine.reset()
        }
    }

    // MARK: UI Functions

    func configureUI(playState: PlayingState) {
        switch(playState) {
        case .Playing:
            setPlayButtonsEnabled(enabled: false)
            stopplaybackButton.isEnabled = true
        case .NotPlaying:
            setPlayButtonsEnabled(enabled: true)
            stopplaybackButton.isEnabled = false
        }
    }

    func setPlayButtonsEnabled(enabled: Bool) {
        snailButton.isEnabled = enabled
        chipmunkButton.isEnabled = enabled
        rabbitButton.isEnabled = enabled
        vaderButton.isEnabled = enabled
        echoButton.isEnabled = enabled
        reverbButton.isEnabled = enabled
    }

    func showAlert(title: String, message: String) {
        let alert = UIAlertController(title: title, message: message, preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: Alerts.DismissAlert, style: .default, handler: nil))
        self.present(alert, animated: true, completion: nil)
    }

}
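
The diagnostic sketch mentioned above: this is not part of the project, and it assumes recordedAudioURL is declared on the view controller as an implicitly unwrapped optional (e.g. var recordedAudioURL: URL!). Binding the optional first makes it clear whether the crash comes from implicitly unwrapping a nil URL rather than from AVAudioFile itself:

    // Diagnostic sketch only - hypothetical helper, not in the original file.
    // Assumes: var recordedAudioURL: URL! on PlaySoundsViewController.
    func setupAudioSafely() {
        guard let url = recordedAudioURL else {
            // If this prints, the URL was never handed to this screen before playback.
            print("recordedAudioURL is nil - it was never set before setupAudio() ran")
            return
        }
        do {
            audioFile = try AVAudioFile(forReading: url)
        } catch {
            showAlert(title: Alerts.AudioFileError, message: String(describing: error))
        }
    }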

Best Answer

I'm taking the same Udacity course, and I believe this happens because the mic is not supported in the iOS 8.0+ simulator. One possible workaround is to put this line in PlaySoundsViewController's viewDidLoad():

recordedAudioURL = Bundle.main.url(forResource: "yourSound", withExtension: "mp3")

...and then drag an mp3 named "yourSound.mp3" into your PitchPerfect project. It's a hack, but it stops the crash and lets you test the audio-modification buttons against that mp3; if you want to record your own samples in the app, you'll need an actual iOS device.
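
A slightly fuller sketch of that workaround, assuming PlaySoundsViewController declares recordedAudioURL as URL! and calls setupAudio() during its lifecycle (assumptions about code not shown in the question), with "yourSound" as a placeholder for whatever mp3 you drag into the project:

    // Workaround sketch: load a bundled mp3 instead of a fresh recording
    // so the playback screen can be exercised in the simulator.
    override func viewDidLoad() {
        super.viewDidLoad()

        // "yourSound" is a placeholder - use the name of the mp3 you added to the
        // PitchPerfect target. If the file is missing or not in the target, this
        // stays nil and setupAudio() will still crash on the implicit unwrap.
        if let bundledURL = Bundle.main.url(forResource: "yourSound", withExtension: "mp3") {
            recordedAudioURL = bundledURL
        }

        setupAudio()
    }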

Edit: I went back over my whole codebase and double-checked my audio settings. In the code, I had misplaced a method (prepare for segue) in PlaySoundsViewController, and on the audio side, once I set the microphone input to my Cinema Display's audio, recording worked in the simulator as well. Good luck.
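
For reference, the hand-off the edit refers to normally lives in the recording view controller rather than in PlaySoundsViewController. A minimal sketch of what it typically looks like; the "stopRecording" segue identifier and passing the file URL through sender are assumptions about code not shown here:

    // In RecordSoundsViewController (sketch): pass the recorded file's URL to the
    // playback screen when the segue fires. If this method ends up in the wrong
    // view controller, recordedAudioURL is never set and the implicit unwrap in
    // setupAudio() crashes.
    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if segue.identifier == "stopRecording" {
            let playSoundsVC = segue.destination as! PlaySoundsViewController
            playSoundsVC.recordedAudioURL = sender as! URL
        }
    }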

Regarding "swift - Thread 1: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/40295216/
