swift - Trouble finding a replacement for "pointee" while converting from Swift 2 to Swift 3


I've been working on grasping the ideas behind AUAudioUnits and typed up in Xcode the example code given in the presentation video from Apple's WWDC 2016 introducing the topic. As it turns out, this code was written for Swift 2, while Swift 3 introduces a new way of dealing with pointers (as shown here and here). I'm still fairly new to programming in Swift, not familiar with some of its concepts, and I can't figure out how to do the Swift 2 to Swift 3 conversion manually. Even with the build setting

Use Legacy Swift Language Version = yes

I can't get it to run.

Here is the Swift 2 code, exactly as it appears in the video:

import Foundation
import AVFoundation


class SquareWaveGenerator {
    let sampleRate: Double
    let frequency: Double
    let amplitude: Float

    var counter: Double = 0.0

    init(sampleRate: Double, frequency: Double, amplitude: Float) {
        self.sampleRate = sampleRate
        self.frequency = frequency
        self.amplitude = amplitude
    }

    func render(buffer: AudioBuffer) {
        let nframes = Int(buffer.mDataByteSize) / sizeof(Float)
        var ptr = UnsafeMutablePointer<Float>(buffer.mData)

        var j = self.counter
        let cycleLength = self.sampleRate / self.frequency
        let halfCycleLength = cycleLength / 2

        let amp = self.amplitude, minusAmp = -amp

        for _ in 0..<nframes {
            if j < halfCycleLength {
                ptr.pointee = amp
            } else {
                ptr.pointee = minusAmp
            }
            ptr = ptr.successor()
            j += 1.0
            if (j > cycleLength) {
                j -= cycleLength
            }
        }

        self.counter = j
    }
}

func main() {
    // Create an AudioComponentDescription for the input/output unit we want to use.
    #if os(iOS)
        let kOutputUnitSubType = kAudioUnitSubType_RemoteIO
    #else
        let kOutputUnitSubType = kAudioUnitSubType_HALOutput
    #endif

    let ioUnitDesc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kOutputUnitSubType,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    let ioUnit = try! AUAudioUnit(componentDescription: ioUnitDesc, options: AudioComponentInstantiationOptions())

    /*
     Set things up to render at the same sample rate as the hardware,
     up to 2 channels. Note that the hardware format may not be a standard
     format, so we make a separate render format with the same sample rate
     and the desired channel count.
     */
    let hardwareFormat = ioUnit.outputBusses[0].format
    let renderFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareFormat.sampleRate, channels: min(2, hardwareFormat.channelCount))

    try! ioUnit.inputBusses[0].setFormat(renderFormat)

    // Create square wave generators.
    let generatorLeft = SquareWaveGenerator(sampleRate: renderFormat.sampleRate, frequency: 440.0, amplitude: 0.1)
    let generatorRight = SquareWaveGenerator(sampleRate: renderFormat.sampleRate, frequency: 440.0, amplitude: 0.1)

    // Install a block which will be called to render.
    ioUnit.outputProvider = { (actionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>, timestamp: UnsafePointer<AudioTimeStamp>, frameCount: AUAudioFrameCount, busIndex: Int, rawBufferList: UnsafeMutablePointer<AudioBufferList>) -> AUAudioUnitStatus in

        let bufferList = UnsafeMutableAudioBufferListPointer(rawBufferList)
        if bufferList.count > 0 {
            generatorLeft.render(bufferList[0])
            if bufferList.count > 1 {
                generatorRight.render(bufferList[1])
            }
        }

        return noErr
    }

    // Allocate render resources, then start the audio hardware.
    try! ioUnit.allocateRenderResources()

    try! ioUnit.startHardware()

    sleep(3)
    ioUnit.stopHardware()
}

main()

This piece of code:

ptr.pointee = amp
[...]
ptr.pointee = minusAmp

throws the following error:

Value of type 'UnsafeMutablePointer' has no member 'pointee'

Since I couldn't resolve this by hand, I tried manually converting the code to Swift 3, hoping the problem would sort itself out. Here it is:

import Foundation
import AVFoundation


class SquareWaveGenerator {
    let sampleRate: Double
    let frequency: Double
    let amplitude: Float

    var counter: Double = 0.0

    init(sampleRate: Double, frequency: Double, amplitude: Float) {
        self.sampleRate = sampleRate
        self.frequency = frequency
        self.amplitude = amplitude
    }

    func render(buffer: AudioBuffer) {
        let nframes = Int(buffer.mDataByteSize) / MemoryLayout<Float>.size
        var ptr = buffer.mData

        var j = self.counter
        let cycleLength = self.sampleRate / self.frequency
        let halfCycleLength = cycleLength / 2

        let amp = self.amplitude, minusAmp = -amp

        for _ in 0..<nframes {
            if j < halfCycleLength {
                ptr?.pointee = amp
            } else {
                ptr?.pointee = minusAmp
            }
            ptr = ptr?.advanced(by: 1)
            j += 1.0
            if (j > cycleLength) {
                j -= cycleLength
            }
        }

        self.counter = j
    }
}

func main() {
    // Create an AudioComponentDescription for the input/output unit we want to use.
    #if os(iOS)
        let kOutputUnitSubType = kAudioUnitSubType_RemoteIO
    #else
        let kOutputUnitSubType = kAudioUnitSubType_HALOutput
    #endif

    let ioUnitDesc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kOutputUnitSubType,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    let ioUnit = try! AUAudioUnit(componentDescription: ioUnitDesc, options: AudioComponentInstantiationOptions())

    /*
     Set things up to render at the same sample rate as the hardware,
     up to 2 channels. Note that the hardware format may not be a standard
     format, so we make a separate render format with the same sample rate
     and the desired channel count.
     */
    let hardwareFormat = ioUnit.outputBusses[0].format
    let renderFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareFormat.sampleRate, channels: min(2, hardwareFormat.channelCount))

    try! ioUnit.inputBusses[0].setFormat(renderFormat)

    // Create square wave generators.
    let generatorLeft = SquareWaveGenerator(sampleRate: renderFormat.sampleRate, frequency: 440.0, amplitude: 0.1)
    let generatorRight = SquareWaveGenerator(sampleRate: renderFormat.sampleRate, frequency: 440.0, amplitude: 0.1)

    // Install a block which will be called to render.
    ioUnit.outputProvider = { (actionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>, timestamp: UnsafePointer<AudioTimeStamp>, frameCount: AUAudioFrameCount, busIndex: Int, rawBufferList: UnsafeMutablePointer<AudioBufferList>) -> AUAudioUnitStatus in

        let bufferList = UnsafeMutableAudioBufferListPointer(rawBufferList)
        if bufferList.count > 0 {
            generatorLeft.render(buffer: bufferList[0])
            if bufferList.count > 1 {
                generatorRight.render(buffer: bufferList[1])
            }
        }

        return noErr
    }

    // Allocate render resources, then start the audio hardware.
    try! ioUnit.allocateRenderResources()

    try! ioUnit.startHardware()

    sleep(3)
    ioUnit.stopHardware()
}

main()

where I again run into the error mentioned above:

Value of type 'UnsafeMutablePointer' has no member 'pointee'

Finally, it occurred to me that something like

ptr?.storeBytes(of: T, as: T.Type)

should be able to replace the "pointee" construct. If I understand correctly, "T" is the value I want to store at the pointer's location. In my case that would be "amp", which is of type Float.

But no matter what I do, I can't get the code to run. It just won't accept anything along these lines:

ptr?.storeBytes(of: amp, as: Float())

throws

Cannot convert value of type 'Float' to expected argument type 'T.Type'

ptr?.storeBytes(of: amp, as: Float.self)

no longer throws an error right away and compiles fine, but at runtime I get the lldb error message

fatal error: storeBytes to misaligned raw pointer

Essentially, I don't know what I'm doing, I don't understand the concept of "T.Type" in this context, and I'm stuck. So I have two questions:

1) How do I solve this issue and get the code running?

2) Where can I learn more about these kinds of constructions à la "T.Type", so that I can understand what they are and what they mean?

Best Answer

What you're running into here is that while AudioBuffer.mData used to be an UnsafeMutablePointer, it is now an UnsafeMutableRawPointer, which is new in Swift 3. To work with that data the way you did before, you can bind the memory it references to the Float type, like so:

guard let mData = buffer.mData else { return /* or error */ }
let nframes = Int(buffer.mDataByteSize) / MemoryLayout<Float>.size
var ptr = mData.bindMemory(to: Float.self, capacity: nframes)

Now ptr is an UnsafeMutablePointer<Float>, which is what you were working with before, and you should be able to access its pointee property without any problems.

Note: whenever you see T.Type in a function declaration, it is asking for the type itself rather than an instance of that type. In this case you pass the type Float, which is written Float.self. Calling Float(), on the other hand, creates a new Float instance.
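
To make that distinction concrete, here is a minimal sketch; the describe function and its names are made up purely for illustration and are not part of the original code:

// A generic function whose second parameter is a metatype (T.Type),
// mirroring the shape of UnsafeMutableRawPointer.storeBytes(of:as:).
func describe<T>(_ value: T, as type: T.Type) -> String {
    return "\(value) stored as \(type)"
}

print(describe(Float(0.1), as: Float.self)) // pass the type itself: Float.self
// print(describe(Float(0.1), as: Float())) // error: Float() is an instance, not a type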

Finally, instead of continuing to work with ptr directly, I would create a buffer pointer, which at least gives you bounds checking in debug mode and a nicer interface:

let buffer = UnsafeMutableBufferPointer<Float>(start: ptr, count: nframes)
// ...
for i in 0..<nframes {
    if j < halfCycleLength {
        buffer[i] = amp
    } else {
        buffer[i] = minusAmp
    }
    j += 1.0
    // ...
}
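
Putting the answer's two suggestions together, the render(buffer:) method from the question could look roughly like the sketch below in Swift 3 (this is an assembled example, not code from the original post). As a side note, the "storeBytes to misaligned raw pointer" crash in the raw-pointer attempt most likely comes from advanced(by: 1) moving an UnsafeMutableRawPointer forward by one byte rather than one Float, so the next four-byte store lands on an unaligned address; binding the memory to Float and indexing whole elements avoids that entirely.

func render(buffer: AudioBuffer) {
    guard let mData = buffer.mData else { return }

    // Bind the raw audio memory to Float, then wrap it in a buffer
    // pointer for bounds-checked, indexed access.
    let nframes = Int(buffer.mDataByteSize) / MemoryLayout<Float>.size
    let ptr = mData.bindMemory(to: Float.self, capacity: nframes)
    let samples = UnsafeMutableBufferPointer<Float>(start: ptr, count: nframes)

    var j = self.counter
    let cycleLength = self.sampleRate / self.frequency
    let halfCycleLength = cycleLength / 2
    let amp = self.amplitude, minusAmp = -amp

    for i in 0..<nframes {
        samples[i] = j < halfCycleLength ? amp : minusAmp
        j += 1.0
        if j > cycleLength {
            j -= cycleLength
        }
    }

    self.counter = j
}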

Regarding swift - Trouble finding a replacement for "pointee" while converting from Swift 2 to Swift 3, the original question can be found on Stack Overflow: https://stackoverflow.com/questions/41308790/
