
ios - Playing audio from the microphone


Goal: stream audio/video from one device to another.

Problem: I managed to get both audio and video across, but the audio will not play on the receiving end.


Details:

I have built an app that streams A/V data from one device to another over the network. To spare you the details, I will show you where I am stuck. I managed to hook into the capture output delegate, where I extract the audio, convert it to Data, and pass it to a delegate I created.

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    // VIDEO | code excluded for simplicity of this question as this part works

    // AUDIO | only deliver the frames if you are allowed to
    if self.produceAudioFrames == true {
        // process the audio buffer
        let _audioFrame = self.audioFromSampleBuffer(sampleBuffer)
        // process in async
        DispatchQueue.main.async {
            // pass the audio frame to the delegate
            self.delegate?.audioFrame(data: _audioFrame)
        }
    }
}

Helper function that converts the sample buffer (not my code; I can't find the original source, but I know I found it on SO):

func audioFromSampleBuffer(_ sampleBuffer: CMSampleBuffer) -> Data {
    var audioBufferList = AudioBufferList()
    var data = Data()
    var blockBuffer: CMBlockBuffer?

    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
                                                            nil,
                                                            &audioBufferList,
                                                            MemoryLayout<AudioBufferList>.size,
                                                            nil,
                                                            nil,
                                                            0,
                                                            &blockBuffer)

    let buffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers,
                                                   count: Int(audioBufferList.mNumberBuffers))
    for audioBuffer in buffers {
        let frame = audioBuffer.mData?.assumingMemoryBound(to: UInt8.self)
        data.append(frame!, count: Int(audioBuffer.mDataByteSize))
    }
    // dev
    //print("audio buffer count: \(buffers.count)") | this returns 2048
    // give the raw data back to the caller
    return data
}

Note: before sending it over the network, I convert the Data returned from the helper like this: `let payload = Array(data)`

That is the sender side.

On the client side, I receive the payload as a [UInt8], and this is where I am stuck. I have tried multiple approaches, but none of them work.

func processIncomingAudioPayloadFromFrame(_ ID: String, _ _Data: [UInt8]) {
    let readableData = Data(bytes: _Data) // back from the array to the Data we had before sending it over the network
    print(readableData.count) // still 2048 even after receiving from the network, so I am guessing the data is intact

    let x = self.bytesToAudioBuffer(_Data) // option two: convert into an AVAudioPCMBuffer
    print(x) // prints | <AVAudioPCMBuffer@0x600000201e80: 2048/2048 bytes> | I am guessing it works

    // option one | play using AVAudioPlayer
    do {
        let player = try AVAudioPlayer(data: readableData)
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        try AVAudioSession.sharedInstance().setActive(true)
        player.prepareToPlay()
        player.play()
        print(player.volume) // doing this to see if this line is reached
    } catch {
        print(error) // gets error | Error Domain=NSOSStatusErrorDomain Code=1954115647 "(null)"
    }
}

Here is the helper function that converts the [UInt8] into an AVAudioPCMBuffer:

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // format assumption! make this part of your protocol?
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100,
                            channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    // for stereo
    // let dstRight = audioBuffer.floatChannelData![1]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!)
            .bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }
    return audioBuffer
}
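
Since `bytesToAudioBuffer` hard-codes the format (see the "format assumption" comment above), a mismatch with what the capture session actually produced would explain silent or failed playback. A minimal, untested sketch of how the sender could read the real `AudioStreamBasicDescription` from the sample buffer, so the format could be sent alongside the payload instead of assumed (the helper name is hypothetical):

```swift
import AVFoundation

// Hypothetical helper: read the actual stream description from the capture
// sample buffer so the receiver does not have to guess sampleRate/channels/format.
func streamDescription(of sampleBuffer: CMSampleBuffer) -> AudioStreamBasicDescription? {
    guard let formatDesc = CMSampleBufferGetFormatDescription(sampleBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(formatDesc) else {
        return nil
    }
    // asbd.pointee.mSampleRate, .mChannelsPerFrame, .mFormatID, etc.
    return asbd.pointee
}
```

The receiver could then build its `AVAudioFormat` from these fields rather than assuming 44100 Hz mono Float32.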

Questions:

  1. Is it possible to play audio directly from a [UInt8]?
  2. How do I play the AVAudioPCMBuffer payload using AVAudioEngine? Is that possible?
  3. How do I play the audio on the client side?
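
For reference, a minimal sketch of what question 2 is asking about, scheduling an `AVAudioPCMBuffer` on an `AVAudioPlayerNode`. This is untested and assumes the buffer's format matches the format the engine is connected with (here non-interleaved mono Float32 at 44100 Hz, mirroring the assumption in `bytesToAudioBuffer`):

```swift
import AVFoundation

// Sketch: continuous PCM playback via AVAudioEngine.
class AudioStreamPlayer {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    // Format assumption, same caveat as bytesToAudioBuffer above.
    private let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                       sampleRate: 44100,
                                       channels: 1,
                                       interleaved: false)

    init() throws {
        engine.attach(playerNode)
        // Route the player into the main mixer using the stream's format.
        engine.connect(playerNode, to: engine.mainMixerNode, format: format)
        try engine.start()
        playerNode.play()
    }

    // Call with each AVAudioPCMBuffer reconstructed from a network payload;
    // scheduled buffers play back-to-back in arrival order.
    func schedule(_ buffer: AVAudioPCMBuffer) {
        playerNode.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```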

Footnote: I hope the comments in the code give you some hints about the output. Also, I do not want to save to a file or anything file-related; I only want to amplify the microphone for live listening. I have no interest in saving the data.

Best Answer

I have used the same code to play an audio file while on a carrier call.

Please try it and let me know the result:

Objective-C code:

NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:self.bgAudioFileName
                                                          ofType:@"mp3"];

NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:soundFilePath];

myAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];

myAudioPlayer.numberOfLoops = -1;

NSError *sessionError = nil;

// Change the default output audio route
AVAudioSession *audioSession = [AVAudioSession sharedInstance];

// get your audio session somehow
[audioSession setCategory:AVAudioSessionCategoryMultiRoute error:&sessionError];

BOOL success = [audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone
                                               error:&sessionError];

[audioSession setActive:YES error:&sessionError];
if (!success) {
    NSLog(@"error doing outputaudioportoverride - %@", [sessionError localizedDescription]);
}
[myAudioPlayer setVolume:1.0f];
[myAudioPlayer play];

Swift version:

let soundFilePath = Bundle.main.path(forResource: bgAudioFileName, ofType: "mp3")
let fileURL = URL(fileURLWithPath: soundFilePath ?? "")
myAudioPlayer = try? AVAudioPlayer(contentsOf: fileURL)
myAudioPlayer.numberOfLoops = -1

// Change the default output audio route
let audioSession = AVAudioSession.sharedInstance()
// get your audio session somehow
try? audioSession.setCategory(AVAudioSessionCategoryMultiRoute)
do {
    try audioSession.overrideOutputAudioPort(.none)
    try audioSession.setActive(true)
} catch {
    print("error doing outputaudioportoverride - \(error.localizedDescription)")
}
myAudioPlayer.volume = 1.0
myAudioPlayer.play()

Regarding "ios - Playing audio from the microphone", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/48164407/
