
ios - Audio in merge not working


I used this as a reference: Concatenate two audio files in Swift and play them

I'm trying to build an alarm clock. Because of Apple's policy restrictions, you can't execute code in the background for more than about 10 minutes. If I instead opt out of background modes and keep my app in the foreground, the sound pauses as soon as the user taps the Home button, and I need the sound to keep playing until the user takes some action. UILocalNotifications don't work either, because of the silent/Do Not Disturb switches. So my plan is to play blank audio until the alarm time and then play the actual alarm sound.
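A minimal sketch of that "blank audio until the alarm" idea, written in the same Swift 1.x style as the question's code. The silence.m4a resource and the SilenceKeeper class are illustrative assumptions, not part of the original project:

import AVFoundation

class SilenceKeeper {
    // Held as an instance variable so the player is not deallocated
    // while the silent track is looping.
    var silencePlayer: AVAudioPlayer?

    func startSilence() {
        // The playback category keeps the audio session alive in the
        // background (the "audio" background mode must be enabled in Info.plist).
        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: nil)
        AVAudioSession.sharedInstance().setActive(true, error: nil)

        // "silence.m4a" is a hypothetical bundled file of recorded silence.
        if let url = NSBundle.mainBundle().URLForResource("silence", withExtension: "m4a") {
            silencePlayer = AVAudioPlayer(contentsOfURL: url, error: nil)
            silencePlayer?.numberOfLoops = -1 // loop indefinitely until the alarm fires
            silencePlayer?.play()
        }
    }
}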

So I run it with the following code.

This is what happens when the Save button is clicked (the user has picked the alarm time):

let seconds = Double(comp.second)
let notification = UILocalNotification()
notification.alertBody = "testBody"
notification.fireDate = dueDatePicker.date
notification.alertTitle = "testTitle"
println("seconds:\(seconds)")

var results: NSArray = managedObjectContext!.executeFetchRequest(request, error: &error)!

let audioURL1 = NSBundle.mainBundle().URLForResource("alarm", withExtension: "m4a")!
let audioURL2 = NSBundle.mainBundle().URLForResource("music", withExtension: "mp3")!
println(audioURL1)
println(audioURL2)
println(task.uuid)

mergeAudio2(audioURL1, audioURL2: audioURL2, time: seconds, uuid: task.uuid)

And here is the audio-merging part:

func mergeAudio2(audioURL: NSURL, audioURL2: NSURL, time: Double, uuid: String) {
    var error: NSError?

    var ok1 = false
    var ok2 = false

    //var documentsDirectory:String = paths[0] as! String

    // Create an AVMutableComposition object. This object will hold our multiple AVMutableCompositionTracks.
    var composition = AVMutableComposition()
    var compositionAudioTrack1: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
    var compositionAudioTrack2: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())

    // Create a new file to receive the data
    var documentDirectoryURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first! as! NSURL
    var fileDestinationUrl = documentDirectoryURL.URLByAppendingPathComponent("resultmerge.wav")
    println(fileDestinationUrl)

    var file = "resultmerge.m4a"
    var dirs: [String] = (NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.AllDomainsMask, true) as? [String])!
    var dir = dirs[0] // documents directory
    var path = dir.stringByAppendingPathComponent(file)
    var pathURLarray: Array = (NSURL(fileURLWithPath: path)!).pathComponents!
    var pathURL: String = ""
    var final = ""
    var debut = ""

    for i in 1...(pathURLarray.count-1) {
        if i == pathURLarray.count-1 {
            final = ""
        } else {
            final = "/"
        }
        if i == 1 {
            debut = "/"
        } else {
            debut = ""
        }
        pathURL = debut + pathURL + (pathURLarray[i] as! String) + final
    }

    var checkValidation = NSFileManager.defaultManager()
    if checkValidation.fileExistsAtPath(pathURL) {
        println("file exist")
        if NSFileManager.defaultManager().removeItemAtURL(fileDestinationUrl, error: nil) {
            println("delete")
        }
    } else {
        println("no file")
    }

    var url1 = audioURL
    var url2 = audioURL2

    var avAsset1 = AVURLAsset(URL: url1, options: nil)
    var avAsset2 = AVURLAsset(URL: url2, options: nil)

    var tracks1 = avAsset1.tracksWithMediaType(AVMediaTypeAudio)
    var tracks2 = avAsset2.tracksWithMediaType(AVMediaTypeAudio)

    var assetTrack1: AVAssetTrack = tracks1[0] as! AVAssetTrack
    var assetTrack2: AVAssetTrack = tracks2[0] as! AVAssetTrack

    var duration1: CMTime = assetTrack1.timeRange.duration
    var duration2: CMTime = assetTrack2.timeRange.duration

    var timeRange1 = CMTimeRangeMake(kCMTimeZero, duration1)
    var timeRange2 = CMTimeRangeMake(duration1, duration2)

    ok1 = compositionAudioTrack1.insertTimeRange(timeRange1, ofTrack: assetTrack1, atTime: kCMTimeZero, error: nil)
    if ok1 {
        ok2 = compositionAudioTrack2.insertTimeRange(timeRange2, ofTrack: assetTrack2, atTime: duration1, error: nil)
        if ok2 {
            println("success")
        }
    }

    // AVAssetExportPresetPassthrough => concatenation

    var assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
    assetExport.outputFileType = AVFileTypeAppleM4A
    assetExport.outputURL = fileDestinationUrl
    assetExport.exportAsynchronouslyWithCompletionHandler({
        switch assetExport.status {
        case AVAssetExportSessionStatus.Failed:
            println("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            println("cancelled \(assetExport.error)")
        default:
            println("complete")
            var audioPlayer = AVAudioPlayer()
            AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: nil)
            //audioPlayer.delegate = self

            audioPlayer = AVAudioPlayer(contentsOfURL: fileDestinationUrl, error: nil)
            println(fileDestinationUrl)
            audioPlayer.prepareToPlay()
            audioPlayer.play()
        }
    })
}

The app loads and everything works: I get the printlns of the file URLs, and I get the "no file", "success", and "complete" printouts, which means all of it executed. But no sound plays.

Best Answer

But no sound plays

Because audioPlayer is a local variable, it goes out of scope and is deallocated before it ever has a chance to play anything. Promote it to an instance variable so that it persists.
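A minimal sketch of that fix, keeping the question's Swift 1.x API; the class name and the playMergedFile helper are illustrative assumptions, not part of the original answer:

import UIKit
import AVFoundation

class AlarmViewController: UIViewController {
    // Promoted to an instance variable: the player now outlives the
    // method that creates it, so playback is no longer cut short.
    var audioPlayer: AVAudioPlayer?

    func playMergedFile(fileDestinationUrl: NSURL) {
        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: nil)
        audioPlayer = AVAudioPlayer(contentsOfURL: fileDestinationUrl, error: nil)
        audioPlayer?.prepareToPlay()
        audioPlayer?.play()
    }
}

Calling self.playMergedFile(fileDestinationUrl) from the export session's completion handler then plays the merged file, because the strong audioPlayer property keeps the AVAudioPlayer alive after the handler returns.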

Regarding ios - Audio in merge not working, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32193929/
