ios - How to render and export audio in AudioKit?

Tags: ios swift audio export audiokit

I'm building a music-related project using the AudioKit framework. AudioKit is very useful, but I now need to export the sound created in the app.

I'm using AKSequencer to play notes in sequence, and I'm also applying some filters to the sound, such as reverb.

I found some example code here for exporting audio from another audio file, but that's not what I need. What I actually need is to render the note sequence, filters included, and export it for the user.

My code:

class Sequencer {

    let oscBank = AKOscillatorBank(waveform: AKTable(AKTableType.positiveReverseSawtooth))
    let sequencer = AKSequencer()

    init() {
        setup()
    }

    func setup() {

        // I instantiate a MIDI node with the oscillator bank and use it as the track's output.
        // This makes the filters applied to the oscillator get picked up by the track output.
        let midiNode = AKMIDINode(node: oscBank)
        _ = sequencer.newTrack()
        sequencer.tracks[0].setMIDIOutput(midiNode.midiIn)

        generateSequence()

        // Here I'm applying some filter, a reverb in this case.
        let reverb = AKReverb(oscBank)
        reverb.loadFactoryPreset(.plate)

        // AudioKit's output should be the last filter applied. This works for playback.
        AudioKit.output = AKMixer(reverb)
        try? AudioKit.start()
    }

    func play() {
        sequencer.play()
    }

    func stop() {
        sequencer.stop()
    }


    /// Generates some melody (Sweet Child of Mine)
    func generateSequence() {
        for _ in 0..<2 {
            sequencer.tracks[0].add(noteNumber: 62, velocity: 127, position: AKDuration(beats: 0), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 74, velocity: 127, position: AKDuration(beats: 0.5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 69, velocity: 127, position: AKDuration(beats: 1), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 67, velocity: 127, position: AKDuration(beats: 1.5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 79, velocity: 127, position: AKDuration(beats: 2), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 69, velocity: 127, position: AKDuration(beats: 2.5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 78, velocity: 127, position: AKDuration(beats: 3), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 69, velocity: 127, position: AKDuration(beats: 3.5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 62, velocity: 127, position: AKDuration(beats: 4), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 74, velocity: 127, position: AKDuration(beats: 4.5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 69, velocity: 127, position: AKDuration(beats: 5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 67, velocity: 127, position: AKDuration(beats: 5.5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 79, velocity: 127, position: AKDuration(beats: 6), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 69, velocity: 127, position: AKDuration(beats: 6.5), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 78, velocity: 127, position: AKDuration(beats: 7), duration: AKDuration(beats: 0.5))
            sequencer.tracks[0].add(noteNumber: 69, velocity: 127, position: AKDuration(beats: 7.5), duration: AKDuration(beats: 0.5))
        }
    }


    // Here is my problem. I want to take the song and export it as a .wav file, exactly as it was played.
    // This code is just one of my many attempts, and it does not work either.
    func saveFile() {
        guard let auxiliarPlayer = try? AKAudioPlayer(file: AKAudioFile(readFileName: "bass.wav")) else { return }
        guard let outputURL = try? FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false).appendingPathComponent("audio_file.m4a") else { return }
        guard let output = try? AKAudioFile(forWriting: outputURL, settings: auxiliarPlayer.audioFile.fileFormat.settings) else { return }

        try? AudioKit.renderToFile(output, seconds: sequencer.tracks[0].length, prerender: {
            self.sequencer.play()
        })
    }
}

Can anyone help?

Best Answer

As far as I know, offline rendering doesn't work with MIDI due to timing issues. Basically, only things that process audio can be rendered offline, not things that depend on MIDI. I believe this is because the MIDI clock runs at a fixed rate and cannot be virtually "sped up" to match the faster-than-real-time render speed.
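Given that limitation, the usual workaround is to record the live output in real time while the sequencer plays, then export the captured file. Below is a minimal sketch of that approach using AudioKit v4's `AKNodeRecorder` and `AKAudioFile.exportAsynchronously`. The `SequencerRecorder` class and the `mixer` parameter are illustrative names, not from the question; `mixer` stands in for the `AKMixer` that was assigned to `AudioKit.output`.

```swift
import AudioKit

/// Sketch: record the live output of the signal chain while the sequencer
/// plays, instead of trying to render MIDI offline.
class SequencerRecorder {
    let mixer: AKMixer          // the node assigned to AudioKit.output
    var recorder: AKNodeRecorder?

    init(output mixer: AKMixer) {
        self.mixer = mixer
    }

    /// Start tapping the mixer; call this just before sequencer.play().
    func startRecording() throws {
        let file = try AKAudioFile()                  // temporary caf file
        recorder = try AKNodeRecorder(node: mixer, file: file)
        try recorder?.record()
    }

    /// Stop recording (after the sequence finishes) and export to m4a.
    func stopAndExport() {
        recorder?.stop()
        guard let recorded = recorder?.audioFile else { return }
        recorded.exportAsynchronously(name: "song",
                                      baseDir: .documents,
                                      exportFormat: .m4a) { exported, error in
            if let error = error {
                print("Export failed: \(error)")
            } else {
                print("Exported to \(exported?.url.path ?? "?")")
            }
        }
    }
}
```

The trade-off is that recording happens at playback speed (you have to let the whole sequence play through), but the recorded file contains exactly what was heard, reverb and all.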

Regarding ios - How to render and export audio in AudioKit?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52012143/
