For context, I’ve been following along with Apple’s documentation at https://developer.apple.com/documentation/avfaudio/audio_engine/performing_offline_audio_processing?changes=__6&language=objc. You can grab the code there to reproduce this issue.
Basically, I’m trying to convert a .wav file to another .wav file using AVAudioEngine.
In the example, I’m using this instead of Apple’s code, just reading in a .wav file:
let sourceFile: AVAudioFile
let format: AVAudioFormat
do {
    let sourceFileURL = Bundle.main.url(forResource: "totransform", withExtension: "wav")!
    sourceFile = try AVAudioFile(forReading: sourceFileURL)
    format = sourceFile.processingFormat
} catch {
    fatalError("Unable to load the source audio file: \(error.localizedDescription).")
}
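For context, the rest of my setup follows Apple’s example. Roughly, it looks like this (same reverb effect and 4096-frame maximum as in the sample; I haven’t changed anything meaningful here):

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()

engine.attach(player)
engine.attach(reverb)

// Configure the effect and wire up the graph, as in Apple's sample.
reverb.loadFactoryPreset(.mediumHall)
reverb.wetDryMix = 50
engine.connect(player, to: reverb, format: format)
engine.connect(reverb, to: engine.mainMixerNode, format: format)

// Schedule the source file for playback.
player.scheduleFile(sourceFile, at: nil)

do {
    // Enable offline manual rendering mode with the source file's processing format.
    let maxFrames: AVAudioFrameCount = 4096
    try engine.enableManualRenderingMode(.offline, format: format,
                                         maximumFrameCount: maxFrames)
} catch {
    fatalError("Enabling manual rendering mode failed: \(error).")
}

do {
    try engine.start()
    player.play()
} catch {
    fatalError("Unable to start audio engine: \(error).")
}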
And then, when I try to convert it, I’m using this instead of Apple’s code (trying to write to a .wav file). This is where I’m running into issues:
let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!

let outputFile: AVAudioFile
do {
    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let outputURL = documentsURL.appendingPathComponent("output.wav")
    outputFile = try AVAudioFile(forWriting: outputURL, settings: sourceFile.fileFormat.settings)
} catch {
    fatalError("Unable to open output audio file: \(error).")
}
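The render loop itself is basically the one from Apple’s example, roughly:

while engine.manualRenderingSampleTime < sourceFile.length {
    do {
        let frameCount = sourceFile.length - engine.manualRenderingSampleTime
        let framesToRender = min(AVAudioFrameCount(frameCount), buffer.frameCapacity)
        let status = try engine.renderOffline(framesToRender, to: buffer)

        switch status {
        case .success:
            // The rendered data is valid; append it to the output file.
            try outputFile.write(from: buffer)
        case .insufficientDataFromInputNode:
            // Only applies when the engine pulls from an input node; not the case here.
            break
        case .cannotDoInCurrentContext:
            // The engine couldn't render right now; try again on the next pass.
            break
        case .error:
            fatalError("The manual rendering failed.")
        @unknown default:
            break
        }
    } catch {
        fatalError("The manual rendering failed: \(error).")
    }
}

player.stop()
engine.stop()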
Now, when the output is “output.wav”, the file can’t be played. However, if I use “output.caf” instead, everything works fine.
My question is: how can I do offline rendering like in the Apple example and have the output file be a “.wav”?
I’ve tried converting the buffers between formats, but to no avail. I think that’s the path I need to take, but I’m not sure exactly how to do it.
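For reference, my conversion attempt looked roughly like this (a rough sketch from memory; the converter setup is mine, not from Apple’s example):

// Attempt: convert each rendered buffer back to the source file's
// processing format before writing it to the output file.
let converter = AVAudioConverter(from: engine.manualRenderingFormat,
                                 to: sourceFile.processingFormat)!
let convertedBuffer = AVAudioPCMBuffer(pcmFormat: sourceFile.processingFormat,
                                       frameCapacity: buffer.frameCapacity)!

do {
    // convert(to:from:) only handles same-sample-rate conversions; a rate change
    // would need convert(to:error:withInputFrom:) instead.
    try converter.convert(to: convertedBuffer, from: buffer)
    try outputFile.write(from: convertedBuffer)
} catch {
    print("Converting/writing the buffer failed: \(error).")
}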