I'm trying to record audio on iOS and stream the PCM data. I'm using AVAudioEngine for this, and I believe everything is set up correctly.
When I receive PCM data, I convert it to a base64 string and send it to a cloud service for speech-to-text.
But it's not working.
My setup is fixed as follows: sample rate 16000 Hz, 1 channel, chunk/buffer size 1000, audio format ENCODING_PCM_16BIT.
Base64 strings are generated, but the backend can't decode my base64 into audio text. My question is: is there an error in my code?
And how can I verify that the buffer -> base64 string conversion is correct?
func setUp() {
    let inputNode = self.engine.inputNode
    inputNode.removeTap(onBus: 0)
    // Caution: outputFormat(forBus:) returns the node's native format
    // (Float32, typically 44.1/48 kHz) — NOT 16 kHz Int16. The tap must use
    // this native format, so the data needs converting before upload.
    inputFormat = inputNode.outputFormat(forBus: outputBus)
    inputNode.installTap(onBus: inputBus, bufferSize: bufferSize, format: inputFormat) { [weak self] buffer, _ in
        print("[PP][PluginAudioEngine] installTap")
        self?.processBufferData(buffer)
    }
    engine.prepare()
    // Note: engine.start() (which can throw) must also be called somewhere
    // before any buffers will arrive; prepare() alone does not start the engine.
}
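The most likely culprit is the format mismatch: the tap delivers Float32 buffers at the hardware sample rate (usually 44.1 or 48 kHz), while the backend expects 16 kHz mono 16-bit PCM. A sketch of downsampling with `AVAudioConverter` — `targetFormat` and `convertTo16k(_:using:)` are illustrative names, not part of the original code, and the converter would be created once (e.g. in `setUp()`) from the tap's input format:

```swift
import AVFoundation

// Target: 16 kHz, mono, interleaved 16-bit PCM — what the backend expects.
let targetFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                 sampleRate: 16_000,
                                 channels: 1,
                                 interleaved: true)!

// e.g. in setUp(): let converter = AVAudioConverter(from: inputFormat, to: targetFormat)!

func convertTo16k(_ buffer: AVAudioPCMBuffer,
                  using converter: AVAudioConverter) -> AVAudioPCMBuffer? {
    // Size the output for the sample-rate ratio (e.g. 48 kHz -> 16 kHz is 1/3).
    let ratio = targetFormat.sampleRate / buffer.format.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
    guard let out = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                     frameCapacity: capacity) else { return nil }

    // The input block may be called repeatedly; hand the buffer over once,
    // then report that no more data is available for now.
    var consumed = false
    var error: NSError?
    let status = converter.convert(to: out, error: &error) { _, inputStatus in
        if consumed {
            inputStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        inputStatus.pointee = .haveData
        return buffer
    }
    return (error == nil && status != .error) ? out : nil
}
```

The resulting `Int16` buffer can then be packed into `Data` and base64-encoded as before; without this step the uploaded bytes describe a completely different signal than the backend assumes.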
// MARK: - PCM Processing, way 2
private func processBufferData(_ buffer: AVAudioPCMBuffer) {
    processingQueue.async {
        guard let pcmData = self.convertToPCM16(buffer: buffer) else { return }
        // Split into fixed-size chunks; the final chunk may be shorter.
        // Note: bufferSize is a frame count for the tap, but is reused here
        // as a byte count — consider a separate constant for chunk bytes.
        var offset = 0
        while offset < pcmData.count {
            let chunkEnd = min(offset + Int(self.bufferSize), pcmData.count)
            let chunk = pcmData.subdata(in: offset..<chunkEnd)
            // Encode on the processing queue; only publishing needs the main queue.
            let encodedString = chunk.base64EncodedString()
            print("encodedString:\n \(encodedString) \n") /// Test log
            DispatchQueue.main.async {
                self.audioRecordStream.value = encodedString
            }
            offset = chunkEnd
        }
    }
}
// MARK: - PCM Conversion
private func convertToPCM16(buffer: AVAudioPCMBuffer) -> Data? {
    guard let floatChannelData = buffer.floatChannelData else { return nil }
    let channel = floatChannelData[0] // first (mono) channel
    let frameLength = Int(buffer.frameLength)
    var pcm16Data = Data(capacity: frameLength * 2) // 16-bit PCM (2 bytes per sample)
    for frame in 0..<frameLength {
        // Clamp to [-1, 1] first: Int16(sample * 32767) traps at runtime if the
        // float is even slightly out of range.
        let sample = max(-1.0, min(1.0, channel[frame]))
        let int16Sample = Int16(sample * Float(Int16.max)) // Convert Float32 → Int16
        // Emit little-endian bytes explicitly, as LINEAR16-style decoders expect.
        withUnsafeBytes(of: int16Sample.littleEndian) { pcm16Data.append(contentsOf: $0) }
    }
    // Note: this changes the encoding only — the samples are still at the
    // buffer's original (hardware) sample rate, not 16 kHz.
    return pcm16Data
}
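To verify the buffer -> base64 path independently of the audio engine, you can round-trip a known set of samples: pack them as little-endian bytes, encode, decode, and compare. A minimal sketch using only Foundation (the variable names are illustrative):

```swift
import Foundation

// Known test signal, including the extreme Int16 values.
let samples: [Int16] = [0, 1000, -1000, Int16.max, Int16.min]

// Pack as little-endian 16-bit PCM, matching what LINEAR16 decoders expect.
var data = Data(capacity: samples.count * 2)
for s in samples {
    withUnsafeBytes(of: s.littleEndian) { data.append(contentsOf: $0) }
}

let base64 = data.base64EncodedString()

// Decode and unpack the bytes back into Int16 samples.
let decoded = Data(base64Encoded: base64)!
let roundTripped: [Int16] = stride(from: 0, to: decoded.count, by: 2).map { i in
    let lo = UInt16(decoded[i])
    let hi = UInt16(decoded[i + 1])
    return Int16(bitPattern: lo | (hi << 8))
}

assert(roundTripped == samples, "base64 round-trip lost data")
print(roundTripped)
```

If this round-trip passes but the backend still fails, the base64 step is fine and the problem is almost certainly the audio format (sample rate, channel count, or byte order) rather than the encoding.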