I am trying to write code in C# using the NAudio library. The goal is to record audio from the microphone and send it to a cloud API. The API method receives a MemoryStream. I can successfully make it work when loading a recorded file into a new MemoryStream:
var recResult = speechToText.Recognize(
    audio: new MemoryStream(File.ReadAllBytes("audio-file.wav")),
    model: "pt-BR_Multimedia",
    contentType: "audio/wav");
What I am trying to do now is avoid saving the WAV file and send the MemoryStream directly from the recording process.
As a (very) beginner C# dev, I am using the great samples from the NAudio library, and as an attempt I tried to write the recorded bytes directly to a MemoryStream as below (writer is a WaveFileWriter object; memStream is a MemoryStream object):
void OnDataAvailable(object sender, WaveInEventArgs e)
{
    if (InvokeRequired)
    {
        //Debug.WriteLine("Data Available");
        BeginInvoke(new EventHandler<WaveInEventArgs>(OnDataAvailable), sender, e);
    }
    else
    {
        // this is my new MemoryStream object
        memStream.Write(e.Buffer, 0, e.BytesRecorded);
        // this is the regular, working writer object
        writer.Write(e.Buffer, 0, e.BytesRecorded);
        int secondsRecorded = (int)(writer.Length / writer.WaveFormat.AverageBytesPerSecond);
        if (secondsRecorded >= 5)
        {
            StopRecording();
        }
        else
        {
            progressBar1.Value = secondsRecorded;
        }
    }
}
While this compiles and runs fine, the MemoryStream isn't accepted by the API, which results in a runtime error (bad request).
I can only guess that my MemoryStream doesn't really contain WAVE-formatted data.
Any suggestion on how I can make sure my MemoryStream represents valid WAV audio content?
Thank you, Márcio
1 Answer

The answer/solution to this question is to use a WaveFileWriter and write the bytes directly to a MemoryStream rather than to a file. The WaveFileWriter wraps the raw byte[] data as a formatted WAVE stream. Change the sample rate and channel count as needed:
writerMem = new WaveFileWriter(memStream, new WaveFormat(44100, 16, 1));
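One thing worth double-checking here: the WaveFormat passed to the writer should match what the capture device actually delivers, otherwise the WAV header will describe the data incorrectly. If you are recording with NAudio's WaveInEvent (as in the library samples), you can reuse its format directly; waveIn below is the assumed capture object, not a name from the original post:

// Reuse the capture device's format so the WAV header matches the recorded data.
// 'waveIn' is assumed to be the NAudio WaveInEvent (or WaveIn) used for recording.
writerMem = new WaveFileWriter(memStream, waveIn.WaveFormat);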
In OnDataAvailable, write the recorded bytes to the WaveFileWriter (not to the MemoryStream directly):
writerMem.Write(e.Buffer, 0, e.BytesRecorded);
Call the API, passing the MemoryStream:
var recResult = speechToText.Recognize(
    audio: memStream,
    model: "pt-BR_Multimedia",
    contentType: "audio/wav");
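One detail that may also matter, not stated in the snippets above: WaveFileWriter buffers its output, and after recording the MemoryStream position sits at the end of the written data. It can therefore help to flush the writer and rewind the stream before the call. A minimal sketch, assuming writerMem and memStream are the objects created above:

// Push any buffered audio into the MemoryStream; in current NAudio versions
// Flush() also rewrites the WAV header with the correct chunk sizes.
writerMem.Flush();

// Rewind so the API reads the stream from the start of the RIFF/WAVE header.
memStream.Position = 0;

var recResult = speechToText.Recognize(
    audio: memStream,
    model: "pt-BR_Multimedia",
    contentType: "audio/wav");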
This way, the API accepts the MemoryStream and identifies the WAVE data within it.
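For completeness, here is a rough end-to-end sketch of how the pieces could fit together with NAudio's WaveInEvent outside a WinForms handler. speechToText is the Watson service object from the question; everything else (the method name, the five-second cutoff, the 44.1 kHz mono format) is illustrative, not taken from the original post:

using System;
using System.IO;
using System.Threading;
using NAudio.Wave;

// Records roughly five seconds from the default microphone into an in-memory
// WAV stream. 'speechToText' is assumed to be an already-configured
// IBM Watson SpeechToTextService instance, as in the question.
MemoryStream RecordFiveSecondsToWavStream()
{
    var memStream = new MemoryStream();
    var waveIn = new WaveInEvent { WaveFormat = new WaveFormat(44100, 16, 1) };
    var writerMem = new WaveFileWriter(memStream, waveIn.WaveFormat);
    var stopped = new ManualResetEventSlim(false);

    waveIn.DataAvailable += (s, e) =>
    {
        // Write only through the WaveFileWriter so the WAV header stays consistent.
        writerMem.Write(e.Buffer, 0, e.BytesRecorded);
        if (writerMem.Length >= writerMem.WaveFormat.AverageBytesPerSecond * 5)
            waveIn.StopRecording();
    };

    waveIn.RecordingStopped += (s, e) =>
    {
        writerMem.Flush();        // flush buffered audio and update the WAV header
        memStream.Position = 0;   // rewind before handing the stream to the API
        waveIn.Dispose();
        stopped.Set();
    };

    waveIn.StartRecording();
    stopped.Wait();

    // writerMem is intentionally not disposed here, because disposing a
    // WaveFileWriter also closes the underlying MemoryStream.
    return memStream;
}

var recResult = speechToText.Recognize(
    audio: RecordFiveSecondsToWavStream(),
    model: "pt-BR_Multimedia",
    contentType: "audio/wav");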
Comment from Visual Vincent: Use a WaveFileWriter and write to that only, not the stream: writer = new WaveFileWriter(memStream, new WaveFormat(44100, 16, 1)); Change the sampling rate and number of channels as needed.