I'm recording some audio data in my JavaScript client using RecordRTC. I want to send this audio data to my Spring RestController via WebSockets.

After recording I have a blob object in my JavaScript client: Blob {size: 65859, type: "audio/webm"}. I tried to convert this blob to an ArrayBuffer object using FileReader, which looks like this: ArrayBuffer {} byteLength: 65859.
My JavaScript code where I send the ArrayBuffer looks like this:

    const reader = new FileReader();
    reader.addEventListener('loadend', () => {
        console.log('readerResult', reader.result);
        this.stompClient.send("/app/hello", {}, reader.result);
    });

    this.recorder.stopRecording(() => {
        const blob = this.recorder.getBlob();
        reader.readAsArrayBuffer(blob);
        console.log("blob", blob);
    });
My WebSocket endpoint in my Spring Boot application looks like this:

    @MessageMapping("/hello")
    public void stream(byte[] input) throws Exception {
        System.out.println("incoming message ...");
        System.out.println(input);
        System.out.println(input.length);
    }
And this is the console output:

    incoming message ...
    [B@6142acb4
    20
The byte array on my server contains only 20 bytes; it looks like only metadata was transferred.

How can I transfer the recorded Blob to my Spring server and create a (webm) file from it? Do I have to change the parameter of my endpoint?
Comments:

- Where do you call .readAsArrayBuffer()? – guest271314
- @guest271314 In my stopRecording() method, when I get the blob from the recorder. I added the snippet to my post. – Peter Lustig
- What is the expected result of System.out.println(input)? – guest271314
1 Answer

I would suggest you create a base64 string for your audio blob using the code below:
    var reader = new FileReader();
    var base64data;
    reader.onloadend = function() {
        // reader.result is a data URL: "data:audio/webm;base64,..."
        base64data = reader.result;
        console.log(base64data);
    };
    reader.readAsDataURL(blob);
You will get a base64 string like this:
data:audio/webm;base64,T2dnUwACAAAAAAAAAAAyzN3N....
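Since this base64 string is plain text, it can be sent through the question's existing STOMP client as-is, for example stompClient.send("/app/hello", {}, base64data); (inside the onloadend callback, once base64data has been set).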
Now, in your backend, change your stream method to this:

    import java.io.FileOutputStream;
    import java.util.Base64;
    import java.util.Base64.Decoder;

    @MessageMapping("/hello")
    public void stream(String base64Audio) throws Exception {
        System.out.println("incoming message ...");
        // Strip the "data:audio/webm;base64," prefix, then decode the payload
        Decoder decoder = Base64.getDecoder();
        byte[] decodedByte = decoder.decode(base64Audio.split(",")[1]);
        // Write the decoded bytes to a webm file
        FileOutputStream fos = new FileOutputStream("MyAudio.webm");
        fos.write(decodedByte);
        fos.close();
    }