I'm trying to stream the content of an HTML5 canvas live, using WebSockets and Node.js.
The content of the HTML5 canvas is just a video.
What I have done so far is: I convert the canvas to a Blob, get the Blob URL, and send that URL to my Node.js server using WebSockets.
I get the blob URL like this:
canvas.toBlob(function(blob) {
url = window.URL.createObjectURL(blob);
});
The blob URLs are generated per video frame (20 frames per second to be exact) and they look something like this:
blob:null/e3e8888e-98da-41aa-a3c0-8fe3f44frt53
I then get that blob URL back from the server via WebSockets so I can use it to draw it onto another canvas for other users to see.
I did search for how to draw onto a canvas from a blob URL, but I couldn't find anything close to what I am trying to do.
So the questions I have are:
Is this the correct way of doing what I am trying to achieve? Any pros and cons would be appreciated.
Is there any other, more efficient way of doing this, or am I on the right path?
Thanks in advance.
EDIT:
I should have mentioned that I cannot use WebRTC in this project and I have to do it all with what I have.
To make it easier for everyone to see where I am right now, this is how I tried to display the blob URLs mentioned above on my canvas using WebSockets:
websocket.onopen = function(event) {
    websocket.onmessage = function(evt) {
        var val = evt.data;
        console.log("new data " + val);
        var canvas2 = document.querySelector('.canvMotion2');
        var ctx2 = canvas2.getContext('2d');
        var img = new Image();
        img.onload = function() {
            ctx2.drawImage(img, 0, 0);
        };
        img.src = val;
    };
    // Listen for socket closes
    websocket.onclose = function(event) {
    };
    websocket.onerror = function(evt) {
    };
};
The issue is that when I run that code in Firefox, the canvas is always empty/blank, yet I see the blob URLs in my console, which makes me think that what I am doing is wrong.
In Google Chrome, I get a "Not allowed to load local resource: blob:" error.
SECOND EDIT:
This is where I am at the moment.
First option:
I tried to send the whole Blobs via WebSockets, and I managed that successfully. However, I couldn't read them back on the client side for some strange reason!
When I looked at my Node.js server's console, I could see something like this for each Blob that I was sending to the server:
<buffer fd67676 hdsjuhsd8 sjhjs....
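For reference, the relay I have in mind on the Node.js side is roughly the sketch below. This is only a sketch, not my exact code: it assumes the ws npm package and a hard-coded port 8080. The point is that the binary frames arrive as Node.js Buffers (hence the <Buffer ...> log above) and are simply forwarded to every other connected client:
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', function(socket) {
    socket.on('message', function(data) {
        // 'data' is a Node.js Buffer when the browser sends a Blob
        // forward it, still as binary, to every other connected client
        wss.clients.forEach(function(client) {
            if (client !== socket && client.readyState === WebSocket.OPEN) {
                client.send(data, { binary: true });
            }
        });
    });
});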
Second option:
So the option above failed, and I thought of something else: turning each canvas frame into a base64 (JPEG) string, sending that to the server via WebSockets, and then displaying/drawing those base64 images onto the canvas on the client side.
I'm sending 24 frames per second to the server.
This worked, BUT the client-side canvas where these base64 images are displayed is very slow, as if it were drawing 1 frame per second, and this is the issue that I have at the moment.
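For context, the sending side of this second option is essentially the following sketch (the 24 fps interval, the 0.5 JPEG quality, and the .canvMotion selector are placeholders, not my exact values):
var sourceCanvas = document.querySelector('.canvMotion');
setInterval(function() {
    // encode the current canvas frame as a JPEG data URL and push it over the socket
    if (websocket.readyState === WebSocket.OPEN) {
        websocket.send(sourceCanvas.toDataURL('image/jpeg', 0.5));
    }
}, 1000 / 24);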
Third option:
I also tried to use the video without a canvas. So, using WebRTC, I got the video stream as a single Blob, but I'm not entirely sure how to use that and send it to the client side so people can see it.
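In case it helps, this is roughly how I was thinking of chunking that stream. It is only a sketch: it assumes what I have is a MediaStream that MediaRecorder accepts, and the mime type and 100 ms timeslice are guesses on my part:
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8' });
// every time a chunk is ready, ship it to the server over the existing socket
recorder.ondataavailable = function(event) {
    if (event.data && event.data.size > 0) {
        websocket.send(event.data);
    }
};
// ask for a chunk roughly every 100 ms instead of one big Blob at the end
recorder.start(100);
On the receiving side these WebM chunks would have to be fed to a MediaSource/SourceBuffer rather than drawn as images, which is a different playback path from the canvas approach above.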
IMPORTANT: the system that I am working on is not a peer-to-peer connection; it's just one-way streaming that I am trying to achieve.
- This is a perfect use case for WebRTC. Have a look at this example: webrtc.github.io/samples/src/content/capture/canvas-pc – Rob
- @Robert, unfortunately I cannot use WebRTC in my project. – william wolly
- @williamwolly Then it would be worthwhile to include a complete question and include your constraints too, before we get answers that are not relevant to your case. – Adrian
- @Adriani6 edited my question. – william wolly
- @PatrickRoberts, what do you mean by having direct access to the drawings on the canvas? It is a live streaming video that is being displayed on the canvas. – william wolly
1 Answer
The most natural way to stream canvas content: WebRTC
OP made it clear that they can't use it, and it may be the case for many, because:
- Browser support is still not that great.
- It implies having a media server running (at least ICE+STUN/TURN, and maybe a gateway if you want to stream to more than one peer).
But still, if you can afford it, all you need then to get a MediaStream from your canvas element is
const canvas_stream = canvas.captureStream(minimumFrameRate);
and then you'd just have to add it to your RTCPeerConnection:
pc.addTrack(stream.getVideoTracks()[0], stream);
The example below will simply display the MediaStream in a <video> element.
let x = 0;
const ctx = canvas.getContext('2d');
draw();
startStream();

function startStream() {
    // grab our MediaStream
    const stream = canvas.captureStream(30);
    // feed the <video>
    vid.srcObject = stream;
    vid.play();
}

function draw() {
    x = (x + 1) % (canvas.width + 50);
    ctx.fillStyle = 'white';
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    ctx.fillStyle = 'red';
    ctx.beginPath();
    ctx.arc(x - 25, 75, 25, 0, Math.PI * 2);
    ctx.fill();
    requestAnimationFrame(draw);
}
video,canvas{border:1px solid}
<canvas id="canvas">75</canvas>
<video id="vid" controls></video>
The most efficient way to stream a live canvas drawing: stream the drawing operations.
Once again, OP said they didn't want this solution because their setup doesn't match, but it might be helpful for many readers:
Instead of sending the result of the canvas, simply send the drawing commands to your peers, which will then execute them on their side (a small sketch of this follows the caveats below).
But this approach has its own caveats:
- You will have to write your own encoder/decoder to pass the commands.
- Some cases might get hard to share (e.g. external media would have to be shared and preloaded the same way on all peers, the worst case being drawing another canvas, whose own drawing process would also have to be shared).
- You may want to avoid having intensive image processing (e.g. ImageData manipulation) done on all peers.
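To make that concrete, here is a minimal sketch of what forwarding drawing commands could look like (the message shape and the ws/ctx names are assumptions for illustration, not a fixed protocol):
// emitter: describe each operation instead of sending pixels
function sendCommand(op, args) {
    ws.send(JSON.stringify({ op: op, args: args }));
}
sendCommand('fillRect', [0, 0, 150, 75]);

// consumer: replay each received operation on the local 2D context
ws.onmessage = function(evt) {
    const cmd = JSON.parse(evt.data);
    if (typeof ctx[cmd.op] === 'function') {
        ctx[cmd.op].apply(ctx, cmd.args);
    }
};
Note that property assignments such as ctx.fillStyle are not method calls, so they would need their own message type; that is part of the encoder/decoder work mentioned above.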
So a third, definitely less performant, way to do it is, like OP tried to do:
Upload frames at a regular interval.
I won't go into details here, but keep in mind that you are sending standalone image files, and hence a whole lot more data than if it had been encoded as video.
Instead, I'll focus on why OP's code didn't work.
First, it may be good to have a small reminder of what a Blob is (the thing that is provided in the callback of canvas.toBlob(callback)).
A Blob is a special JavaScript object which represents binary data, generally stored either in the browser's memory, or at least on the user's disk, accessible by the browser.
This binary data is not directly available to JavaScript though. To be able to access it, we need to either read this Blob (through a FileReader or a Response object), or to create a BlobURI, which is a fake URI allowing most APIs to point at the binary data just as if it were stored on a real server, even though the binary data is still just in the browser's allocated memory.
But this BlobURI, being just a fake, temporary, and domain-restricted path to the browser's memory, cannot be shared with any other cross-domain document or application, and even less with another computer.
All this to say that what should have been sent through the WebSocket are the Blobs directly, and not the BlobURIs.
You'd create the BlobURIs only on the consumers' side, so that they can load these images from the Blob's binary data that is now in their allocated memory.
Emitter side:
canvas.toBlob(blob=>ws.send(blob));
Consumer side:
ws.onmessage = function(evt) {
    // evt.data is the Blob itself; browsers deliver binary WebSocket messages as Blobs by default
    const blob = evt.data;
    const url = URL.createObjectURL(blob);
    // revoke the URL once the frame has loaded, to avoid leaking memory at 20+ fps
    img.onload = function() {
        URL.revokeObjectURL(url);
    };
    img.src = url;
};
But actually, to even better answer OP's problem, a final solution, which is probably the best in this scenario,