Stream live video using HTML5 WebSockets?

The essence of the task is the following: send live webcam video in real time to the other participants of a conference.
navigator.getUserMedia = navigator.getUserMedia ||
    navigator.webkitGetUserMedia ||
    navigator.mozGetUserMedia ||
    navigator.msGetUserMedia;

var video = document.querySelector('video');

// Basic error handler for getUserMedia.
function errorCallback(err) {
  console.error('getUserMedia error:', err);
}

if (navigator.getUserMedia) {
  navigator.getUserMedia({audio: true, video: true}, function (stream) {
    // Show the local webcam stream in the <video> element.
    video.src = window.URL.createObjectURL(stream);
  }, errorCallback);
} else {
  video.src = 'somevideo.webm'; // fallback
}

The plan is to first send the data to a Java server over a WebSocket, and then from the server to all users.
So the question is: how do I read the bytes of the stream so that I can keep sending them to the server? I tried FileReader, but its readAs* methods only accept Blob objects as input.
Any ideas? Maybe other implementation approaches?
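A minimal sketch of one way to slice the captured stream into sendable chunks, assuming the browser supports the MediaRecorder API (the ws:// URL and the 1000 ms timeslice below are just placeholders):

var socket = new WebSocket('ws://example.com/stream'); // placeholder endpoint

navigator.getUserMedia({audio: true, video: true}, function (stream) {
  // MediaRecorder slices the live stream into Blob chunks.
  var recorder = new MediaRecorder(stream);

  recorder.ondataavailable = function (event) {
    if (event.data && event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(event.data); // WebSocket.send() accepts a Blob directly
    }
  };

  recorder.start(1000); // emit a chunk roughly every second
}, function (err) {
  console.error('getUserMedia error:', err);
});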
Thank you
September 19th 19 at 13:13
2 answers
September 19th 19 at 13:15
Solution
This can be implemented using WebRTC.
That said, for now Flash is still the dominant technology.
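A minimal sketch of the WebRTC direction, assuming the standard RTCPeerConnection API and a signalling channel of your own (the ws:// signalling endpoint below is a placeholder, and a real conference needs one peer connection per participant):

var signalling = new WebSocket('ws://example.com/signalling'); // placeholder
var pc = new RTCPeerConnection({iceServers: [{urls: 'stun:stun.l.google.com:19302'}]});

// ICE candidates are exchanged with the other peer over the signalling channel.
pc.onicecandidate = function (event) {
  if (event.candidate) {
    signalling.send(JSON.stringify({candidate: event.candidate}));
  }
};

navigator.mediaDevices.getUserMedia({audio: true, video: true})
  .then(function (stream) {
    // Attach the local webcam/microphone tracks to the peer connection.
    stream.getTracks().forEach(function (track) { pc.addTrack(track, stream); });
    return pc.createOffer();
  })
  .then(function (offer) { return pc.setLocalDescription(offer); })
  .then(function () {
    signalling.send(JSON.stringify({sdp: pc.localDescription}));
  });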
September 19th 19 at 13:17
It's not entirely clear why you want to do this over a WebSocket; just think about the volume of data you would be sending through it.

You could look at a solution like https://github.com/phoboslab/jsmpeg, but it requires Node.js and ffmpeg/VLC.
Thanks for the reply. With WebSockets you can implement a real-time stream and manage all the users in a session. Traffic can be optimized by sending compressed chunks of, say, 256, 512 or 1024 KB/s, so the load isn't that high. It could also be done over HTTP using chunked transfer, but the HTTP headers add overhead (plus some other nuances) and traffic would be higher. The server side doesn't matter, Node.js or Java; I need to implement the client in HTML5 (for cross-browser / cross-platform support). Do you have practical experience suggesting it's better not to implement all of this over WebSockets, or a specific reason? - alexandrine.Gutmann40 commented on September 19th 19 at 13:20
I'm only assuming, based on what goes on in the IT industry and on the fact that HTML5 video is highly browser-dependent. If you can encode the video in Java without ffmpeg, that's a plus, of course, but it won't solve the problem of actually displaying the video on the client. The proposed solution uses an MPEG-1 decoder with canvas and works perfectly. - conor.Hilpert90 commented on September 19th 19 at 13:23
In my opinion you're overcomplicating the problem. I just need a way to read / get data from the stream. The code I posted in the question captures video from the webcam and shows it in the element, so why would I encode/decode anything on the server? The server isn't interested in the video contents; it simply receives data from one client and sends it to the others. I just can't find documentation on how to read bytes / a dataURL from the stream for further processing. - alexandrine.Gutmann40 commented on September 19th 19 at 13:26
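A minimal sketch of such a pass-through relay, using Node.js and the ws package purely for illustration (the question mentions Java on the server; the idea is the same there):

var WebSocket = require('ws');
var wss = new WebSocket.Server({port: 8080});

wss.on('connection', function (client) {
  client.on('message', function (data) {
    // Forward whatever one client sends to every other connected client,
    // without inspecting the video contents.
    wss.clients.forEach(function (other) {
      if (other !== client && other.readyState === WebSocket.OPEN) {
        other.send(data);
      }
    });
  });
});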
For example, hand the data to a FileReader and, in its .onload handler, buffer and send .result. Or something better, if there is a better option. - alexandrine.Gutmann40 commented on September 19th 19 at 13:29
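A minimal sketch of that FileReader idea; note that FileReader can only read Blob/File objects, not the MediaStream itself, so the stream first has to be sliced into Blobs (for example by MediaRecorder). The socket variable is assumed to be an already-open WebSocket:

function sendChunk(socket, blob) {
  var reader = new FileReader();
  reader.onload = function () {
    // reader.result is an ArrayBuffer here; WebSocket.send() accepts it directly.
    socket.send(reader.result);
  };
  reader.readAsArrayBuffer(blob);
}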

Find more questions by tags: HTML, Programming, JavaScript