Sending a MediaStream to a host server with WebRTC after it is captured by getUserMedia

Javascript, Jquery, Audio, Webrtc, Getusermedia

Javascript Problem Overview


I am capturing audio data using getUserMedia() and I want to send it to my server so I can save it as a Blob in a MySQL field.

This is all I am trying to do. I have made several attempts to do this using WebRTC, but I don't even know at this point if this is right or even the best way to do this.

Can anybody help me?

Here is the code I am using to capture audio from the microphone:

// the AudioContext has to exist before the stream can be routed through it
var audioContext = new AudioContext();

navigator.getUserMedia({
	video: false,
	audio: true
}, function(mediaStream) {

	// output mediaStream to speakers:
	var mediaStreamSource = audioContext.createMediaStreamSource(mediaStream);
	mediaStreamSource.connect(audioContext.destination);

	// send mediaStream to server:

	// WebRTC code? not sure about this...
	var RTCconfig = {};
	var conn = new RTCPeerConnection(RTCconfig);

	// ???

}, function(error) {
	console.log('getUserMedia() fail.');
	console.log(error);
});

How can I send this mediaStream up to the server?

After Googling around I've been looking into WebRTC, but this seems to be just for peer-to-peer communication - actually, now that I'm looking into this more, I think it is the way to go. It seems to be the way to communicate from the client's browser up to the host webserver, but nothing I try even comes close to working.

I've been going through the W3C documentation (which I am finding way too abstract), and I've been going through this article on HTML5 Rocks (which is bringing up more questions than answers). Apparently I need a signalling method; can anyone advise which signalling method is best for sending MediaStreams: XHR, XMPP, SIP, Socket.io, or something else?

What will I need on the server to support the receiving of WebRTC? My web server is running a basic LAMP stack.

Also, is it best to wait until the mediaStream is finished recording before I send it up to the server, or is it better to send the mediaStream as it's being recorded? I want to know if I am going about doing this the right way. I have written file uploaders in JavaScript and HTML5, but uploading one of these mediaStreams seems hellishly more complicated and I'm not sure if I am approaching it right.

Any help on this would be greatly appreciated.

Javascript Solutions


Solution 1 - Javascript

You cannot upload the live stream itself while it is running. This is because it is a LIVE stream.

So, this leaves you with a handful of options.

  1. Record the audio stream using one of the many recorders out there; RecordRTC works fairly well. Wait until the stream is completed and then upload the file.
  2. Send smaller chunks of recorded audio with a timer and merge them again server side (an example of this approach is sketched below).
  3. Send the audio packets as they occur over WebSockets to your server so that you can manipulate and merge them there. My version of RecordRTC does this.
  4. Make an actual peer connection with your server so it can grab the raw RTP stream and you can record the stream using some lower-level code. This can easily be done with the Janus-Gateway.

As for waiting to send the stream vs sending it in chunks, it all depends on how long you are recording. If it is for a longer period of time, I would say sending the recording in chunks or actively sending audio packets over websockets is a better solution as uploading and storing larger audio files from the client side can be arduous for the client.
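For example, here is a minimal sketch of the chunked approach (option 2), assuming a browser that supports the MediaRecorder API; the /upload-chunk URL is a made-up endpoint you would replace with your own upload script:

navigator.mediaDevices.getUserMedia({ audio: true }).then(function(mediaStream) {
	var recorder = new MediaRecorder(mediaStream);

	recorder.ondataavailable = function(event) {
		// event.data is a Blob holding the latest chunk of encoded audio
		var formData = new FormData();
		formData.append('chunk', event.data);
		fetch('/upload-chunk', { method: 'POST', body: formData }); // hypothetical endpoint
	};

	// emit a chunk roughly every 5 seconds
	recorder.start(5000);
});

Note that the individual chunks are generally not playable on their own; server side you append them in arrival order to rebuild one complete recording.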

Firefox actually has its own solution for recording, but it is not supported in Chrome, so it may not work in your situation.

As an aside, the signalling method mentioned is for session setup and teardown and really has nothing to do with the media itself. You would only really need to worry about it if you went with solution number 4 above.

Solution 2 - Javascript

A good API for you would be the MediaRecorder API, but it is less widely supported than the Web Audio API, so you can do it using a ScriptProcessorNode, or use Recorder.js (or base your own ScriptProcessorNode implementation on it).
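As a rough sketch of the ScriptProcessorNode route (assuming a WebSocket endpoint on your server; ws://example.com/audio below is a placeholder), you can forward the raw PCM buffers as they are produced:

var audioContext = new AudioContext();
var socket = new WebSocket('ws://example.com/audio'); // placeholder endpoint
socket.binaryType = 'arraybuffer';

navigator.mediaDevices.getUserMedia({ audio: true }).then(function(mediaStream) {
	var source = audioContext.createMediaStreamSource(mediaStream);
	// 4096-sample buffers, 1 input channel, 1 output channel
	var processor = audioContext.createScriptProcessor(4096, 1, 1);

	processor.onaudioprocess = function(event) {
		// 32-bit float PCM samples for this buffer
		var samples = event.inputBuffer.getChannelData(0);
		if (socket.readyState === WebSocket.OPEN) {
			socket.send(new Float32Array(samples).buffer); // send a copy
		}
	};

	source.connect(processor);
	// the node must be connected for onaudioprocess to fire;
	// it outputs silence because we never write to the output buffer
	processor.connect(audioContext.destination);
});

On the server you would then encode those raw samples (for example into WAV) before storing them.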

Solution 3 - Javascript

WebRTC is designed as peer-to-peer, but the peers could be a browser and a server. So it's definitely possible to push the stream by WebRTC to a server, then record the stream as a file.

The stream flow is:

Chrome ----WebRTC--->   Server  ---record---> FLV/MP4

There are lots of servers, like SRS, Janus or mediasoup, that accept a WebRTC stream. Please note that you might need to convert the WebRTC stream (H.264+Opus) to MP4 (H.264+AAC), or just choose SRS, which supports this feature.
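A rough sketch of the browser side, assuming your media server exposes an HTTP endpoint that takes an SDP offer and returns an SDP answer (the /rtc/publish URL below is a placeholder; check your server's signalling API):

navigator.mediaDevices.getUserMedia({ audio: true }).then(function(mediaStream) {
	var pc = new RTCPeerConnection();
	mediaStream.getTracks().forEach(function(track) {
		pc.addTrack(track, mediaStream);
	});

	return pc.createOffer()
		.then(function(offer) { return pc.setLocalDescription(offer); })
		.then(function() {
			// hand the offer to the media server and receive its answer
			return fetch('/rtc/publish', { // placeholder signalling endpoint
				method: 'POST',
				headers: { 'Content-Type': 'application/json' },
				body: JSON.stringify({ sdp: pc.localDescription.sdp })
			});
		})
		.then(function(response) { return response.json(); })
		.then(function(answer) {
			return pc.setRemoteDescription({ type: 'answer', sdp: answer.sdp });
		});
});

Once the connection is up, the server receives the RTP packets and can record them to a file.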

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Jimmery | View Question on Stackoverflow
Solution 1 - Javascript | Benjamin Trent | View Answer on Stackoverflow
Solution 2 - Javascript | Luizgrs | View Answer on Stackoverflow
Solution 3 - Javascript | Winlin | View Answer on Stackoverflow