Stream live WebM video to browser using Node.js and GStreamer


In this post, we’ll stream live WebM video to the browser using just GStreamer and Node.js. In a previous post we did it using Flumotion. Follow the procedure mentioned in that post to set up GStreamer 0.10.32 (or later). We’ll use Node.js with the Express middleware, which we have previously used for on-demand streaming of a WebM file.

Code

We spawn a GStreamer pipeline that muxes a live WebM stream and serves it to TCP client sockets using the tcpserversink element. When a request arrives from the browser on port 8001, we create a TCP client socket that receives the WebM stream, and forward all data received from that socket to the browser. The code follows.
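
Here is a minimal sketch, assuming the Express module and GStreamer 0.10 (gst-launch-0.10), with test sources standing in for a real camera and microphone; the element properties and port numbers are placeholders, not definitive values:

var express = require('express');
var net = require('net');
var spawn = require('child_process').spawn;

var app = express();

// Port on which tcpserversink will serve the muxed stream (arbitrary choice).
var gstPort = 9001;

// Mux live test video (VP8) and test audio (Vorbis) into WebM and serve it
// over TCP. tcpserversink acts as the server, so several client sockets can
// receive the same stream.
var args =
  ['videotestsrc', 'is-live=1',
   '!', 'ffmpegcolorspace',
   '!', 'vp8enc', 'speed=2',
   '!', 'queue2',
   '!', 'm.', 'audiotestsrc', 'is-live=1',
   '!', 'audioconvert',
   '!', 'vorbisenc',
   '!', 'queue2',
   '!', 'm.', 'webmmux', 'name=m', 'streamable=true',
   '!', 'tcpserversink', 'host=127.0.0.1', 'port=' + gstPort];

var gst = spawn('gst-launch-0.10', args);
gst.on('exit', function (code) {
  if (code !== 0) {
    console.log('GStreamer error, exit code ' + code);
  }
});

app.get('/', function (req, res) {
  res.writeHead(200, { 'Content-Type': 'video/webm' });
  // Create a TCP client socket to the pipeline, and forward everything it
  // sends to the browser.
  var socket = net.connect(gstPort, '127.0.0.1');
  socket.on('data', function (data) {
    res.write(data);
  });
  socket.on('close', function () {
    res.end();
  });
  req.on('close', function () {
    socket.destroy();
  });
});

app.listen(8001);

Note that gst-launch is spawned directly rather than through a shell, so each pipeline token, including every '!', is passed as a separate element of args.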

Execute

Assuming you have saved the script above to a file called script.js, run Node.js thus:

node script.js

Now, you can play the WebM stream in Chrome by accessing
http://localhost:8001/.

Debug

If you want to trace all system calls, especially if you change the args to GStreamer and get a cryptic message like

execvp(): No such file or directory

you can execute Node.js under strace:

strace -fF -o strace.log node script.js

Video and audio source elements

Here’s a list of alternative video source elements:

  1. autovideosrc – automatically detects and chooses a video source
  2. ksvideosrc – video capture from cameras on Windows
  3. v4l2src – obtains video stream from a Video 4 Linux 2 device, such as a webcam
  4. ximagesrc – video stream is produced from screenshots

Here’s a list of alternative audio source elements:

  1. autoaudiosrc – automatically detects and chooses an audio source
  2. alsasrc – captures audio stream from a specific device using ALSA
  3. pulsesrc – captures audio stream from the default mic, based on system settings

An important point to note is that all these sources are live sources. GStreamer defines live sources as sources that discard data when paused, and that produce data at a fixed rate, thus providing a clock that publishes this rate.
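
To use one of these live sources in place of the test sources, swap the corresponding branch of the pipeline. For example, here is a hypothetical webcam-and-microphone variant of the args array from the sketch above (the device path is an assumption; check yours with v4l2-ctl or similar):

// Hypothetical live-capture variant of the args array: webcam video via
// v4l2src and default-microphone audio via pulsesrc, reusing gstPort from
// the sketch above.
var args =
  ['v4l2src', 'device=/dev/video0',
   '!', 'ffmpegcolorspace',
   '!', 'vp8enc', 'speed=2',
   '!', 'queue2',
   '!', 'm.', 'pulsesrc',
   '!', 'audioconvert',
   '!', 'vorbisenc',
   '!', 'queue2',
   '!', 'm.', 'webmmux', 'name=m', 'streamable=true',
   '!', 'tcpserversink', 'host=127.0.0.1', 'port=' + gstPort];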


35 thoughts on “Stream live WebM video to browser using Node.js and GStreamer”

  1. Hi there.

    Where does the video source come from? I understand that when a GET from the browser comes in, live data (WebM) is streamed back to the browser. But where does that live data stream come from?

  2. Hi. The live stream is established in the callback function registered with server.listen. I start listening for the stream and then execute the pipeline (spawn gst-launch) that will stream me the data. There are several elements that act like live sources [1 2]: v4l2src – captures from webcam, ximagesrc – captures screen, souphttpsrc, videotestsrc, etc.

  3. This works for me in Chrome, but not in any of the other browsers I’ve tried. Do you know if other browsers are going to support this, or whether another mechanism like WebSockets or the HTML 5 video tag would be better for portability?

    1. The HTML 5 video tag was the target of this demo. I’ve not kept up with WebM streaming support in the latest browser versions. Chrome being from Google certainly should have stellar support. During my tests a year back I was able to use the video tag in Firefox and Opera. I don’t think I tried IE though.

  4. I got this error!
    Can anyone clarify it?

    module.js:340
    throw err;
    ^
    Error: Cannot find module 'express'
    at Function.Module._resolveFilename (module.js:338:15)
    at Function.Module._load (module.js:280:25)
    at Module.require (module.js:362:17)
    at require (module.js:378:17)
    at Object.<anonymous> (/home/sandeep/node-v0.8.11-linux-x86/bin/test.js:1:78)
    at Module._compile (module.js:449:26)
    at Object.Module._extensions..js (module.js:467:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.runMain (module.js:492:10)

  5. Is that feasible on Windows? I have tried hard for 3 days so far to find Windows binaries that would work with this, but haven’t succeeded yet. I can’t even build my own, because I don’t have VS 2006, which as far as I can understand is the only Windows platform that compiles the source from the official GStreamer site. Thanks

    1. It is feasible now that GStreamer has an SDK [http://gstreamer.com/] for Windows. The SDK does not support tcpclientsink though, so you’ll have to get the encoded video stream through other means; udpsink is a viable replacement.
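
       For example, the tail of the pipeline could swap tcpclientsink for udpsink along these lines (a hypothetical variant; the host, port and ksvideosrc properties are assumptions), with the Node.js side then receiving the datagrams via the dgram module instead of a TCP socket:

       // Hypothetical Windows variant: send the muxed WebM stream over UDP.
       var args =
         ['ksvideosrc', 'device-index=0',
          '!', 'ffmpegcolorspace',
          '!', 'vp8enc', 'speed=2',
          '!', 'webmmux', 'streamable=true',
          '!', 'udpsink', 'host=127.0.0.1', 'port=5000'];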

  6. Thank you, ksvideosrc solved the problem on Windows. But I still have some troubles. I work on a video conferencing application and I need to stream my web camera, so I use ‘ksvideosrc’ as the video source, like this: gst-launch-0.10 ksvideosrc device-index=0 ! ffmpegcolorspace ! vp8enc speed=2 ! queue2. The problem is that I receive my video with a 10 second delay, which is unacceptable for video chat. Do you think this problem may be resolved?

    1. The lag you see is mainly due to delay in setup and buffering in the browser. The lag also keeps increasing when video is interrupted due to software or networking delays. Unfortunately it is not possible to resolve it. You’ll have to use something like http://www.webrtc.org/ for more robust video conferencing in the browser.

  7. I have a problem. Can you help me fix it?
    GStreamer error, exit code 1
    WARNING: erroneous pipeline: no element "vp8enc"

  8. Can you please tell me how to get low latency video streaming using node.js, gstreamer and html5 video tag. I am getting 4-5 seconds delay using your code.

    1. Hi Mohit, there is not much that can be done. I have commented above regarding the delay added by the video tag and the network. This may be acceptable for one-way broadcasting to the browser, but not for two-way live communication. Even for one-way broadcasting, live adaptive streaming is probably a better approach. For two-way communication, webrtc is an approach that works best. The approach I take works for simple remote video surveillance and similar applications.

  9. Hi Devendra,

    I’m working on a one-way broadcasting project using Raspberry Pi and GStreamer. I can now stream 1080p@25fps from the Pi to OS X, both sides using the gst-launch command in a terminal.

    But it doesn’t work when it comes to this approach. I’m using the raspivid command to retrieve live video from the Raspberry Pi Camera; how could I modify your code to do this? (I’m totally new to nodejs)

    Here’s the terminal command:

    raspivid -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpclientsink host=0.0.0.0 port=5000

    I’ve tried this but with no success:

    var cmd = 'raspivid -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0';
    then put everything else into the args array.

    It just raises errors when the browser visits the address, and shows a console.debug no method error.

    Would you please give me some hint?

    Thank you.

    1. Hi. One problem I see with your pipeline (if you are using it with the Node.js script) is that you are trying to stream RTP payload. The script expects to stream live WebM to the video tag in the browser. This is specified in the Content-Type HTTP header sent to the browser. The video tag (AFAIK) does not currently support RTP.
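
       The header in question is written before any stream data is forwarded, as in the script in the post:

       // Tell the browser to expect a live WebM stream in the response body.
       res.writeHead(200, { 'Content-Type': 'video/webm' });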

      1. Thank you for the reply.

        So what do you think is the best way to wirelessly stream live video from the Pi to iOS/Android?

        I’ve tried raspistill + MJPEG over HTTP but got only 1~3 fps; that’s too slow (because it continuously writes JPEG files to the SD card and reads them out). The best performance I’ve achieved so far is with the command I mentioned above: 720p@30fps with less than 1 sec latency, while taking only 25% of the Pi’s CPU (for 1 connection, without overclocking), 2.3Mbps of network traffic and almost no memory (I think because most of the processing is done in the GPU).

        But I don’t understand why one would package RTP in GDP over TCP. Since mobile browsers now decode H264, is there an easy way to stream H264 video directly over HTTP so that a client can view it by simply opening their browser? Or, as the next-best option if the latency over HTTP is too long, how could I pipe H264 to RTSP?

        I’m really stuck, would you please give me some help?

        Thank you.

  10. Here’s a pipeline that works with GStreamer 1.0:

        var args =
          ['videotestsrc', 'horizontal-speed=1', 'is-live=1',
          '!', 'video/x-raw,format=(string)RGB,framerate=30/1',
          '!', 'videoconvert',
          '!', 'vp8enc', 'cpu-used=0',
          '!', 'queue2',
          '!', 'm.', 'audiotestsrc', 'is-live=1',
          '!', 'audioconvert',
          '!', 'vorbisenc',
          '!', 'queue2',
          '!', 'm.', 'webmmux', 'name=m', 'streamable=true',
          '!', 'tcpclientsink', 'host=localhost',
          'port='+tcpServer.address().port];
    
  11. Hi, can you help me with this error below?
    “options” seems to always be null, so I can’t launch your example.

    TypeError: "options" argument must be an object
    at normalizeSpawnArguments (child_process.js:321:11)
    at Object.exports.spawn (child_process.js:366:38)
    at Server.<anonymous> (../script.js:48:26)
    at Server.g (events.js:286:16)
    at emitNone (events.js:86:13)
    at Server.emit (events.js:185:7)
    at emitListeningNT (net.js:1279:10)
    at _combinedTickCallback (internal/process/next_tick.js:71:11)
    at process._tickCallback (internal/process/next_tick.js:98:9)

  12. Thanks for this post. I got this to work with H264-encoded MP4 feeds from a webcam, and thought I’d share. These feeds open and play in Firefox 50.1.10.

    These arguments will start the MP4 stream from a test source and should work on all implementations.

    var cmd = 'gst-launch-1.0';
    var args =
          ['-v', 'videotestsrc', 'horizontal-speed=1', 'is-live=true',
          '!', 'video/x-raw,format=(string)RGB,framerate=30/1',
          '!', 'videoconvert',
          '!', 'x264enc',
          '!', 'mp4mux', 'streamable=true', 'fragment-duration=1000',
          '!', 'tcpclientsink', 'host=localhost',
          'port='+tcpServer.address().port];
    

    These arguments will stream from a Logitech C920 webcam, directly from its built-in H264 encoder.

    // working test with h264 & webcam
    var args =
          ['-v', 'v4l2src', 'device=/dev/video0',
          '!', 'video/x-h264,stream-format=byte-stream,width=800,height=600,framerate=30/1',
          '!', 'h264parse',
          '!', 'mp4mux', 'streamable=true', 'fragment-duration=1000',
          '!', 'tcpclientsink', 'host=localhost',
          'port='+tcpServer.address().port];
    

    Now I am trying to figure out how I could get this to work to multiple clients.

      1. I’m not seeing how you would do this with multiple streams. Is your thought to create a new TCP sink for each new client, and to start forwarding that sink’s output to each new client’s response?

        I’ve found that the MP4 stream done the way I put it above (similar to your webm stream) is the only way to get a live feed to play in the HTML video tag. To add additional clients, I tried simply putting additional res objects in an array and sending to them in a for loop in the socket.on(‘data’, function). The data is sent, but only the first client that connects is able to view the video. I suspect the issue is that (1) there is header information at the beginning of the MP4 stream that is not being sent, or (2) those TCP packets aren’t synchronized with the additional sources.

        I tried rtp streams with rtph264pay and rtpmp4vpay and commands like

        $ gst-launch-1.0 -e -v v4l2src device=/dev/video0 ! video/x-h264,stream-format=byte-stream,width=800,height=600,framerate=30/1 ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=4006
        

        These UDP packets can then be distributed by a node server. The distributed streams are viewable with VLC player, but require a .sdp file that contains some parameters: ‘sprop-parameter-sets’ is needed for the H264 stream and ‘config’ for the MP4.

        Serving the SDP file with app.get(‘/stream.sdp’, …) doesn’t work as a solution either. It will load QuickTime if the URL is put into the Firefox address bar. If put in the src field of the HTML video tag, Firefox will say incompatible MIME type. It needs to be in an HTML tag so a socket could be added to tell whether the client connection has closed.

        I tried serving the rtpmp4vpay stream over UDP similar to how you did it in the example. This stream isn’t viewable when picked up live (probably because of the needed ‘config’ parameters). That leaves me back at trying to serve mp4mux similar to your example. I don’t see how to get this to go to multiple clients.

        My other idea is just to use mjpeg-streamer or the node equivalent (https://www.npmjs.com/package/mjpeg-streamer) for multiple clients with browser compatibility. That works by replying to the GET with an HTTP 200 multipart/x-mixed-replace response, followed by Content-Type: image/jpeg responses with each image on the same stream. These streams are as old as http://www.fishcam.com/ (Netscape era) and thus have wide browser compatibility, but they also use more bandwidth.

    1. I’ve changed the GStreamer pipeline used in the post. Instead of using tcpclientsink, I’m now using tcpserversink. That allows multiple clients to connect to the same stream.
