Stream live WebM video to browser using Node.js and GStreamer


In this post, we’ll stream live WebM video to the browser using just GStreamer and Node.js. In a previous post we did the same using Flumotion. Follow the procedure mentioned in that post to set up GStreamer 0.10.32 (or later). We’ll use Node.js with the express middleware, which we previously used for on-demand streaming of a WebM file.

Code

We receive a request from the browser at port 9001, create a TCP server socket to receive a WebM stream, and relay all data arriving on that socket to the browser. We then spawn a GStreamer pipeline that muxes a live WebM stream and sends it to the TCP server socket using the tcpclientsink element. The code follows:

var express = require('express');
var http = require('http');
var net = require('net');
var child = require('child_process');

var app = express();
var httpServer = http.createServer(app);

app.get('/', function(req, res) {
  var date = new Date();

  // Send the response headers right away;
  // the live WebM stream follows as the response body.
  res.writeHead(200, {
    'Date':date.toUTCString(),
    'Connection':'close',
    'Cache-Control':'private',
    'Content-Type':'video/webm',
    'Server':'CustomStreamer/0.0.1',
  });

  // TCP server that receives the WebM stream from the GStreamer
  // pipeline and relays every chunk to the browser.
  var tcpServer = net.createServer(function (socket) {
    socket.on('data', function (data) {
      res.write(data);
    });
    socket.on('close', function(had_error) {
      res.end();
    });
  });

  // Only the GStreamer pipeline should connect to this server.
  tcpServer.maxConnections = 1;

  // Listen on an ephemeral port; the pipeline connects back to it.
  tcpServer.listen(function() {
    var cmd = 'gst-launch-0.10';
    var args =
      ['videotestsrc', 'horizontal-speed=1', 'is-live=1',
      '!', 'video/x-raw-rgb,framerate=30/1',
      '!', 'ffmpegcolorspace',
      '!', 'vp8enc', 'speed=2',
      '!', 'queue2',
      '!', 'm.', 'audiotestsrc', 'is-live=1',
      '!', 'audioconvert',
      '!', 'vorbisenc',
      '!', 'queue2',
      '!', 'm.', 'webmmux', 'name=m', 'streamable=true',
      '!', 'tcpclientsink', 'host=localhost',
      'port='+tcpServer.address().port];

    // Spawn the GStreamer pipeline; the video and audio branches
    // both feed the webmmux element named m.
    var gstMuxer = child.spawn(cmd, args);

    gstMuxer.stderr.on('data', onSpawnError);
    gstMuxer.on('exit', onSpawnExit);

    // Kill the pipeline when the browser disconnects.
    res.connection.on('close', function() {
      gstMuxer.kill();
    });
  });
});

httpServer.listen(9001);

function onSpawnError(data) {
  console.log(data.toString());
}

function onSpawnExit(code) {
  if (code != null) {
    console.error('GStreamer error, exit code ' + code);
  }
}

process.on('uncaughtException', function(err) {
  console.error(err);
});
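If nothing plays, it helps to first check that the pipeline runs outside Node.js. Here is the same pipeline as a gst-launch command with tcpclientsink swapped for a filesink, as a sanity check (interrupt it with Ctrl+C; the -e flag makes gst-launch send EOS on shutdown so the file is finalized):

gst-launch-0.10 -e videotestsrc horizontal-speed=1 is-live=1 \
  ! video/x-raw-rgb,framerate=30/1 ! ffmpegcolorspace \
  ! vp8enc speed=2 ! queue2 ! m. \
  audiotestsrc is-live=1 ! audioconvert ! vorbisenc ! queue2 ! m. \
  webmmux name=m streamable=true ! filesink location=test.webm

The resulting test.webm should play in any WebM-capable player.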

Execute

Assuming you have saved the script above to a file called script.js, run Node.js thus:

node script.js

Now, you can play the WebM stream in Chrome by accessing the following URL: http://localhost:9001/
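The same URL can also be used as the source of an HTML5 video tag, if you want to embed the stream in a page of your own. A minimal sketch:

<video src="http://localhost:9001/" autoplay controls></video>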

Debug

If you want to trace all system calls, especially when you change the args passed to GStreamer and get a cryptic message like:

execvp(): No such file or directory

you can execute Node.js with strace:

strace -fF -o strace.log node script.js
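The resulting log can be large. The failed spawn shows up as a failed execve in a child process, so you can usually locate it with grep (assuming the typical strace output format):

grep execve strace.log | grep ENOENT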

Video and audio source elements

The GStreamer pipeline spawned above uses test video and audio source elements. You’ll need to obtain video and audio streams from real devices for any practical application; a sketch of such a pipeline follows the lists below.

Here’s a list of alternative video source elements:

  1. autovideosrc – automatically detects and chooses a video source
  2. v4l2src – obtains a video stream from a Video4Linux2 device, such as a webcam
  3. ximagesrc – produces a video stream from screenshots of the X display

Here’s a list of alternative audio source elements:

  1. autoaudiosrc – automatically detects and chooses an audio source
  2. alsasrc – captures an audio stream from a specific device using ALSA
  3. pulsesrc – captures an audio stream from the default microphone, based on system settings
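For example, here is a sketch of the args array with the test sources swapped for a webcam (v4l2src) and PulseAudio capture (pulsesrc). The device path is hypothetical and will vary by system; the caps filter is dropped since real devices negotiate their own formats, with ffmpegcolorspace and audioconvert handling conversion for the encoders:

    var args =
      ['v4l2src', 'device=/dev/video0',  // hypothetical device path
      '!', 'ffmpegcolorspace',
      '!', 'vp8enc', 'speed=2',
      '!', 'queue2',
      '!', 'm.', 'pulsesrc',
      '!', 'audioconvert',
      '!', 'vorbisenc',
      '!', 'queue2',
      '!', 'm.', 'webmmux', 'name=m', 'streamable=true',
      '!', 'tcpclientsink', 'host=localhost',
      'port='+tcpServer.address().port];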

An important point to note is that all of these sources are live sources. GStreamer defines live sources as sources that discard data when paused, and that produce data at a fixed rate, thus providing a clock to publish this rate.

Limitations

The example above can support multiple viewers only because we use test video and audio streams; each viewer gets a pipeline of its own. GStreamer pipelines cannot simultaneously capture streams using sources that access the same device, hence tcpServer.maxConnections has been restricted to 1. Even assuming simultaneous capture were possible, the code above is CPU intensive, since audio and video encoding is done once per viewer.
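If you do need multiple viewers with a real capture device, one option is to encode once and fan the stream out to every connected viewer. Below is a minimal sketch of the fan-out part, assuming a single shared TCP server and a viewers array populated by the HTTP handler; note that a late joiner would miss the WebM header emitted at the start of the stream, so a real implementation would also have to cache and replay that initialization data:

var net = require('net');

var viewers = [];  // HTTP responses of currently connected viewers

// One TCP server, fed by a single GStreamer pipeline; every chunk
// received from the pipeline is broadcast to all viewers.
var tcpServer = net.createServer(function (socket) {
  socket.on('data', function (data) {
    viewers.forEach(function (res) {
      res.write(data);
    });
  });
});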

30 thoughts on “Stream live WebM video to browser using Node.js and GStreamer”

  1. Hi there.

    Where does the video source come from? I understand that when a GET from the browser comes in, live data (WebM) is streamed back to the browser. But where does that live data stream come from?

  2. Hi. The live stream is established in the callback function registered with server.listen. I start listening for the stream and then execute the pipeline (spawn gst-launch) that streams me the data. There are several elements that act as live sources [1 2]: v4l2src (captures from a webcam), ximagesrc (captures the screen), souphttpsrc, videotestsrc, etc.

  3. This works for me in Chrome, but not in any of the other browsers I’ve tried. Do you know if other browsers are going to support this, or whether another mechanism like WebSockets or the HTML5 video tag is better for portability?

    1. The HTML5 video tag was the target of this demo. I’ve not kept up with WebM streaming support in the latest browser versions. Chrome, being from Google, certainly should have stellar support. During my tests a year back I was able to use the video tag in Firefox and Opera. I don’t think I tried IE though.

  4. I got this error! Can anyone clarify it?

    module.js:340
    throw err;
    ^
    Error: Cannot find module 'express'
    at Function.Module._resolveFilename (module.js:338:15)
    at Function.Module._load (module.js:280:25)
    at Module.require (module.js:362:17)
    at require (module.js:378:17)
    at Object.<anonymous> (/home/sandeep/node-v0.8.11-linux-x86/bin/test.js:1:78)
    at Module._compile (module.js:449:26)
    at Object.Module._extensions..js (module.js:467:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.runMain (module.js:492:10)

  5. Is this feasible on Windows? I have tried hard for 3 days to find Windows binaries that would work with this, but haven’t succeeded yet. I can’t even build my own, because I don’t have VS 2006, which is the only Windows platform that compiles the source from the official GStreamer site, as far as I can understand. Thanks

  6. Thank you, this article [https://delog.wordpress.com/2013/01/05/gstreamer-sdk-for-windows/] solved the problem. But I still have some trouble. I am working on a video conference application and I need to stream my web camera, so I use 'ksvideosrc' as the video source, like this: gst-launch-0.10 ksvideosrc device-index=0 ! ffmpegcolorspace ! vp8enc speed=2 ! queue2. The problem is that I receive my video with a 10 second delay, which is unacceptable for video chat. Do you think this problem may be resolved?

    1. The lag you see is mainly due to delay in setup and buffering in the browser. The lag also keeps increasing when video is interrupted due to software or networking delays. Unfortunately it is not possible to resolve it. You’ll have to use something like http://www.webrtc.org/ for more robust video conferencing in the browser.

  7. I have a problem. Can you help me fix it?
    GStreamer error, exit code 1
    WARNING: erroneous pipeline: no element "vp8enc"

  8. Can you please tell me how to get low-latency video streaming using Node.js, GStreamer and the HTML5 video tag? I am getting a 4-5 second delay using your code.

    1. Hi Mohit, there is not much that can be done. I have commented above regarding the delay added by the video tag and the network. This may be acceptable for one-way broadcasting to the browser, but not for two-way live communication. Even for one-way broadcasting, live adaptive streaming is probably a better approach. For two-way communication, WebRTC is the approach that works best. The approach I take works for simple remote video surveillance and similar applications.

  9. Hi Devendra,

    I’m working on a one-way broadcasting project using a Raspberry Pi and GStreamer. I can stream 1080p@25fps from the Pi to OS X using a gst-launch command in the terminal now.

    But it doesn’t work when it comes to this approach. I’m using the raspivid command to retrieve live video from the Raspberry Pi Camera; how could I modify your code to do this? (I’m totally new to Node.js)

    Here’s the terminal command:

    raspivid -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpclientsink host=0.0.0.0 port=5000

    I’ve tried this but with no success:

    var cmd = 'raspivid -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0';
    then put everything else into the args array.

    It just raises errors when the browser visits the address, and shows a "console.debug no method" error.

    Would you please give me some hint?

    Thank you.

    1. Hi. One problem I see with your pipeline (if you are using it with the Node.js script) is that you are trying to stream RTP payload. The script expects to stream live WebM to the video tag in the browser. This is specified in the Content-Type HTTP header sent to the browser. Video tag (AFAIK) does not currently support RTP.

      1. Thank you for reply.

        So what do you think is the best way to wirelessly stream live video from the Pi to iOS/Android?

        I’ve tried raspistill + MJPEG over HTTP but got only 1~3 fps, which is too slow (because it continuously writes JPEG files to the SD card and reads them out). The best performance I have achieved so far is with the command I mentioned above: 720p@30fps with less than 1 second of latency, while taking only 25% of the Pi’s CPU (for 1 connection, without overclocking), 2.3 Mbps of network traffic and almost no memory (I think because most of the processing is done on the GPU).

        But I don’t understand why the RTP payload is packaged in GDP over TCP. Since mobile browsers now decode H.264, is there an easy way to stream H.264 video directly over HTTP so that clients can view it by simply opening their browser? Or, as the next-best option if the latency over HTTP is too long, how could I pipe the H.264 to RTSP?

        I’m really stuck, would you please give me some help?

        Thank you.

  10. Here’s a pipeline that works with GStreamer 1.0:

        var args =
          ['videotestsrc', 'horizontal-speed=1', 'is-live=1',
          '!', 'video/x-raw,format=\(string\)RGB,framerate=30/1',
          '!', 'videoconvert',
          '!', 'vp8enc', 'cpu-used=0',
          '!', 'queue2',
          '!', 'm.', 'audiotestsrc', 'is-live=1',
          '!', 'audioconvert',
          '!', 'vorbisenc',
          '!', 'queue2',
          '!', 'm.', 'webmmux', 'name=m', 'streamable=true',
          '!', 'tcpclientsink', 'host=localhost',
          'port='+tcpServer.address().port];
    
  11. Hi, can you help me with the error below? "options" always seems to be null and I can’t launch your example.

    TypeError: "options" argument must be an object
    at normalizeSpawnArguments (child_process.js:321:11)
    at Object.exports.spawn (child_process.js:366:38)
    at Server.<anonymous> (../script.js:48:26)
    at Server.g (events.js:286:16)
    at emitNone (events.js:86:13)
    at Server.emit (events.js:185:7)
    at emitListeningNT (net.js:1279:10)
    at _combinedTickCallback (internal/process/next_tick.js:71:11)
    at process._tickCallback (internal/process/next_tick.js:98:9)
