The UDP protocol guarantees neither that datagrams will arrive in order, nor that they will arrive at all; hence the need for a protocol like RTP/RTCP on top of it. If you do want to use raw UDP, this post shows the commands required to move VP8 video.
gst-launch -v videotestsrc horizontal-speed=1 ! vp8enc ! udpsink host=localhost port=9001
gst-launch udpsrc port=9001 reuse=true caps=video/x-vp8,width=320,height=240,framerate=30/1,pixel-aspect-ratio=1/1 ! vp8dec ! ffmpegcolorspace ! autovideosink
You can run several instances of the receiving command above, but only the last instance plays; the earlier instances stop. Once you kill the last instance, the one before it resumes video playback. I haven’t figured out why, but I suspect it has to do with how UDP messages are delivered.
You can do the same with an audio stream, using RTP. I am not aware of how that may be done using raw UDP.
In this post, we’ll stream live WebM video to the browser using just GStreamer and Node.js. In a previous post we did it using Flumotion. Follow the procedure mentioned in that post to set up GStreamer 0.10.32 (or later). We’ll use Node.js with the express middleware, which we have previously used for on-demand streaming of a WebM file.
We spawn a GStreamer pipeline to mux a WebM stream, and serve it to TCP client sockets using the tcpserversink element. When a request from the browser arrives at port 8001, we create a TCP client socket to receive the WebM stream, and pipe all data received from that socket to the browser. The code follows:
Assuming you have saved the script above to a file called livewebm.js, run Node.js thus:

node livewebm.js
Now, you can play the WebM stream in Chrome by accessing port 8001 on the host running Node.js.
If you want to trace all system calls, especially when you change the arguments to GStreamer and get a cryptic message like

execvp(): No such file or directory

you can execute Node.js with strace:
strace -fF -o strace.log node livewebm.js
Video and audio source elements
Here’s a list of alternative video source elements
- autovideosrc – automatically detects and chooses a video source
- ksvideosrc – video capture from cameras on Windows
- v4l2src – obtains a video stream from a Video4Linux2 device, such as a webcam
- ximagesrc – video stream is produced from screenshots
Here’s a list of alternative audio source elements
- autoaudiosrc – automatically detects and chooses an audio source
- alsasrc – captures an audio stream from a specific device using ALSA
- pulsesrc – captures audio stream from the default mic, based on system settings
An important point to note is that all of these sources are live sources. GStreamer defines a live source as one that discards data when paused, and produces data at a fixed rate, thus providing a clock that publishes this rate.
Node.js can be used to stream arbitrary data to a browser such as Chrome, over HTTP. In this post we’ll use the latest version of the express middleware to stream a WebM file to the browser.
Execute the following npm command to install express
sudo npm install express@latest
npm installs express to a folder called node_modules, under the current folder. If you run node in the current folder, it should be able to find express.
Create a file called webm.js with the following code:
The commented headers in the response may be used for additional control. The Transfer-Encoding header may also be identity, its default value, as long as the Connection response header is close. If the Connection header is keep-alive, Transfer-Encoding has to be chunked. This behavior may be browser specific; I have only tested with Chrome. Chunking is taken care of by Node.js.
Running the code
To stream a WebM file at /home/user/file.webm, invoke node like

node webm.js 9001 /home/user/file.webm
Then, point Chrome to http://host:9001/, and the video should begin playing.
Doing it the easy way
Now that we have seen the hard way, note that express has a method on the response object to send a file. It is essentially a replacement for all the code in the app.get() callback above:
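A sketch of that replacement follows. Note that the method name varies by express version: express 4.x uses res.sendFile() and requires an absolute path, while older releases spelled it res.sendfile(). Here `app` is assumed to be the express application and `path` the WebM file path taken from the command line.

```javascript
// Sketch: assumes `app` is an express application and `path` the WebM
// file's absolute path, as in webm.js above.
function registerRoute(app, path) {
  app.get('/', function (req, res) {
    // express 4.x; older versions spelled this res.sendfile(path)
    res.sendFile(path);
  });
}
```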
This is how you can replace the default dissector for the IP protocol:

local dissector_table = DissectorTable.get("ethertype")
if dissector_table ~= nil then
    dissector_table:add(0x800, p_myproto)
end
If you have a capture file with a different link layer, then you may want to read How to Dissect Anything.
To test your dissector, you can convert binary representation of a message to pcap using
od -Ax -tx1 -v myproto.bin > myproto.hex
text2pcap -l 147 myproto.hex myproto.pcap
Valid values of the link type, specified using option -l, are in the range 147 to 162 (DLT_USER0 through DLT_USER15).
Next, customize the DLT_USER protocol preferences, so that your dissector gets invoked for link type 147.
You don’t have to edit protocol preferences manually. You can achieve the same from a Lua dissector as follows
local wtap_encap_table = DissectorTable.get("wtap_encap")
wtap_encap_table:add(wtap.USER0, p_myproto)
This is a quick post to record how Chrome requests a WebM stream, how an HTTP server, such as Flumotion, responds to that request, and the stream format.
To begin with, when Chrome (I tested with version 10) encounters a video tag with a WebM source, it sends the following request:
GET /webm-audio-video/ HTTP/1.1
Host: 192.168.2.2:9001
Connection: keep-alive
Accept: */*
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.205 Safari/534.16
Accept-Encoding: gzip,deflate,sdch
Accept-Language: pt-BR,pt;q=0.8,en-US;q=0.6,en;q=0.4
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Range: bytes=0-
The GET URI and Host header obviously will vary in each case. One thing to note is the Range header: 0- means the server should return all data. If you sniff this message exchange using Wireshark, you will note that the request goes to the HTTP host and port specified in the source URL, from a randomly chosen ephemeral port on the client; the server responds from its bound port back to that client port. That is how TCP works.
The response from a server to the above request may look like:
HTTP/1.0 200 OK
Date: Mon, 18 Apr 2011 18:37:20 GMT
Connection: close
Cache-control: private
Content-type: video/webm
Server: FlumotionHTTPServer/0.8.1
Usually the server also starts streaming the WebM data at this point. I say usually because at this point Chrome closes that TCP connection and makes the same request again. Why it does that is beyond me, but I’ll hazard a guess that it has to do with making sure that the server URL is correct and the server responds properly.
As a side note, I used the Node.js TCP proxy I posted about earlier, to sniff the messages above.
To get into the right spirit for a performance improvement initiative, you’ll need to:
- Learn – what the problem really is, read about it
- Test and Measure – when and where bad performance hurts most, focus on the hot-spots, rinse and repeat
- Benchmark – why is there a problem
- Patterns and Practices – how not to repeat the same mistakes
Test and Measure
As Tom DeMarco said, “You can’t control what you can’t measure”. So test, measure, and learn where the problems are. The following tools and measures can be very useful for diagnosis:
- Memory profilers
- Code execution profilers
- Performance counters
Benchmarks on the following aspects can be very useful:
- Memory utilization
- Memory I/O
- Network I/O
- Processor utilization
- Storage I/O
Write standard routines to benchmark the above aspects, and save the results in operations per second (ops). Reuse the same routines across different hardware and software versions.
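As a sketch, such a standard routine might look like this in Node.js; the helper name and reporting format are my own.

```javascript
// Run fn repeatedly for a fixed duration and report throughput in
// operations per second (ops), so results are comparable across
// hardware and software versions.
function benchmark(name, fn, durationMs) {
  durationMs = durationMs || 1000;
  let ops = 0;
  const start = Date.now();
  while (Date.now() - start < durationMs) {
    fn();
    ops++;
  }
  const elapsed = (Date.now() - start) / 1000;
  return { name: name, opsPerSec: Math.round(ops / elapsed) };
}

// Example: benchmark JSON serialization for a quarter of a second.
const result = benchmark('json-stringify',
  () => JSON.stringify({ a: 1, b: [2, 3] }), 250);
console.log(result.name + ': ' + result.opsPerSec + ' ops');
```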