Monthly news review


Continuing the series of monthly, predominantly technological, news reviews…

CarPlay

iOS has arrived in the car, at least in the premium car, not that it wasn't there already. The experience is built around your iOS device and aims to make driving as distraction-free as possible.

Popcorn Time

Popcorn Time was born, died, and was re-born. Watching pirated movies online is nothing new; make it convenient and you've got a winner. What makes the application itself interesting is its use of Node-WebKit and peerflix.

Android Wear

No doubt now that Google is taking Android to wearables, starting with smartwatches. These days it looks like everyone is one-upping Apple, yet somehow Apple ends up raking in all the profits.

WebScaleSQL

That’s a nerdy name for a new fork of MySQL that scales better.

Facebook buying Oculus VR

Looks like Facebook is making use of its cash reserves again. Earlier, they paid significantly more to buy WhatsApp.

Office for iPad

Lots of Office news this month, concluding with Microsoft launching Office for iPad. It is already at the top of the charts in several countries. Earlier, Microsoft launched a free OneNote app for Mac. I wonder which Office app is most useful, now that everyone concurs PowerPoints should be ditched.

ReadMill team acqui-hired by Dropbox

ReadMill was my favorite reading app for a while. Unfortunately, e-books are not easily ported to other apps, especially on mobile.

Philips announces 4K TV with Android

TV makers have adopted widely divergent Smart TV OS strategies. LG has embraced WebOS. Samsung has embraced something akin to Chrome OS with support for HTML 5, Native Client, and WebGL. Will Android take over?

CorelDRAW Graphics Suite X7

CorelDRAW has announced the X7 iteration of its suite. The user interface has received a significant overhaul. Heavy users will still encounter a crash every other day that makes them lose their work; that has not changed.

Quake III on Raspberry Pi using open source graphics driver

Last month Broadcom announced open source drivers for the GPU on the Raspberry Pi. Simon Hall has thus claimed the bounty announced by the Raspberry Pi Foundation. In unrelated news, the Oxford Flood Network uses Raspberry Pi to monitor flood levels. An interesting convergence of ideas such as smart cities, the internet of things, and open hardware.

Learning Node by Shelley Powers; O’Reilly Media


Learning Node by Shelley Powers is timely, and should be on the reading list of every JavaScript and server-side developer. I have used Node on Linux (x86 and ARM) and on Windows. Its performance, especially for network-intensive applications, has astounded me.

I’ll briefly delve into things that stood out to me in each chapter of the book.

Chapter 1 is a must-read if you don't understand the asynchronous nature of Node. It also covers building Node from source on Linux, and using WebMatrix to develop and run Node applications with IIS.

Chapter 2 shows how to use the command-line REPL (read-eval-print loop) to quickly test code, inspect objects, and even serve as an editor. Imagine that!

Chapter 3 covers the Node core objects and modules: in particular, the global namespace object, process.nextTick to execute a callback asynchronously, util.inherits to implement inheritance, and EventEmitter to emit events.

Chapter 4 covers the Node module system: how require searches for modules (.js, .node, or .json), deleting entries from require.cache to reload a module from source, and how to create your own custom module and expose its objects and functions using exports. It also covers often-used modules such as npm (installed with Node) for package management, Optimist for option parsing, and Underscore.

Chapter 5 delves deeper into the asynchronous nature of Node, covering control flow, exception handling, and asynchronous patterns. It then discusses the Step and Async modules that implement those patterns. It also briefly discusses Node coding style.
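To illustrate the serial control-flow pattern, here is a hand-rolled sketch of my own (not code from the book or from the Step or Async modules): run asynchronous tasks one after another, stopping at the first error.

```javascript
// Run async tasks in order; collect results; stop on first error.
function series(tasks, done) {
  var results = [];
  (function next(i) {
    if (i === tasks.length) return done(null, results);
    tasks[i](function (err, result) {
      if (err) return done(err);
      results.push(result);
      next(i + 1);
    });
  })(0);
}

series([
  function (cb) { setTimeout(function () { cb(null, 'one'); }, 10); },
  function (cb) { setTimeout(function () { cb(null, 'two'); }, 5); }
], function (err, results) {
  console.log(results);  // order is preserved despite the shorter timer
});
```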

Chapters 6, 7, and 8 discuss web development middleware and frameworks such as Connect and Express, and templating modules that work in tandem with Express, such as EJS and Jade.

Chapters 9, 10, and 11 discuss the different means of persisting data, in a key-value store such as Redis, document-centric database such as MongoDB, or a relational database such as MySQL, either directly or using the Sequelize ORM.

Chapter 12 discusses manipulating PDF by executing external tools such as PDF Toolkit, creating drawings using the canvas module, and streaming videos.

Chapter 13 discusses the popular Socket.IO library that you can leverage for bidirectional communication between server and the Browser.

Chapter 14 discusses unit testing, acceptance testing, and performance testing. Tools and modules covered include Apache Bench (ab), nodeunit, Selenium, and soda. Also discussed is the nodemon module that can be used to restart the application when a script is changed.

Chapter 15 discusses TLS/SSL and HTTPS for securing data communication, saving password as hash using the crypto module, authentication using the passport module, and authentication with Twitter using the passport-twitter Passport strategy module. It also discusses writing secure code by avoiding eval, validating data using a module such as node-validator, and running external scripts in a sandbox using the vm module.

Chapter 16 discusses deployment of applications to a server, or to the various cloud services such as Azure (using Cloud9 IDE), Joyent, Heroku, Amazon EC2, and Nodejitsu. It discusses modules such as Forever to recover from crashes, and integration with Apache. The discussion on clustering with Node is very brief and does not discuss the experimental cluster module.

I am glad the author took the time to write this book; I am a better “Noder” because of it. I’d like to thank O’Reilly Media for giving me the opportunity to review this book as part of the blogger review program.

Node.js and Sqlite3


Node.js and Sqlite3 can be used as a foundation for apps that are cross-platform, browser-based (leverage HTML5), and network-intensive. In this post I comment about some of the work required in building such a foundation.

Install sqlite3 for Node.js on Windows

The installation procedure for the sqlite3 module is slightly complex and requires that you have the Python interpreter and the VC++ compiler installed. Then head over to the command prompt and run:

npm install sqlite3 --arch=ia32

You can add the -g option after npm if you want to install to the global node_modules folder. The --arch option is required for the module to work on 64-bit Windows 7. Without it you’ll get a cryptic error like:

Error: %1 is not a valid Win32 application

Hopefully, at some point pre-compiled binaries will be provided for Windows.

Opening or creating a database

I like to maintain a database creation script. The code below is a simple example of how I detect a missing database and execute the creation script.

var fs = require('fs');
var sqlite3 = require('sqlite3').verbose();

fs.exists('database', function (exists) {
  var db = new sqlite3.Database('database');

  if (!exists) {
    console.info('Creating database. This may take a while...');
    fs.readFile('create.sql', 'utf8', function (err, data) {
      if (err) throw err;
      db.exec(data, function (err) {
        if (err) throw err;
        console.info('Done.');
      });
    });
  }
});

Here’s what create.sql may look like:

CREATE TABLE customer (
  id INT NOT NULL,
  CONSTRAINT PK_customer PRIMARY KEY (id ASC)
);

CREATE TABLE sale (
  id INT NOT NULL,
  CONSTRAINT PK_sale PRIMARY KEY (id ASC)
);

Embedding Node.js

It is rather convenient to package Node.js and its modules in a single installer. I discussed this in a recent post about the WiX Toolset. I use npm without the -g option to download and install all modules to a node_modules folder. I place the single-executable version of Node.js in the same folder. The installer just needs to package that folder, and you have a Windows-specific package of Node.js and your application’s module dependencies ready to install.

TCP socket connection from the web browser


Web browsers do not support communicating with TCP hosts other than web servers. In this post I take a different tack: I demonstrate a relay, written with Node.js, that receives data from the browser over websockets and sends it to a TCP socket. Data received over the TCP socket is similarly relayed back to the browser. This approach can also be used with UDP and other IP protocols.

JavaScript implementations in most modern browsers have typed arrays, which can be used to manipulate binary data. Latency and performance of JavaScript are important factors to consider; some hosts may have tight timing requirements for responses that may be hard to meet.
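For example, a client could pack binary data with a DataView before sending it over the websocket. This is a sketch of my own, not code from the relay; the layout (a 16-bit field followed by a 32-bit float) is purely illustrative.

```javascript
// Pack an 8-byte binary message using typed arrays.
var buffer = new ArrayBuffer(8);          // 8 raw bytes
var view = new DataView(buffer);
view.setUint16(0, 0x1234, false);         // big-endian, byte offset 0
view.setFloat32(4, 1.5, false);           // big-endian, byte offset 4

var bytes = new Uint8Array(buffer);
console.log(bytes[0].toString(16));       // prints "12" -- high byte first
// socket.send(buffer);                   // websockets support binary frames
```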

The websockets implementation used by the relay is based on the ws module. socket.io is also a good fit, but I wanted to stay as close to vanilla websockets as possible. The ws module can be installed as follows:


npm -g install ws

The client

Here’s the implementation of a test client. It requests the relay to open a new socket connection to www.google.com on port 80. It then sends an HTTP GET request, and shows the response to the GET request in a DIV element.

You’ll need some familiarity with jQuery to follow the code. Since I use the CDN hosted version of jQuery, you’ll need an internet connection.

<html>
<head>
  <title>Test Client</title>
</head>
<body>
  <div id="output">Output</div>

  <script src="http://code.jquery.com/jquery-1.7.2.min.js"></script>

  <script>
  $(document).ready(function() {
    var config = {
      relayURL: "ws://192.168.0.129:8080",
      remoteHost: "www.google.com",
      remotePort: 80
    };

    var client = new RelayClient(config, function(socket) {
      socket.onmessage = function(event) {
        $('div#output').html(event.data);
      };
      var get = 'GET / HTTP/1.1\r\n\r\n';
      //var get = new Blob(['GET / HTTP/1.1\r\n\r\n']);
      socket.send(get);
    });
  });

  function RelayClient(config, handler) {
    var connected = false;

    var socket = new WebSocket(config.relayURL);

    socket.onopen = function() {
      socket.send('open ' + config.remoteHost + ' ' + config.remotePort);
    };

    socket.onmessage = function(event) {
      if (!connected && event.data == 'connected') {
        connected = true;
        handler(socket);
      }
    };
  }
  </script>
</body>
</html>

The relay

The relay implementation receives a message from the client containing an open request followed by the remote host and port. After a connection is established with the remote host, it sends a connected message to the client. After that, all messages from the client are simply relayed to the host, and vice-versa.

The ws module is used in tandem with the express web application framework. The express framework is set up to serve static files from the folder where the script is located.

var express = require('express');
var http = require('http');
var net = require('net');
var WebSocketServer = require('ws').Server;

var app = express();
app.use(express.static(__dirname));
var server = http.createServer(app);
server.listen(8080);

var wss = new WebSocketServer({server: server});

wss.on('connection', function(ws) {
  // new client connection
  var connected = false;
  var host = undefined;

  ws.on('message', function(message) {
    if (!connected && message.substring(0, 4) == 'open') {
      var options = message.split(' ');
      console.log('Trying %s at port %s...', options[1], options[2]);
      host = net.connect(parseInt(options[2], 10), options[1], function() {
        connected = true;
        ws.send('connected');
      });
      host.on('data', function(data) {
        console.log('Got data from %s, sending to client.', options[1]);
        ws.send(data);
      });
      host.on('end', function() {
        console.log('Host %s terminated connection.', options[1]);
        ws.close();
      });
    } else {
      console.log('Got data from client, sending to host.');
      host.write(message);
    }
  });
});

Some considerations

The data being sent and received is UTF-8. Blob support is currently limited to Firefox and desktop WebKit-based browsers. The relay mechanism can be extended to support other protocols such as UDP.

A simple read-only configuration module for Node.js


There are already a few configuration modules available for Node.js. I built myself a simple read-only configuration module for Node.js as a learning experience. The module reads configuration information from a JSON file like the following:

{
    "db":
    {
        "host": "localhost",
        "port": "1433"
    }
}

The module, called config.js, is initialized by calling the init method. The callback function passed to the init method is called when the initialization is complete. All the objects in the JSON file, which we have called config.json, can be accessed from the module object.

var fs = require('fs');

var config = function() {
    // constructor
};

config.prototype.init = function(file, cb) {
    fs.readFile(file, function (err, data) {
        if (err) {
            console.log(err);
            cb(-1);
        } else {
            var json = JSON.parse(data);
            for (var o in json) {
                config.prototype[o] = json[o];
            }
            cb(0); // no error
        }
    });
};

module.exports = new config();

This is how the module can be used.

var config = require('./config.js');
config.init('config.json', function(resp) {
    if (resp != 0) {
        console.log('Could not load config file.');
        return;
    }
    console.log('host: ' + config.db.host);
    console.log('port: ' + config.db.port);
});

Once initialized, the configuration values are available from any script of your Node.js application that requires the module.

Why you don’t want to use the module above

Node.js already provides a neat mechanism to read JSON, you can simply require the JSON file:

var config = require('./config.json');
console.log(config.db.port);

The module system caches the file, so subsequent requires don’t parse it again. If you need to re-read the file after it has been modified, you’ll need to delete its entry from require.cache before invoking require again. The cache is keyed by the resolved file name:

delete require.cache[require.resolve('./config.json')];

You can delete the cache entry when the file changes by using the watchFile function exported by the File System module.

Node.js client to server with socket.io


Socket.io is usually employed by browser apps communicating with a Node.js server application. It is, however, possible to create a client in Node.js if you need to talk to the same server application. It is also possible for the server to return values by calling a function that the client passes to it.

You’ll need to install socket.io and socket.io-client using npm as shown below. We also use express for serving static HTTP content.

npm -g install socket.io socket.io-client express errorhandler

This is how a client connection can be established. The namespace ns is used for communicating with the server. The client emits the call event with parameter p1, and a function parameter that receives a response code and additional data. Remember to export the global node_modules folder in NODE_PATH before running the script with node, e.g. export NODE_PATH=/usr/local/lib/node_modules/. npm -g list | head shows the path to node_modules.

var io = require('socket.io-client');
var serverUrl = 'http://localhost:8080/ns';
var conn = io.connect(serverUrl);
 
var p1 = 'hello';
conn.emit('call', p1, function(resp, data) {
    console.log('server sent resp code ' + resp);
});

This is how a server can serve the client above.

var http = require('http');
var express = require('express');
var app = express();
app.use(express.static(__dirname + '/'));
var errorhandler = require('errorhandler');
app.use(errorhandler()); // development only
var server = http.createServer(app);
 
var io = require('socket.io').listen(server);
var ns = io.of('/ns');
ns.on('connection', function (socket) {
    socket.on('call', function (p1, fn) {
        console.log('client sent ' + p1);
        // do something useful
        fn(0, 'some data'); // success
    });
});

server.listen(8080);

Socket.io makes event-based communication really easy.

Broadcast to all sockets


The broadcast server example from the socket.io how-to is reproduced below:

var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
  socket.broadcast.emit('user connected');
});

io.sockets.emit('to all');

The user connected event will be broadcast to all sockets except the one that just connected. The to all event is broadcast to all the sockets.

You can also broadcast some event to all the sockets in a namespace. For example:

var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
  socket.broadcast.emit('user connected');
});

var ns1 = io.of('/ns1');
ns1.on('connection', function (socket) {
  socket.broadcast.emit('user connected');
});

ns1.emit('to all');

The to all event is emitted to all the sockets in ns1. If you were to emit using io.sockets.emit instead, the event would not be received by any of the sockets in ns1.

The client script that receives messages for namespace ns1 may look like this (it assumes the socket.io client script, served by the server at /socket.io/socket.io.js, has already been loaded):

<script>
  var ns1 = io.connect('http://localhost/ns1');
  
  ns1.on('to all', function () {
    console.log('to all');
  });
</script>