Real-time Web Apps with Server-Sent Events (pt 1)
Recently I've been researching how to build real-time web applications, where content is pushed to clients rather than clients having to poll or refresh the browser. At this point a lot of people jump straight to WebSockets, but they can often be more powerful than you need. WebSockets provide a rich protocol for bi-directional, full-duplex communication, which is great when you need many-to-many communication between clients. If all you need is one-to-many push from the server, then Server-Sent Events (SSEs) are a powerful and simpler alternative. SSEs are sent over traditional HTTP, which means you can use a standard web server rather than a dedicated WebSocket server.
Browser Support
At first glance at the CanIUse website it would appear that WebSockets have better browser support than SSEs. However, there are a number of polyfills that enable SSEs to function in browsers without native support.
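Since the demo below assumes native EventSource support, one simple approach is to feature-detect it and only pull in a polyfill when it's missing. A minimal sketch, assuming a hypothetical polyfill file at /js/eventsource-polyfill.js:

// Load a polyfill only when the browser has no native EventSource.
// The script path is a placeholder for whichever polyfill you choose.
if (!window.EventSource) {
    document.write('<script src="/js/eventsource-polyfill.js"><\/script>');
}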
Example Implementation
Pre-Requisites
The following technologies are required for this demo application:

- Node.js and npm
- Express (along with mustache-express and body-parser)
- Redis (and the node redis client)
Build Interface
index.html
<!DOCTYPE html>
<html lang="en">
<head>
    <title>Realtime Demo</title>
    <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/css/bootstrap.min.css" />
</head>
<body>
    <h1>Realtime Demo</h1>
    <ul id="live-updates"></ul>

    <script src="https://code.jquery.com/jquery-2.1.4.min.js"></script>
    <script>
        var live = {
            init : function() {
                var source = new EventSource("http://localhost:8081/api/updates");

                source.addEventListener("message", function(event) {
                    var data = jQuery.parseJSON(event.data);
                    live.addItem(data.update);
                }, false);
            },
            addItem : function(data) {
                $(live.constructItem(data)).hide().prependTo("#live-updates").fadeIn(1000);
            },
            constructItem : function(data) {
                return "<li>" + data + "</li>";
            }
        };

        $(document).ready(function(){
            live.init();
        });
    </script>
</body>
</html>
The main thing to focus on here is the creation of the EventSource object.
var source = new EventSource("http://localhost:8081/api/updates");
This opens a connection to the updates URI, which will be the endpoint serving our SSEs. We can then consume them by attaching a handler to the message event.
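The EventSource object also emits open and error events, which are handy while debugging the connection. This isn't part of the demo, but you could add something like the following inside init alongside the message listener:

// Optional logging of connection state while developing.
source.addEventListener("open", function() {
    console.log("SSE connection opened");
}, false);

source.addEventListener("error", function() {
    // The browser will retry the connection automatically after an error.
    console.log("SSE connection error");
}, false);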
Now we need to write our Node endpoint to publish messages.
Server-Sent Events Publishing
app.js
var express = require("express"),
    mustacheExpress = require("mustache-express"),
    dataChannel = require("./custom_modules/data-channel"),
    bodyParser = require("body-parser"),
    app = express();

app.engine('html', mustacheExpress());
app.set('views', './views');
app.set('view engine', 'html');

app.use(express.static("./static"));
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({extended: true}));

app.get("/api/updates", function(req, res){
    initialiseSSE(req, res);
});

app.get("/api/post-update", function(req, res) {
    res.render("postupdate", {});
});

app.put("/api/post-update", function(req, res) {
    var json = JSON.stringify(req.body);
    dataChannel.publish(json);
    res.status(204).end();
});

function initialiseSSE(req, res) {
    dataChannel.subscribe(function(channel, message){
        var messageEvent = new ServerEvent();
        messageEvent.addData(message);
        outputSSE(req, res, messageEvent.payload());
    });

    res.set({
        "Content-Type": "text/event-stream",
        "Cache-Control": "no-cache",
        "Connection": "keep-alive",
        "Access-Control-Allow-Origin": "*"
    });
    res.write("retry: 10000\n\n");
}

function outputSSE(req, res, data) {
    res.write(data);
}

function ServerEvent() {
    this.data = "";
}

ServerEvent.prototype.addData = function(data) {
    var lines = data.split(/\n/);

    for (var i = 0; i < lines.length; i++) {
        var element = lines[i];
        this.data += "data:" + element + "\n";
    }
};

ServerEvent.prototype.payload = function() {
    var payload = "";
    payload += this.data;

    return payload + "\n";
};

var server = app.listen(8081, function() {
});
The key part to look at here is the set of headers required to make the SSEs work correctly.
res.set({
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    "Connection": "keep-alive",
    "Access-Control-Allow-Origin": "*"
});
The first three headers are mandatory, while Access-Control-Allow-Origin is optional and is how you control cross-domain access with CORS.
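For example, if the page consuming the stream were served from a single known origin, you could allow just that origin instead of the wildcard (the origin below is only an example):

// Example only: restrict the stream to one known origin instead of "*".
res.set("Access-Control-Allow-Origin", "http://example.com");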
Next is the construction of the SSEs themselves. Colin Ihrig does a fine write-up of the server side of server-sent events, which I used as a resource to put this together.
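To make the format concrete, here is roughly what ServerEvent produces on the wire for a single update (the update text is just an example):

// Example usage of ServerEvent; the resulting string is what gets written to the response.
var messageEvent = new ServerEvent();
messageEvent.addData(JSON.stringify({ update: "Hello" }));

// messageEvent.payload() is now:
//
//   data:{"update":"Hello"}
//   (blank line)
//
// The blank line marks the end of the event, which the browser delivers to the
// EventSource listener as a single message event.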
data-channel.js
var redis = require("redis");

module.exports.subscribe = function(callback) {
    var subscriber = redis.createClient();

    subscriber.subscribe("liveupdates");

    subscriber.on("error", function(err){
        console.log("Redis error: " + err);
    });

    subscriber.on("message", callback);
};

module.exports.publish = function(data) {
    var publisher = redis.createClient();
    publisher.publish("liveupdates", data);
};
Here we are just utilising the Pub/Sub functionality of Redis, which is really simple as you can see.
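If you want to sanity-check the module on its own (assuming a local Redis server is running), a quick throwaway script might look like this:

// Quick test of the data channel in isolation; assumes Redis is running locally.
var dataChannel = require("./custom_modules/data-channel");

dataChannel.subscribe(function(channel, message) {
    console.log("Received on " + channel + ": " + message);
});

// Give the subscription a moment to register before publishing.
setTimeout(function() {
    dataChannel.publish(JSON.stringify({ update: "test message" }));
}, 500);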
To see this working, just download the full working application from GitHub. Then browse to http://localhost:8081/api/post-update/ and fill in the form while viewing http://localhost:8081 in another tab, and you will see the events updating in real time.
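The postupdate view itself isn't shown here, but a minimal form handler for the PUT endpoint might look something like this (the element IDs are hypothetical; the real template is in the GitHub repository):

// Hypothetical client-side handler for the post-update form.
$("#update-form").on("submit", function(event) {
    event.preventDefault();

    $.ajax({
        url: "/api/post-update",
        type: "PUT",
        contentType: "application/json",
        data: JSON.stringify({ update: $("#update-text").val() })
    });
});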
Conclusion
So, as you can hopefully see, this is all pretty straightforward. Node + Redis hugely simplify the server-side functionality, and the client-side integration is uncomplicated.
I'll be doing a follow-up post on how to handle reconnection.
By Simon Baynes