Wednesday 27 July 2011

Knockout.js: Making an observable that fits into an undo system

Knockout is a JavaScript library that helps you to create rich, responsive display and editor user interfaces with a clean underlying data model. Any time you have sections of UI that update dynamically (e.g., changing depending on the user’s actions or when an external data source changes), KO can help you implement it more simply and maintainably.
(from the homepage)

I'm using Knockout fairly extensively throughout a new app I'm building. This app has a full undo/redo system for all user actions, and that caused a bit of a problem with Knockout. By default, Knockout immediately updates the data model with the results of user actions. Furthermore, when it notifies you of changes to values, it only tells you the value the data has changed to - by that point it's too late to find out what the value was before.

Of course there are hacky ways around this, but a much better solution, suggested by (someone on a mailing list I need to look up again), was to create a 'subclass' of observable that allows access to the previous value of a variable.

Here's the code:
$("document").ready(function() {

/**
* An observable that allows access to the last value. Useful for undo
* purposes.
*
* Usage:
* viewModel.foobar = ko.lastValueObservable();
* viewModel.foobar.subscribe(function(val) { alert("foobar changed from " + viewModel.foobar.lastValue() + " to " + val); } );
*/
ko.lastValueObservable = function(initValue) {
var _value = ko.observable(initValue);
var _lastValue = initValue;

var result = ko.dependentObservable({
// Just return the real value
read: function() {
return _value();
},

write: function(newValue) {
// store the last value before writing...
_lastValue = _value();

_value(newValue);
}
});

// Add a new function to return the last value
result.lastValue= function() {
return _lastValue;
};

return result;
};
});
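
To give a feel for how this fits into an undo system, here's a rough sketch; the undoStack array below is just illustrative and isn't part of my actual app:

var undoStack = []; // hypothetical stack of undo functions

var viewModel = {
    foobar: ko.lastValueObservable("initial")
};

viewModel.foobar.subscribe(function (newValue) {
    var previous = viewModel.foobar.lastValue();
    // Record how to undo this change. A real undo system would also need a
    // flag to avoid recording new undo entries while an undo is being applied.
    undoStack.push(function () {
        viewModel.foobar(previous);
    });
});

viewModel.foobar("changed");   // user edit
undoStack.pop()();             // undo it; foobar() is "initial" again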

Tuesday 26 July 2011

HTML5: Transferring localStorage data between machines & browsers

An app I've been building recently makes extensive use of the HTML5 localStorage API, which is beautifully simple and just works. This system stores a limited amount of data (normally a maximum of 5MB) as name / value string pairs on a per-domain basis in the user's browser. Check out my other post on localStorage for more details.
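
For reference, the core API is just a handful of calls (values are always stored as strings):

// Store a value (always coerced to a string)
localStorage.setItem("username", "ben");

// Read it back (returns null if the key doesn't exist)
var name = localStorage.getItem("username");

// Remove a single item, or clear everything for this domain
localStorage.removeItem("username");
localStorage.clear();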

I came across an issue today where a colleague had saved some data locally that I needed to use, and it seemed there wasn't an easy way to transfer the data across. Luckily a few minutes of fiddling showed it was in fact super-easy.

Just write out the entire contents of localStorage as a JSON string, email it (or otherwise transfer it) to the other computer, and then read it back into localStorage.

Here's the code. I ran this by opening the JavaScript console in my browser and just typing in the commands:
// Write out the whole contents of the localStorage..
JSON.stringify(localStorage);
// Copy the output..

// Now on the other computer, read it in again..

var storage = !! PASTE THE DATA HERE !!;
for (var name in storage) { localStorage.setItem(name, storage[name] ); }
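
If you end up doing this more than once, it's easy to wrap into a pair of helpers (the function names here are just illustrative):

// Export: returns the whole of localStorage as a JSON string.
function exportLocalStorage() {
    return JSON.stringify(localStorage);
}

// Import: takes a JSON string (as produced above) and writes each
// name / value pair back into localStorage.
function importLocalStorage(json) {
    var data = JSON.parse(json);
    for (var name in data) {
        if (data.hasOwnProperty(name)) {
            localStorage.setItem(name, data[name]);
        }
    }
}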

Monday 25 July 2011

Creating an XML-to-JSONP converter using node.js

A client of ours had a simple request to include an RSS feed in a mobile website and have it dynamically update. The obvious solution to this was to use AJAX, parse the XML and generate the required HTML.
However, the RSS feed was on a different domain to the website itself, so we quickly ran into 'traditional' cross-domain restrictions. Simply put, you can only AJAX data from the same domain that your website is served from.

Now, the JSONP format is widely considered a great way to get around this issue. The way this works is that a script tag is dynamically inserted into the HTML document. The src attribute of the script tag is set to the data source, e.g.

<script src="http://www.example.com/foo.js?callback=myCallback" type="application/javascript"></script>


The 'callback' query string parameter is used to specify the name of a JavaScript function that will be executed when the data is downloaded. This function will be passed the data itself as the first parameter. E.g. in this example, we'd create a JavaScript function like this:

function myCallback(data) { console.log(data); }


The data parameter would then be a JavaScript object containing the data that was loaded. Pretty neat, I think you'll agree!

If you use the inspector in your browser to see what's actually downloaded, you'll see something like this:

myCallback( {  a:1, b:2, c:3 } );


It's simply the JSON-encoded data wrapped in a call to the function we specified in the query string parameter earlier on.
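
For completeness, here's roughly what that dynamic script tag insertion looks like in code, using the same example URL and callback name as above:

// Define the callback that will receive the data.
function myCallback(data) {
    console.log(data);
}

// Dynamically insert a script tag pointing at the JSONP endpoint.
var script = document.createElement("script");
script.src = "http://www.example.com/foo.js?callback=myCallback";
document.getElementsByTagName("head")[0].appendChild(script);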

What we're going to do in the remainder of this article is create a node.js server that we can pass the URL of an RSS feed and the name of a callback to, and get a JSONP response back.

Now, going back to the requirements, our client asked for an RSS feed reader. Of course RSS is an XML-based format, not a JSON format, so we're going to need to convert XML into JSON.

Happily, there's a module for doing exactly this, called xml2js, available via npm, the node package manager.

sudo npm install -g xml2js


The -g flag tells npm to install the package globally, so it's available to all node apps.
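
To get a feel for what xml2js gives you before we wire it into the server, here's a tiny standalone sketch; the XML string is just made up for illustration, and it uses the same parser API we'll use in the full code below:

var xml2js = require("xml2js");

// Parse a small XML string into a plain JavaScript object.
var parser = new xml2js.Parser();
parser.addListener('end', function (result) {
    // The result is a plain object, e.g. roughly { item: { title: 'Hello' } }
    // (the exact shape depends on the xml2js version).
    console.log(JSON.stringify(result));
});
parser.parseString("<rss><item><title>Hello</title></item></rss>");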

We're also going to use journey, a node module that simplifies mapping request URLs to JavaScript functions.

sudo npm install -g journey


Here's the code that sets up the server itself and creates the mapping for the URL /xml2jsonp. This is the file that should be run using the node binary.

server.js


require.paths.push('/usr/local/lib/node_modules'); // my installation required this for any modules to load.. your directory may be different..

var http = require('http'),
    journey = require('journey'),
    xml2Jsonp = require('./xml2jsonp'); // this is the file containing the implementation of the actual conversion.. We'll create this in a bit!

//======================================================================
// Create the routing table mapping requests => handling functions
//======================================================================
var router = new (journey.Router);
router.map(function () {

    // Send a welcome message if you hit the root
    //==================================================================
    this.root.bind(function (req, res) {
        res.send("Welcome");
    });

    // map /xml2jsonp
    //==================================================================
    this.get("/xml2jsonp").bind(function (req, res, params) {

        // the 'params' parameter is filled in with the query string as an object,
        // provided your client includes a query string in the request url (?a=b&c=d)

        // the xml2Jsonp.get function is defined in the xml2jsonp.js file shown later on.
        // we pull the 'url' and 'callback' parameters out of the query string and pass them in.
        xml2Jsonp.get(res, params.url, params.callback, function(result) {
            // we've got the text back! use res.send to respond to our client!
            res.send(200, {'Content-Type': 'application/javascript'}, result);
        });
    });
});

//======================================================================
// Set up the server
//======================================================================
http.createServer(function (req, res) {

    // This section is boilerplate code from the journey docs: read the request
    // from the client and pass it on to the routing table above.
    var body = "";

    req.addListener('data', function (chunk) {
        body += chunk;
    });
    req.addListener('end', function () {
        // Dispatch the request to the router
        router.handle(req, body, function (result) {
            res.writeHead(result.status, result.headers);
            res.end(result.body);
        });
    });

}).listen(80);

console.log('Server running..');


Now here's the interesting bit: the implementation of the actual XML-to-JSONP conversion.

xml2jsonp.js

var
    url = require("url"),
    xml2js = require("xml2js"),
    http = require("http");

/**
 * Downloads the XML file, converts it to JSONP, and calls the callback function with the result.
 *
 * @param res the http response object
 * @param xmlUrl the url of the xml we want to download
 * @param callbackFunctionName the name of the JSONP callback function
 * @param callback the function to run once this call completes. The first parameter will be the result as a string.
 */
exports.get = function(res, xmlUrl, callbackFunctionName, callback) {

    if (!xmlUrl || !callbackFunctionName) {
        callback(callbackFunctionName + "({error:'invalid parameters'});");
        return;
    }

    // parse the url passed in..
    var urlIn = url.parse(xmlUrl);

    var options = {
        host: urlIn.hostname,
        port: 80,
        path: urlIn.pathname,
        headers: {
            "user-agent": "node.js" // some web servers require a user agent string..
        }
    };

    // download the XML as a string..
    downloadTextFile(options,
        function success(data) {

            // use the xml2js library to parse it into a JS object.
            var parser = new xml2js.Parser();
            parser.addListener('end', function(result) {
                // Use JSON.stringify to convert it back into the JSON text we'll return. Note that in a production
                // environment you'll probably want to omit the 'null, 4' part of this call. It makes the output more
                // human-readable but increases the file size.
                // Note too that we're wrapping the JSON in a call to the JSONP callback function - this is the key part of the JSONP format!
                callback(callbackFunctionName + "(" + JSON.stringify(result, null, 4) + ");");
            });
            parser.parseString(data);
        },
        function error(msg, e) {
            // Note: because we're in an async environment it's important that we call the callback even if things fail,
            // otherwise the client will hang around waiting for a response.
            callback(callbackFunctionName + '({error:"' + msg + '"});');
        });
};

/**
 * Call to download a text file from the specified url
 * @param options {host:[host name], port:[port], path:[path], headers:{...}}
 * @param success called with the downloaded text
 * @param error called with (msg, e) if the download fails
 */
function downloadTextFile(options, success, error) {
    http.get(options, function(res) {

        console.log("Got response: " + res.statusCode);

        var data = "";

        res
            .on('data', function (chunk) {
                data += chunk;
            })
            .on('end', function() {
                success(data);
            });

    })
    .on('error', function(e) {
        var msg = "Failed to download text from " + options.host + options.path + " " + e.message;
        console.log(msg);
        error(msg, e);
    });
}
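
With both files in place and the server running, calling the service from a web page is just an ordinary JSONP request. Here's a minimal sketch using jQuery (the host names are hypothetical, and jQuery isn't required by anything above - a hand-rolled script tag like the one earlier works just as well):

// Fetch a converted feed via JSONP using jQuery.
$.ajax({
    url: "http://example.com/xml2jsonp",            // wherever the node server is deployed
    dataType: "jsonp",                              // jQuery inserts a script tag and adds the callback param
    data: { url: "http://example.org/feed.xml" },   // the RSS feed to convert
    success: function (data) {
        // 'data' is the feed as a plain JavaScript object - build your HTML from it here
        console.log(data);
    }
});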


I'm hosting the above code on a Joyent node.js SmartMachine, a (at the time of writing) free service from Joyent. I thoroughly recommend trying it out if you're looking for free node.js hosting. It's a little fiddly getting started, but if you follow the docs at http://wiki.joyent.com/display/node/Getting+Started+with+a+Node.js+SmartMachine and ignore the many obsolete documents I kept finding on Google, then you should be OK.

Update: I've added this project to GitHub, and updated it a little to also provide JSON-to-JSONP conversion.
https://github.com/benvium/nodejs-xml2jsonp