Wednesday, September 24, 2014

It's the Age of Asynchronous

I'm not sure how many people have noticed this yet, but Firefox 30 deprecated synchronous XHR on the main thread. They said that it leads to a degraded user experience, and they're correct - in most cases. But I've used it on occasion without any serious side effects; I just had to be careful about what I was doing.

Still, I should probably stop doing it. How can I call synchronous XHR Ajax if the A in Ajax stands for Asynchronous? Take away the A and it's just Jax. Not cool. Of course, the X stands for XML and most people use JSON instead these days. But what is Ajaj? That doesn't even make any sense! Let's move on...

A common use of synchronous XHR is to ensure order of operations. If you have to get data from 2 URLs before doing some kind of processing, you can't guarantee which request will return first. If you can't use synchronous XHR anymore, you could nest the calls inside callbacks so only one runs at a time...but then you're still only doing one at a time. If one call takes 3 seconds and the other takes 2, you have a total wait time of 5 seconds before you can continue. If you could somehow make both calls at once but still ensure the results are processed in a certain order, your total wait time is only 3 seconds. Saving 2 seconds might not sound like much, but it can really add up. And when you have a human being waiting for some on-screen response, 2 seconds is an eternity. Plus, nesting the calls sequentially in callbacks might be fine for 2 or 3 of them, but it gets crazy when you need to stack 5 or 6 callbacks deep.
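The arithmetic behind that claim can be sketched in a few lines of plain JavaScript (the durations are the hypothetical 3-second and 2-second calls from above):

```javascript
// Two hypothetical calls taking 3 seconds and 2 seconds.
var durations = [3, 2];

// Nested callbacks run one after the other: total wait is the sum.
var sequentialWait = durations.reduce(function(sum, d) {
    return sum + d;
}, 0);

// Firing both at once: total wait is just the slowest call.
var parallelWait = Math.max.apply(null, durations);

console.log(sequentialWait); // 5
console.log(parallelWait);   // 3
```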

There must be a better way.

There is! If you use jQuery, the when().done() functions can handle the difficult task of making multiple asynchronous calls and keeping the responses in order.

The when() function accepts any number of Deferred or Promise objects as arguments. When all of them have resolved, the function passed to done() is called with the results of the Deferreds as arguments, in the same order they were given to when().

Since Ajax calls return Promises in jQuery, they can be used as arguments to when(). For Ajax Promises, the results passed to the function given to done() will be 3-element arrays. The first element in the array will be the returned data, the second will be the status text, and the third will be the jqXHR object. I've thrown together a working example of this...

Here is my test HTML page. The JavaScript near the bottom is the important part.

<!doctype html>

<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Async Order Test</title>
</head>

<body>

  <p>when().done() test page</p>

  <script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>
  <script>

$(function() {

    $.when(

        $.get('something/a'),
        $.get('something/b')

    ).done(function(a, b) {

        console.log(a);
        console.log(b);

    });

});

  </script>

</body>

</html>

And here is my Node/Express server app.

var express = require("express");
var app = express();

app.get("/something/a", function(req, res, next) {
    // simulate a long response time for this path.
    // we won't send the response for 3 seconds.
    setTimeout(function() {
        res.status(200).send("this is a");
    }, 3000);
});

app.get("/something/b", function(req, res, next) {
    res.status(200).send("this is b");
});

// ------------ static content
app.use(express.static("public"));

// ------------ start the listening
var server = app.listen(3000, function() {
    console.log("listening on port %d", server.address().port);
});

I've set up the /something/a resource to take a long time to respond, while the /something/b resource responds as quickly as possible. Therefore "b" should pretty much always return first even though it is second in the list passed to when(). Looking at the console output, it is clear that the "a" response is printed first despite returning last - when() preserved the argument order.
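To actually use those responses, each argument to the done() function can be unpacked by index, since it is a [ data, statusText, jqXHR ] array. Here is a small sketch of that unpacking - handleResults is my own hypothetical helper, not a jQuery function, and the jqXHR objects are stubbed out as empty objects so the shapes can be shown standalone:

```javascript
// Each argument has the shape jQuery passes for an Ajax Promise:
// [ data, statusText, jqXHR ]
function handleResults(a, b) {
    var aData = a[0]; // the response body from the first call
    var bData = b[0]; // the response body from the second call
    return aData + ' / ' + bData;
}

// Simulating the arrays jQuery would pass (jqXHR stubbed out):
console.log(handleResults(
    ['this is a', 'success', {}],
    ['this is b', 'success', {}]
)); // "this is a / this is b"
```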


The when() function can be used for other types of Deferreds as well - it is not limited to Ajax calls, although those are perhaps its most common use.

If you're not using jQuery (perhaps you have a similar issue on the server side in Node), there are other libraries available that provide similar functionality. Check out Async.js for a good one. If you are a hardcore computer science fanatic, you could implement something yourself. Here is an example I quickly threw together for doing it in Node...

var http = require('http');

function waitFor(urls, cb) {

    var countDown = urls.length;          // the latch: responses still pending
    var retVals = new Array(urls.length); // results, kept in URL order

    // returns a response handler bound to a specific slot, so the
    // for loop below can't change idx out from under it
    function processResponse(idx) {
        return function(response) {
            var str = '';
            response.on('data', function(chunk) {
                str += chunk;
            });
            response.on('end', function() {
                fillInData(idx, str);
            });
        };
    }

    function fillInData(slot, data) {
        retVals[slot] = data;
        countDown--;           // one more response is in
        if (countDown === 0) {
            cb(retVals);       // all responses received
        }
    }

    for (var i = 0; i < urls.length; i++) {
        http.get(urls[i], processResponse(i));
    }

}

var urls = [ 'http://localhost:3000/something/a',
             'http://localhost:3000/something/b' ];

waitFor(urls, function(results) {
    console.log(results);
});

My example is not quite as flexible as what jQuery or Async.js offer, but you can see the basics of how it works. Given an array of URLs and a callback, my waitFor function requests data from all the URLs asynchronously but only calls the callback when every request has completed. The array of data passed to the callback is in the same order as the URLs were given.

It works using a countdown latch mechanism. For each response received, the countdown is decremented. When the countdown reaches 0, all responses have come in and the callback can be called. The callback is passed an array populated with each response's data. The data goes into the proper array index because the index of the URL was passed along to the function that handled the response.

The only unusual thing is how the processResponse callback is given to the http.get function. Calling processResponse with an index actually returns another function, which is the real callback for http.get. The reason for this is to create a new closure for the index, so that the response callbacks are not affected by the for loop continuing to run and incrementing the index. Without that extra function, the data from all responses would almost always go into the same position in the retVals array - index 2, the value of i after the loop finishes.
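The same countdown-latch idea generalizes from URLs to any callback-taking task, which is roughly the pattern that libraries like Async.js package up. Here is a quick sketch of that generalization - waitForTasks is my own name, not a function from any library:

```javascript
// Run an array of tasks "in parallel"; each task is a function that
// takes a done(result) callback. The final callback gets the results
// in task order, no matter which task finishes first.
function waitForTasks(tasks, cb) {
    var countDown = tasks.length;
    var retVals = new Array(tasks.length);
    tasks.forEach(function(task, idx) {
        task(function(result) {
            retVals[idx] = result; // slot preserved by forEach's closure
            countDown--;
            if (countDown === 0) {
                cb(retVals);       // every task has finished
            }
        });
    });
}

// Usage: the first task finishes last, but the results stay in order.
var finishSlow;
waitForTasks([
    function(done) { finishSlow = done; }, // completes later
    function(done) { done('fast'); }       // completes immediately
], function(results) {
    console.log(results);
});
finishSlow('slow'); // now both are done; logs [ 'slow', 'fast' ]
```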

Working with lots of asynchronous functions can be confusing and frustrating sometimes. A good library like jQuery can help make it less painful.

Amphibian.com comic for September 24, 2014
