I've been looking at Promises/A+ as a way to abstract future values (values that depend on something the user will do later), specifically Yehuda Katz's rsvp.js, and came across Domenic Denicola's You're Missing the Point of Promises, which got me looking at Oliver Steele's Minimizing Code Paths in Asynchronous Code. Now I have another consideration in my code that I had never thought about before.

Javascript is single-threaded (see John Resig's article and Olav Kjær's). That means we generally don't have to worry about the concurrency issues that plague languages with threading (I remember trying to understand Java's threads). Instead, Javascript fakes concurrency with what we used to call, back when programming the original Macintosh, "cooperative multitasking": one bit of code (a "code unit," in the dev.opera article's terms) runs until it is done, then lets the next bit of code run. The programmer has to make sure that each code unit is small enough not to delay any other code, especially code that reacts to user input.

Javascript makes it look like multiple things are going on concurrently by using an event loop. There's a queue of code units, and each event (a mouse click, a timer firing, an AJAX response arriving) puts its handler onto the queue. The Javascript interpreter simply runs each code unit in turn. Note that the event generators themselves run on separate threads; the event queue builds up while the current code is running. It's just that none of the handlers run concurrently.

Putting a code unit on the queue is called asynchronous execution (meaning "not coordinated in time"), as opposed to a normal, synchronous function call, which interrupts the flow of code and then returns. One other consequence of the separate code units is that exceptions stop at the boundary of each code unit; they don't interrupt the event loop itself. So:

<script>
console.log(1);
throw('a');
console.log(2);
</script>
<script>
console.log(3);
</script>

displays (in Chrome; browsers may differ in how they word uncaught exceptions and when those are displayed):

1
Uncaught a
3

Note that console.log(3) still runs; each <script> is its own code unit.

The fact that asynchronous code is run sequentially means that:

<script>
console.log(1);
setTimeout(function() {
	console.log(2);
	throw('a');
}, 0);
console.log(3);
throw('b');
console.log(4);
</script>

displays:

1
3
Uncaught b
2
Uncaught a 

The setTimeout callback is a separate code unit put on the queue immediately (after a "delay" of 0 ms) but, even if the current code takes a long time, it will never run until the current code unit is done and console.log(3) has been displayed. Note that each thrown exception only interrupts the code unit that is running at the time.

One subtlety is that simple busy waiting (which is usually a bad idea anyway) never works:

<script>
console.log(1);
var done = false;
setTimeout(function() {
	console.log(2);
	done = true;
}, 10);
console.log(3);
for (;;){
	if (done){
		console.log('done');
		break;
	}
}
console.log(4);
</script>

displays

1
3

then hangs forever, even though the 10 ms have long since passed and the setTimeout handler has been placed on the queue: the loop never yields, so the handler never gets a chance to run. Firefox eventually detects the long-running script and puts up its "busy script" alert; Chrome just spins until you kill the tab.
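If you really do need to wait for the done flag, the waiting itself has to yield to the event loop. A minimal sketch of polling with a setTimeout that reschedules itself (the 5 ms polling interval is an arbitrary choice):

```javascript
var done = false;
setTimeout(function(){
	done = true;
}, 10);

// Poll from the event loop instead of busy waiting: each check is its
// own code unit, so the setTimeout handler above gets a chance to run.
(function check(){
	if (done){
		console.log('done');
	}else{
		setTimeout(check, 5); // give up control, then try again
	}
})();
console.log(4); // displayed first; the polling never blocks the current code unit
```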

I think we were all burned when first learning AJAX with:

<script>

var data = undefined;

function get(url){
	var request = new XMLHttpRequest();
	request.open("GET", url, true);
	request.onload = function(){
		data = request.responseText;
	}
	request.send();
}

get('generate.php');
console.log(data);
</script>

And no matter how long it had been since the XMLHttpRequest was sent, data was always undefined, because the onload function runs asynchronously. So now we know to use continuation-passing style (known in the Javascript world as "callbacks"):

<script>

function get(url, callback){
	var request = new XMLHttpRequest();
	request.open("GET", url, true);
	request.onload = function(){
		callback(request.responseText);
	}
	request.send();
}

get('generate.php', function(data){
	console.log(data);
});
console.log(1);
</script>

This works correctly, but the console.log(data) is asynchronous; the 1 is displayed before the data, no matter how fast the XMLHttpRequest is or how slow our code is.

The bug-inducing subtlety comes when the code might be either asynchronous or synchronous. For instance, suppose we add a cache for the AJAX data:

<script>
var data;

var cache;
function getwithcache(url, callback){
	if (cache !== undefined) return callback(cache);

	var request = new XMLHttpRequest();
	request.open("GET", url, true);
	request.onload = function(){
		cache = request.responseText;
		callback(request.responseText);
	}
	request.send();
}

getwithcache('generate.php', function(result){
	data = result;
});
console.log(data);
</script>

The first time getwithcache runs, the callback runs asynchronously (from the onload handler), so the final console.log(data) displays undefined.

If the cache has been filled by a previous successful XMLHttpRequest, then the check at the top of getwithcache runs the callback synchronously, meaning that the final console.log(data) displays the cached data.

In addition, if the callback throws an exception, in the first case it will not interrupt the rest of the program, but in the second case it will.

This unpredictability makes for bugs that are hard to track down, since they depend on apparently random factors (here, whether the cache happens to be filled). Steele's advice is to make sure that if code might run asynchronously, then it always runs asynchronously. Promises/A+ guarantees exactly that, at the cost of slower code (the handlers never run immediately).
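Following that advice, the cached branch of getwithcache above can be deferred with a zero-delay setTimeout so the callback is always asynchronous. A minimal sketch, with a hypothetical fetchData standing in for the XMLHttpRequest code so it can be read on its own:

```javascript
var cache;

// Hypothetical stand-in for the asynchronous XMLHttpRequest code above
function fetchData(url, callback){
	setTimeout(function(){
		callback('response for ' + url);
	}, 10);
}

function getwithcache(url, callback){
	if (cache !== undefined){
		// Defer even the cached case, so callers always see asynchronous behavior
		setTimeout(function(){ callback(cache); }, 0);
		return;
	}
	fetchData(url, function(response){
		cache = response;
		callback(response);
	});
}

getwithcache('generate.php', function(first){
	getwithcache('generate.php', function(second){
		console.log(second); // the cached result
	});
	console.log(1); // always displayed before the cached result
});
```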

I don't know that I've ever been bitten by this bug, but one place where it may come up is in event handling. Even though handlers for real user events are run asynchronously, triggering a handler directly with dispatchEvent is synchronous, as far as I can tell:

<button>Click</button>
<script>
var button = document.querySelector('button');
button.addEventListener('click', function(){
	console.log('in click handler');
});

console.log(1);
button.dispatchEvent(new CustomEvent('click'));
console.log(2);
</script>

displays

1
in click handler
2 

even though it ought to be 1, then 2, then in click handler if the handler were run asynchronously. dispatchEvent does provide some exception protection: an exception thrown in the event handler does not propagate to the code that called dispatchEvent. Using jQuery's trigger is also synchronous (but does not stop the exception).

This means that a given handler may be called synchronously or asynchronously, depending on whether the event was generated by the user interface or was synthetic. Just one more thing to keep in mind in our ever-more-complicated Javascript applications.
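Promises/A+ builds the always-asynchronous guarantee in: a promise's then handlers are always called asynchronously, even when the promise has already settled. A minimal sketch using modern native Promises (which follow Promises/A+ on this point):

```javascript
var p = Promise.resolve('the value'); // a promise that is already settled

p.then(function(value){
	console.log('handler:', value);
});
console.log(1); // always displayed first, even though p is already resolved
```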
