This is something I've known for a while but tend to forget until it slaps me across the face. A reader wrote in with something odd she saw on her website. She had a basic search site where some of the content was a bit slow to render. Instead of delaying the search results, she simply used an Ajax call to update the results in real time. I think this is a great idea. Even if the "total" time remains the same (or close to it), the "perceived" time is much lower for the end user. However, in her testing she noticed something odd. She thought she was running N Ajax-based calls all at once, but in her network tools they appeared to come in "chained," i.e., one after another. She had expected that if each call took about one second and she made 30 of them, they would run asynchronously and all complete within one second (or within a second or two, given network latency randomness). I whipped up a quick example so I could see this in action.

First, I began with a simple front-end client that uses jQuery. This topic isn't jQuery-specific, of course, but all good Ajax posts should mention jQuery at least once.

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
<script>
$(document).ready(function() {
	for(var i=1; i<=20; i++) {
		runCall(i);
	}
})

function runCall(x) {
	console.log('doing call '+x);
	$.post("slow.cfm", {"id":x}, function(res,code) {
		$("#show"+x).html("Result was "+res);
		console.log("back and res="+res);
	});
}
</script>

<cfloop index="x" from="1" to="20">
	<cfoutput>
	<div id="show#x#"></div>
	</cfoutput>
</cfloop>

My template has 20 divs, representing search results, and on document load will run 20 calls to fetch data for each div. Warning - I'm using console.log for debugging. This is not a bug. When you click my demo, please do not report this as a bug. Ajax developers should use a browser (or browser+plugin) that supports console! All in all, pretty simple, right? Technically this is not "20 at once" since I have a loop, but I think we agree that it is close enough for our testing.
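If you want to see the stagger in raw numbers rather than in a network waterfall, a tiny timing wrapper works too. This is just a sketch (the `timed` helper is mine, not part of the demo above); it assumes the task you hand it returns a promise, which jQuery's `$.post` does:

```javascript
// Hypothetical helper: logs when a task starts and how long it took.
// Pass it any function that returns a promise (e.g. jQuery's $.post).
function timed(label, task) {
	var t0 = Date.now();
	console.log(label + ' started');
	return task().then(function(res) {
		console.log(label + ' finished after ' + (Date.now() - t0) + 'ms');
		return res;
	});
}
```

Inside the loop you would call `timed('call ' + i, function() { return $.post("slow.cfm", {"id": i}); })` and compare the start lines: gaps between them are the browser holding requests back.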

For the back end I wrote a quick ColdFusion script to simulate a slow process and result. It's hard to write slow ColdFusion code, so I made use of the sleep() function.

<cfset sleep(2000)>
<cfparam name="form.id" default="1">
<cfoutput>#form.id#=#randRange(1,100)#</cfoutput>

Running this in the browser shows some interesting results. I tested in both Chrome and Firefox. While I prefer Chrome, I thought Firefox (plus Firebug) produced the clearest graphical result:

I think you can clearly see that the results are staggered. You can test this yourself by clicking the button below - but have Firebug or Chrome Dev Tools ready to go:

For me, I clearly saw "spurts" of loading. Typically two to four results popped in at once. This is not a bug, though; it's completely normal. The browser limits how many concurrent connections it will open to a single host (typically around six in modern browsers). This is good for both the client and the server. We see this every day when loading a web page, especially over mobile: things 'pop' into view over time. On a broadband connection, however, it can be easy to forget. In this clear-cut example we ask the browser to make a bunch of network requests at once, and the 'throttle' becomes easy to see.
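If you ever want to control this yourself, say, to keep 30 search-result calls from tying up every connection the browser will give your server, you can queue them client-side instead of firing them all at once. The sketch below is my own illustration, not something from the demo; `createPool` and the limit value are made up for the example:

```javascript
// Hypothetical client-side request pool: at most `limit` tasks run at once,
// the rest wait in a queue. Each task is a function that returns a promise.
function createPool(limit) {
	var active = 0;
	var queue = [];
	function next() {
		if (active >= limit || queue.length === 0) return;
		active++;
		var item = queue.shift();
		item.task().then(item.resolve, item.reject).finally(function() {
			active--;
			next(); // a slot freed up; start the next queued task
		});
	}
	return function enqueue(task) {
		return new Promise(function(resolve, reject) {
			queue.push({ task: task, resolve: resolve, reject: reject });
			next();
		});
	};
}
```

With `var enqueue = createPool(4);` you would replace the direct `runCall(i)` loop with `enqueue(function() { return $.post("slow.cfm", {"id": i}); })`, which makes the throttle explicit in your own code rather than leaving it to the browser's defaults.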