What's Old is New Again

We took the trouble to describe batch computing in some detail because in 2004 this style of user interface has been dead long enough that many programmers will have no real idea what it was like. But if some of the above nevertheless seems familiar, it may be because many of the behavioral characteristics of batch systems are curiously echoed by a very modern technology, the World Wide Web. The reasons for this hold lessons for UI designers.

JavaScript, Java, and Flash support limited kinds of real-time interactivity on web pages. But these mechanisms are fragile and not universally supported; the Common Gateway Interface (CGI), the machinery behind Web forms, remains by far the most important way for web users to carry on two-way communication with websites. And a Web form fed to a CGI program behaves much like the job cards of yesteryear.

As with old-style batch systems, Web forms deliver unpredictable turnaround time and cryptic error messages. The mechanisms for chaining forms are tricky and error-prone. Most importantly, web forms don't give users the real-time interactivity and graphical point-and-shoot interface model they have become used to in other contexts. Why is this?

Batch systems were an adaptation to the scarcity of computer clock cycles; the original computers had none to spare, so only a bare minimum went to impedance-matching with the brains of humans. Web forms are primitive for an equally good reason, but the controlling scarcity lay in the network. In the early 1990s, when the Web was being designed, the cabling and switching fabric to support millions of real-time-interactive remote sessions spanning the planet did not exist. The deficit was not so much one of bandwidth (available bits per second of throughput) as of latency (expected turnaround time for a request/response).

The designers of CGI knew most of their users would be using connections with serious latency problems, on communications links that often dropped out without warning. So they didn't even try for real-time interactivity. Instead, the interaction model for the Web in general, and web forms in particular, is a discrete sequence of requests and responses, with no state retained by the server between them.
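The stateless request/response model can be sketched in a few lines. This is an illustrative simplification, not real CGI plumbing (which passes the form data via environment variables and standard input); the function and field names here are hypothetical. The key point it demonstrates is that each form submission is a complete, self-contained transaction: the server parses it, answers, and remembers nothing afterward.

```python
# Sketch of CGI-style stateless form handling. Each request carries the
# entire form encoded in a query string; nothing survives between calls.
from urllib.parse import parse_qs

def handle_request(query_string):
    """Handle one form submission; no session state outlives this call."""
    form = parse_qs(query_string)
    # Missing fields must be defaulted on every request, because the
    # server has no memory of any earlier exchange with this user.
    name = form.get("name", ["stranger"])[0]
    # The whole "conversation" is this single request/response pair.
    return "Content-Type: text/plain\r\n\r\nHello, %s!" % name

print(handle_request("name=Ada&color=blue"))
```

Chaining such forms into a multi-step dialogue is exactly the tricky part noted above: any continuity has to be smuggled through hidden form fields or cookies, because the server itself keeps no state.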

The batch-to-CGI correspondence is not perfect. The batch analog of dropped connections — permanent interruptions in the act of feeding cards into a machine, as opposed to just unpredictable delays — was relatively rare. And one of the reasons the CGI model is stateless on the server side is that retaining even small amounts of session state can be cost-prohibitive when you might have thousands or millions of users to deal with daily, a problem batch systems never had. Still, the analogy does help explain why the Web was not designed for real-time interactivity.

Today, in 2004, it is largely demand for the Web that has funded the build-out of the Internet to the point where massive real-time interactivity is thinkable as more than a pipe dream. We're still not there; latency and bandwidth constraints are still severe, as anyone who has watched the slow and stuttering progress of a video download can attest.

The lesson here is that the batch processing style is still adaptive when latency is large and unpredictable. We may almost solve that problem for the planetary Web in the next few decades, but there are fundamental physical reasons it cannot be banished entirely. The lightspeed limit is perhaps the most fundamental; it guarantees, among other things, that round-trip latency between points on the Earth's surface has a hard lower bound of a bit over an eighth of a second. [7] In practice, of course, switching and routing and computation add overhead. Nor would it be wise to assume that the Internet will forever remain limited to Earth's surface; indeed, satellite transmission has been handling a significant percentage of international traffic since the 1970s.

The command-line style has also persisted, for reasons we discussed in depth in [TAOUP]. It will suffice to note here that this style survives not merely because of technical constraints but because there are large classes of problems for which textual, command-line interfaces are still better suited than GUIs. Unix programmers, who have retained sophisticated styles of command-line design, already understand these reasons better than anyone outside the Unix tradition, so we will pass over the pro-CLI arguments lightly in this book.

There is a subtler lesson to be drawn from these survivals. In software usability design, as in other kinds of engineering, it is seldom wise to dismiss an apparently clumsy or stupid design by assuming that the engineers of bygone days were idiots. Though engineers, being human, undeniably are idiots on occasion, it is far more likely in the normal course of events that a design you find ridiculous after the fact is actually an intelligent response to tradeoffs you have failed to understand.

[7] A light-second is just shy of 300,000 kilometers (precisely, 299,792.458). The circumference of the Earth is a hair over 40,000 kilometers (precisely, 40,075 at the equator). Halve the Earth's circumference for the length of the optimal path between antipodal points; double it because we're talking about a round trip. Optimistically assume that switching and routing add no latency. The hard lower bound is then about 0.134 seconds, which is by coincidence just about the minimum response time of a human reflex arc.
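The footnote's arithmetic is easy to check directly. This is a back-of-the-envelope sketch using the standard values for the speed of light and the Earth's equatorial circumference:

```python
# Lower bound on round-trip latency between antipodal points on Earth,
# assuming a lightspeed signal and zero switching/routing overhead.
C_LIGHT_KM_S = 299_792.458        # speed of light in vacuum, km/s
EARTH_CIRCUMFERENCE_KM = 40_075   # equatorial circumference, km

one_way_km = EARTH_CIRCUMFERENCE_KM / 2   # optimal antipodal path
round_trip_km = 2 * one_way_km            # request plus response
latency_s = round_trip_km / C_LIGHT_KM_S  # just over an eighth of a second

print(round(latency_s, 3))
```

Real links do worse: fiber carries light at roughly two-thirds of its vacuum speed, and cables rarely follow great-circle routes, so practical round-trip times between distant points are comfortably above this floor.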