I’m typing this from 37,000 feet in the air. As is usually the case on airliners, the available Internet connectivity is very slow. More importantly, the latency is high. That’s a problem for many websites, which choke and take forever to do anything. Annoyingly, what they finally do (generally just load some nicely styled text) is way out of proportion to the resources they’ve consumed. The already-limited connection is horribly underutilized.
Of course none of these observations is new to anyone who’s tried to work while flying high in the sky. But they illustrate a larger problem with networks and how we rely on them, especially in the context of websites.
The network is the worst of everything
Over the past few decades computers have gotten obscenely fast. CPU and RAM resources have shot up tremendously, both in how much you can fit in a single form factor and how much power you get for your money. Networks have gotten faster too, but not to the same degree as the computers themselves. Why is that? A couple of reasons.
Bandwidth requires infrastructure. Moving serious amounts of bits around means burying fiber-optic cable, and progress there has been slow. And as bandwidth increases, so does demand. Today’s Internet shepherds vastly more bytes than it did a decade ago, but most of the growth is in the new market for streaming video. Because videos can be added to a page for “free,” they’ve become far more prevalent, raising the amount of bandwidth used everywhere.
Latency is an even harder problem. Speed-of-light delay is one fundamental limit: no amount of technological advancement will allow your website to violate special relativity. This is particularly problematic on the flight I’m on, as the airline’s Internet service uses satellites. Sending the connection up into space makes the latency much worse than the usual method of connecting to cell towers on the ground.
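The physics here is easy to check with back-of-the-envelope arithmetic. A minimal sketch, assuming a geostationary satellite (the common case for in-flight Internet; some newer services use much lower orbits) acting as a simple bent pipe:

```python
# Best-case round-trip time over a geostationary satellite link.
# Assumption: a "bent pipe" path (ground -> satellite -> ground),
# ignoring all terrestrial hops, queuing, and processing delays.

C_KM_PER_S = 299_792        # speed of light in vacuum
GEO_ALTITUDE_KM = 35_786    # altitude of a geostationary orbit

def geo_rtt_ms():
    """Minimum round-trip time in milliseconds."""
    one_way_km = 2 * GEO_ALTITUDE_KM   # up to the satellite, back down
    round_trip_km = 2 * one_way_km     # request out, response back
    return round_trip_km / C_KM_PER_S * 1000

print(round(geo_rtt_ms()))  # roughly 477 ms
```

Nearly half a second of latency before a single router or server has done any work at all, which is why every extra request a page makes hurts so much up here.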
Of course none of these problems of bandwidth and latency are specific to airplanes; it’s just that the problems become most obvious and acute up here.
As annoying as these issues are, they would be even worse if my browser didn’t have Disconnect installed to block the large number of trackers placed onto most websites. A study by the authors of Ghostery, another anti-tracker plugin, found that over half the time spent loading pages went to trackers. And there can be oodles of them: the worst offenders had over 100 trackers loading, generally because the initial trackers on the site pulled in more of their own.
I’m not going to get into the ethics of web trackers themselves here, but having over 100 of them on a single page is abhorrently wasteful. With tracker blocking disabled, I tried loading some of the sites Ghostery called out and found that, indeed, they were completely unusable for several minutes, and not much better after they finally loaded. Some kept sending out pointless requests even after loading, degrading the connection for everyone else on the plane.
Trackers certainly aren’t the only problem with these sites, to be clear. Talking about all of the issues harming web performance could be a post unto itself. But they are certainly one of the major offenders.
Designing sites for all connections
You might think that the problems of jet-setting software engineers aren’t very relevant in the grand scheme of web design. And you’d be right. But being on a plane merely exposes a problem that many of our users have all the time. Remember that as engineers we generally have excellent computers on excellent Internet connections.1 Most of our users have it far worse than we do. And the problem is even greater for users far from the US, since most of the servers they’re connecting to, and uploading tracking information to, are here.
An unfortunate tendency in many modern websites is what I call the katamari design pattern: a bunch of random elements from a wide variety of sources get glued together to ship something quickly. This inevitably uses far more resources than necessary and leaves unhappy users waiting forever for your site to become usable.
Sites should be looked at as a whole. Only what is actually needed should be added, and dependencies should be closely watched so that they don’t cause problems of their own.
- Case in point: people complaining about how their ISP “only” gives them 100 megabits of service. ↩