Eric Meyer recently made a provocative post titled Securing Web Sites Made Them Less Accessible, making the case that the push to get HTTPS everywhere on the web has harmed the accessibility of many sites. The main argument is this: poor Internet connections like those used by students in rural Uganda have low bandwidth, data caps, and very high latency. Getting around these problems requires a local caching server like Squid, and HTTPS prevents such caches from working. Therefore HTTPS harms accessibility. And there is a good point there. I too have noticed the problems that latency causes on the web; on my last flight I specifically watched time being eaten up establishing TLS connections. Connection latency is a big problem in web usability and accessibility. But I think in the end the argument falls flat.

The first problem is that local caching servers for secured traffic are possible; they just take a bit more work to set up. Many corporate networks require all traffic to go through a proxy, and HTTPS traffic is diverted by placing a corporate certificate authority on the computers and having the proxy masquerade as the domains being talked to. The same thing could be set up on the local server, and the end-user computers could either have the CA installed (which is easy) or be old enough that certificate failures aren’t a serious usability issue. If they are really old, certificate validation could even be disabled altogether, allowing the use of a naïve cache that just sends a self-signed certificate. Instead, Meyer suggests per-site JS service workers to do some local caching. I don’t see how that would help, since it’s still per-computer caching, while per-area caching is what’s needed to really benefit the students.
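To make that concrete, here is a sketch of what such a TLS-intercepting cache might look like using Squid’s ssl-bump feature (Squid 4 or later). The file paths, cache sizes, and CA filename are all illustrative, not a tested deployment:

```
# squid.conf sketch -- paths and sizes are illustrative
# Listen for proxied traffic and impersonate origin servers using a
# locally generated CA (the same CA must be installed on client machines).
http_port 3128 ssl-bump \
    tls-cert=/etc/squid/local-ca.pem \
    generate-host-certificates=on \
    dynamic_cert_mem_cache_size=4MB

# Helper that mints per-host certificates signed by the local CA
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB

acl step1 at_step SslBump1
ssl_bump peek step1     # read the client's SNI first
ssl_bump bump all       # then bump (decrypt and cache) everything

# Generous on-disk caching for a high-latency, data-capped link
cache_dir ufs /var/spool/squid 10000 16 256
maximum_object_size 200 MB
```

Older Squid 3.5 installs spell the certificate option `cert=` rather than `tls-cert=`, so the exact directives depend on the version shipped to such a site.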

The other problem is failing to see how tightly security and accessibility relate to each other. Banking and other such websites may as well not exist if they haven’t been properly secured; in fact they’re an attractive nuisance. If my bank can’t even offer a modicum of security when using their website, I’ll go somewhere else. HTTPS and the push for better security on the web in general have made these sites more secure, not just in terms of content security but also authentication (two-factor and such). There’s still plenty of room for improvement, but there’d be much more without the push. Of course this probably doesn’t apply to the students, but it does apply to many of the people claimed to be harmed by the security push.

National Security Agency slide showing bypassing SSL on Google products
Anything sent via HTTP may as well have been skywritten.

Other websites may seem less important on the surface, but that’s not going to be true for everyone. Even if the content isn’t as vital as banking information, most of us don’t want what we’re looking at arbitrarily spied upon. Unfortunately, anything sent in plaintext is just that. Governments and ISPs[1] around the world are sniffing whatever they can get their hands on. Beyond that sort of impersonal surveillance, many users are in more personally dangerous situations. Content that seems fine to you may get them arrested, imprisoned, or even killed. Meyer made a good point that we should be thinking of more of our users, but that applies just as much to the ones who need high security as it does to the ones with bad connections.

Pushing everything onto HTTPS also creates a sort of herd immunity. It makes spying on the remaining unencrypted traffic less enticing simply because there’s less of it. It makes telling what sort of traffic is important more difficult, because when everything is encrypted, everything looks the same (except for DNS requests). It lets website owners know that the pages they send haven’t been tampered with in transit to add tracking headers, ads, or anything else. Finally, contrary to what Meyer said, the push has made HTTPS “faster, cheaper, and easier for everyone”. It’s become so easy it’s part of the standard process for many web hosts, and free too. All the things it brings improve accessibility once certificate issues are taken care of.
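How easy? As one example, a free certificate from Let’s Encrypt can be had in a couple of commands with their certbot client; the domain and Debian-style package names here are placeholders for whatever a given server actually runs:

```shell
# Debian/Ubuntu-flavored example; package names vary by distro
sudo apt install certbot python3-certbot-nginx

# Obtain a certificate and let certbot update the nginx config itself
sudo certbot --nginx -d example.com -d www.example.com

# Renewal runs automatically, but can be exercised by hand
sudo certbot renew --dry-run
```

That whole exchange takes a minute or two, which is a far cry from the expensive, paperwork-laden process certificates used to require.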

Using the web from a bad connection is no picnic, to be sure. But there are many things that can be done without abandoning HTTPS or setting up pointless per-site service workers. Besides local caching with local certificates or disabled verification, another easy step is installing adblockers on the computers being used. Whatever you think about them in general, it’s hard to condemn Ugandan students for using them to save on their data usage. As Maciej Cegłowski so cutely pointed out in his presentation The Website Obesity Crisis, ads are one of the biggest bandwidth hogs out there. And the resources they use, unlike the ones on the pages themselves, often can’t be meaningfully cached, either. These and other fixes can go a long way toward making the web more usable on limited connections like satellite links.

  1. By the way, don’t expect a VPN to save you; VPN providers, more than anyone else, have a strong incentive to secretly surveil.