A recent survey revealed that only 45% of the top 1 million sites have HTTPS enabled. The real number is even lower when you consider that most of those sites don't serve all of their content over HTTPS, just a small part of it, such as the login page.
So what types of sites should use HTTPS all the time?
Banks, as you might expect, use HTTPS for their online banking. However, many users will click through from the informational part of the site, which typically doesn't use HTTPS. This creates a problem: a man in the middle could change that link to point at another site, and really, would you notice that the domain name was slightly different? Most of the big UK banks are getting there now; however, Lloyds still doesn't allow HTTPS browsing of its main content, meaning you can't really trust the online banking link unless you are very careful. I think a bank's entire informational site should use HTTPS, if only to provide a trustworthy link to the online banking.
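To make that concern concrete, here is a minimal sketch (the bank domains are made up for illustration) of the kind of rewrite an attacker's proxy could apply to any page served over plain HTTP:

```python
def tamper(html: str) -> str:
    """What a man-in-the-middle proxy might do to an unencrypted page:
    swap the online banking link for a lookalike domain."""
    return html.replace(
        'href="https://onlinebanking.example-bank.com"',
        'href="https://onlinebanking.examp1e-bank.com"',  # note the digit "1"
    )

# The informational page, fetched over plain HTTP, passes through the attacker:
page = '<a href="https://onlinebanking.example-bank.com">Log in</a>'
print(tamper(page))
```

The login link still points at an HTTPS site, so the browser shows a padlock; it's just the wrong site.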
A few years ago it was common for search engines not to offer any kind of HTTPS access. At first this seems sensible: the results you get back are public, and if you aren't logged in, what's the issue? The problem is what you search for. For most people, what they search for is private; people routinely give away quite personal information to search engines. For kids this might be the questions common to growing up; for adults it might be questions about health that we don't want to disclose to the world. What if we are using a public WiFi connection? Others on that connection could find out what we are searching for, and embarrass or even extort us with it. Thankfully Google now uses HTTPS. Bing is available over HTTPS, though unless you remember to change the URL, you won't be using it. DuckDuckGo uses HTTPS by default and redirects if you try to use HTTP.
The first two examples were the more obvious cases where HTTPS is needed; what about a news site? We don't log in, and the information is public, so do we need HTTPS here? The BBC is a big news brand worldwide, and people should be able to know whether it is being censored. Without HTTPS there is no way for citizens in China, North Korea, Egypt and elsewhere to know that negative stories aren't being quietly removed or altered by censors. Unfortunately, trying to browse www.bbc.co.uk/news using HTTPS just redirects you back to the HTTP site. Trying the same with the Guardian tells you the certificate is invalid, and if you look at the certificate (using Firefox, for example) you'll find a long list of Fastly's customers (the CDN the Guardian seems to use).
What about a site that offers downloads of free software? Again the information is essentially public, but can you trust running software on your computer that could have been changed by a man in the middle? If you were the NSA and wanted to spy on someone, wouldn't an easy option be to insert your spyware into any executable they download? Of course there are code signing and checksums, though I doubt most non-technical users check the checksums, and most software isn't signed anyway. GitHub gets this right: HTTPS all the time. Unfortunately SourceForge and download.com get this wrong, and the PHP, Python and Ruby sites all fail to give you an HTTPS link to actually download the software.
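For the minority who do verify checksums, the check itself is simple. A sketch using Python's standard `hashlib` (the file name and contents here are stand-ins for a real download):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for the installer you downloaded:
with open("download.bin", "wb") as f:
    f.write(b"pretend this is an installer")

digest = sha256_of("download.bin")
print(digest)
# In practice, compare this against the digest published on the project
# site. Note that if the checksum itself is served over plain HTTP, a man
# in the middle can swap both the file and the checksum, so the published
# digest must come over HTTPS for the check to mean anything.
```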
Social Networks / Web-Apps
There is a whole slew of sites that let you create data and share it, or not, with other people. Often this data is meant to stay private outside our friends, so it should be protected from rogue wireless access points, governments and the like. Facebook and Twitter now do this; LinkedIn redirects back to HTTP straight after you log in, as does RunKeeper.
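One reason dropping back to HTTP after login matters is that the session cookie then travels in the clear, where anyone on the same network can copy it and impersonate you. Marking the cookie Secure stops the browser from ever sending it over plain HTTP, which only works if the whole site is HTTPS. A sketch with Python's standard `http.cookies` (the token value is made up):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "abc123"          # hypothetical session token
cookie["session"]["secure"] = True    # never sent over plain HTTP
cookie["session"]["httponly"] = True  # hidden from page JavaScript

# The header the server would send after login:
print(cookie.output())
```

A site that sets Secure but then links users back to HTTP pages either breaks their session or quietly drops the flag.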
So basically I think all sites should use HTTPS. In the web world we have various best practices: don't use tables for layout, don't write sites that only work in IE or require Flash. Making a site HTTPS-only should be one of them. I would be shocked to see a sysadmin using telnet in the 21st century, and serving a non-HTTPS website should become just as shocking.
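The mechanics of going HTTPS-only are cheap: everything arriving over HTTP gets a permanent redirect to the HTTPS equivalent. As a sketch, the logic is no more than this, written here as a tiny WSGI app so it's visible (in practice it's a line or two of web-server configuration):

```python
def redirect_to_https(environ, start_response):
    """Answer every plain-HTTP request with a 301 to the HTTPS URL."""
    host = environ.get("HTTP_HOST", "example.com")
    path = environ.get("PATH_INFO", "/")
    start_response(
        "301 Moved Permanently",
        [("Location", f"https://{host}{path}")],
    )
    return [b""]
```

Using 301 rather than 302 lets browsers and crawlers remember that the HTTPS URL is the canonical one.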