Adventures in SSL – Part II: Integration Strategy
In my first post about SSL integration on my site, I discussed how I came to a decision about a certificate issuer. I chose DigiCert, and have been very happy with them. One great bonus was their extensive list of instructions for setting up the certs on almost any web server known to man. So even though Part II of this series was intended to be about installation, I think DigiCert has that covered. Their instructions for nginx were spot on, so I wouldn’t be able to add anything meaningful to them anyway.
But buying and installing the certificate is a little different from actually using it. This post will focus on how I integrated the certificate into the site and what additional nginx configuration changes I made to support that strategy.
After kicking it around for a while I realized I really had two options: I could either convert the entire site to https or convert as few pages as possible (e.g. just the login and register pages). The argument for limiting the use of https is that, all else being equal, the web server needs a little more CPU to encrypt and decrypt the https traffic. This is apparently a particular concern with nginx, as even its creator has said SSL can drag down performance for high-traffic sites. Since I’m not expecting Amazon-level traffic, this wasn’t a big deal for me.
Another argument for limiting the use of https is that some low-cost CDNs, such as Amazon CloudFront, don’t support https traffic. This was a real concern for me. I will eventually want to move my images, screencasts, stylesheets, and JS files to a CDN, so the fewer https pages I have, the less of an issue this would be.
Related to this, some posts I read claimed that browsers will refuse to cache images, CSS, and scripts if they are served over https. In my testing with Charles against Firefox and IE on Windows, I did not see that behavior: any files that could be cached by the browser were cached. Yes, it was a limited test, but it covers a lot of the target base of my app. I believe either this used to be the case and no longer is, or it’s one of those old wives’ tales that people just assume is true but have never really taken the time to test.
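For reference, here’s a minimal sketch of the kind of nginx directives I use to encourage caching of static assets, which work the same whether the assets are served over http or https (the /static/ path and the 30-day expiry are placeholders for illustration, not my actual settings):

# hypothetical example: explicit cache headers for static assets
location /static/ {
    expires 30d;                      # sets Expires and Cache-Control: max-age
    add_header Cache-Control public;  # allow shared caches to store the response
}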
I saw a couple of benefits to using https for the whole site. The first was that it simplified my application architecture. For instance, say you have a login page that’s intended to be served over https, but it includes a common header image that’s present on all pages. That image also has to be served over https on the login page, or the user will get a popup warning that the page contains both secure and insecure content. That message is at least annoying, if not scary, to some users, so it’s best to avoid it by ensuring the image is served via https. But that means you may end up keeping two copies of that image so it can be served over both https and http, or your configuration may become more complex in order to support two virtual servers pointing at the same image file on disk. Either way it’s a complicating factor that I wasn’t thrilled about wasting time on. If the entire site is served over https, this issue goes away.
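To make that concrete, here’s a rough sketch of what the dual-protocol setup would have looked like in nginx, with the same static files exposed by both an http and an https server block (the document root is made up for illustration):

# hypothetical sketch: exposing the same static files over both protocols
server {
    listen 80;
    server_name www.mysite.com;
    root /var/www/mysite;              # shared document root...
}

server {
    listen 443;
    ssl on;
    server_name www.mysite.com;
    root /var/www/mysite;              # ...duplicated in the https server block
    ssl_certificate /path/to/pem/file;
    ssl_certificate_key /path/to/key/file;
}

It works, but every change to paths or caching rules has to be kept in sync across both blocks.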
Secondly, it would be easier to configure than having only some pages served via https. For instance, let’s say the login page is https. If someone asks for that page via http, the server should be nice and redirect them to https, but for almost all other pages it should let regular http requests process normally. These exceptions are easy to handle for one or two pages, but beyond a couple they quickly become difficult to manage effectively.
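If you do go the partial-https route, the selective redirect might look something like this (the /login and /register paths are just examples of the pages you’d want to protect):

# hypothetical sketch: forcing https for only a couple of pages
server {
    server_name www.mysite.com mysite.com;
    listen 80;

    # push just the sensitive pages over to https
    location ~ ^/(login|register) {
        rewrite ^/(.*)$ https://www.mysite.com/$1 permanent;
    }

    # everything else is served over plain http
    location / {
        # ... regular http handling here ...
    }
}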
Lastly, my application is targeted at kids aged 10 to 15, so for me, the more security the better. As with any site that relies on cookies to identify logged-in users, it’s theoretically possible to hijack someone’s session via the cookie value, and if that were to happen it would mean some seriously bad press for me. Again, if the entire site is accessed over https, this issue goes away.
So, as you can probably guess, I decided to serve the entire site over https. The big question I haven’t answered here is what effect this had on performance; I’ll discuss that in the final installment of this series. But for those also using nginx, below is an excerpt of the config changes I made to support this. It should be self-explanatory, but leave me a comment if you need help working through it.
# non-secure site - send all requests to https
server {
    server_name www.mysite.com mysite.com;
    listen 80;

    location / {
        rewrite ^/(.*)$ https://www.mysite.com/$1 permanent;
    }
}

# secure site
server {
    server_name www.mysite.com mysite.com;
    listen 443;
    ssl on;

    ssl_certificate /path/to/pem/file;
    ssl_certificate_key /path/to/key/file;

    .....
}