BSides Detroit 12 Interviews 11

This week Wolfgang and I are joined by a rogue. @Rogueclown, that is. You might remember Nicolle Neulist from the Rats and Rogues Panel episode back in January. This time Nicolle is talking about HSTS, or HTTP Strict Transport Security. Since the interview, she’s gone on to do more research, which means her talk will be all the better.

Abstract: Although not yet widely deployed, HTTP Strict Transport Security (HSTS) is a standards-track web security mechanism that lets a website force a browser to accept the site only when it is delivered over SSL/TLS. HSTS has been implemented in two of the three most widely used browsers: Firefox and Chrome. It is designed as a transparent way to protect users from sniffing attacks and spoofed pages by forcing SSL/TLS on pages that should have it. This presentation explores the design of HSTS as well as its implementation.

At its core, HSTS is a single header telling the browser to accept the site only over a secure and trusted connection for a certain time. Except for a few dozen sites, for which HSTS protection has been hard-coded into Chrome at the request of the site owners, HSTS data for both Firefox and Chrome is saved in the user’s profile. If a user tries to load a site that is in the browser’s HSTS database, and the site is delivered either over plaintext or with a bad certificate, the browser returns an error that the site is not available. HSTS is designed to be transparent to the user, which is good for keeping users off some malicious sites, but can also be dangerous: it is easy to take away a protection that a user doesn’t even know is there.

One common way webmasters mis-implement HSTS is by putting an HSTS header on a subdomain (www.site.com) without putting one on the main domain (site.com), even if site.com only serves as a redirect to https://www.site.com. Even with HSTS in place, and the database knowing that www.site.com should be accessed securely, a user who types only site.com could reach a malicious site. I will show a demonstration of this in a VM lab, with a rogue DNS server provided by a DHCP server as the attack vector; DNSChanger malware, however, would work just as well.
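To make the subdomain gap concrete, here is a toy sketch (not actual browser code) of how an HSTS store decides whether a remembered entry covers a hostname. The function name and structure are illustrative assumptions; real browsers also handle max-age expiry, internationalized names, and preload lists.

```python
# Toy model of HSTS coverage: does a stored entry force HTTPS for a host?
# Illustrative only; real browser stores are more involved.

def hsts_covers(entry_host, include_subdomains, request_host):
    """Return True if an HSTS entry for entry_host forces HTTPS for request_host."""
    if request_host == entry_host:
        return True
    # With includeSubDomains, the entry also covers hosts beneath it.
    if include_subdomains and request_host.endswith("." + entry_host):
        return True
    return False

# The misconfiguration from the abstract: the header was only ever sent
# from www.site.com, so the browser has no entry for the bare domain.
print(hsts_covers("www.site.com", False, "www.site.com"))  # True
print(hsts_covers("www.site.com", False, "site.com"))      # False: bare domain unprotected
# One fix: send the header from site.com with includeSubDomains set.
print(hsts_covers("site.com", True, "www.site.com"))       # True
```

Note that an entry for www.site.com never covers site.com; includeSubDomains only reaches downward from the host that sent the header, which is why the bare domain needs its own header.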
Webmasters can address this straightforwardly, either by placing an HSTS header at site.com with subdomain coverage enabled, or by adding HSTS headers to all pages at all domains and subdomains. Another implementation flaw involves the threat of an attacker adulterating or deleting a browser’s HSTS database. Since HSTS is transparent, a user is unlikely to notice if the database has been tampered with. I will demonstrate and share code (written as a Metasploit module for the sake of community usability) that removes the HSTS databases of both Firefox and Chrome, and continues to do so in the future, leaving the user vulnerable to malicious sites posing under “legitimate” domain names when a rogue DNS server is in play. Although with root privileges this can be done for all users, even user-level privileges are enough to persistently break a given user’s HSTS protection. Hard-coding, as Chrome does, may work for a small number of sites, but may not be scalable as more sites adopt HSTS.
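The reason user-level privileges suffice is that both browsers keep HSTS state in an ordinary file inside the user’s profile. The sketch below (my assumption of the mechanism, not the Metasploit module from the talk) deletes that file; the file names used here are the ones these browsers have commonly used (Firefox: SiteSecurityServiceState.txt, Chrome: TransportSecurity), though profile paths and names vary by OS and version.

```python
# Hedged sketch: wipe a browser's HSTS state file from its profile directory.
# File names are assumptions based on common Firefox/Chrome profile layouts.
import os

HSTS_STORE_NAMES = {
    "firefox": "SiteSecurityServiceState.txt",
    "chrome": "TransportSecurity",
}

def wipe_hsts_store(profile_dir, browser):
    """Delete the HSTS state file from a browser profile, if present.

    Returns True if a file was removed, False if none was found.
    """
    path = os.path.join(profile_dir, HSTS_STORE_NAMES[browser])
    if os.path.exists(path):
        os.remove(path)
        return True
    return False
```

Because the file is writable by the user who owns the profile, anything running as that user (a scheduled task, for instance) could repeat this on every login, which is the persistence the abstract describes.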

This episode is cross-posted at Rats and Rogues.