

On Thu, Jul 25, 2013 at 09:01:46PM -0400, tz wrote:
> For the interim, the solution might be to have an extension that
> besides pushing PFS (and alerting when it doesn't work) would cache
> the Cert hashes or more and allow a browser (e.g. firefox) to run with
> all CAs as untrusted, but then do a verification on a per-site basis.
> The big hole in web page security is that there is the web page, then
> there is the extra info like javascript and css.
> So, for example, https://amazon.com might be accepted, but
> https://images-na.cdn.azws.com is in the background ready to rewrite
> the entire page.
> And the page will be broken until you manually "view source" and open
> a link and allow the cert/CA/page for the
> javascript/css/images/metadata.
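The per-site cert-hash caching the parent post describes is essentially trust-on-first-use pinning. A minimal sketch of the idea (the `pins` store, function names, and statuses here are all hypothetical, not anything an existing extension implements):

```python
import hashlib
import ssl

# Hypothetical in-memory pin store; a real extension would persist this.
pins = {}

def fingerprint(host, port=443):
    # Fetch the server's leaf certificate (PEM) and hash it.
    pem = ssl.get_server_certificate((host, port))
    return hashlib.sha256(pem.encode("ascii")).hexdigest()

def tofu_check(host, fp):
    # Trust on first use: pin the first fingerprint seen for a host,
    # then flag any later mismatch for manual review.
    if host not in pins:
        pins[host] = fp
        return "pinned"
    return "ok" if pins[host] == fp else "MISMATCH"
```

A browser hook would call this per origin, including for subresource hosts like the CDN example above, so the javascript/css loads get checked, not just the top-level page.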

I've run my primary browser with no trusted CAs, manually TOFUing
(trust-on-first-use) certificates for sites, for months on end.  It's
slightly easier than
"view source" to use control-shift-K (in Firefox) and reload the page,
then watch for resource load errors in the console.  Some fairly small
adjustments to browser UIs would make this use case much easier.  The
biggest problem is that Firefox's SSL exception implementation only
allows a single certificate per hostname, so load-balanced hosts such as
p.twimg.com, which toggle between multiple valid certificates, need a
fresh exception every time the certificate rotates.
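The load-balancing problem goes away if the store keeps a *set* of accepted fingerprints per hostname instead of a single exception. A sketch of that pin-set approach (again hypothetical names, not Firefox's actual override store):

```python
from collections import defaultdict

# Hypothetical pin-set store: each host maps to the set of fingerprints
# the user has accepted, so a load-balanced host rotating between
# several valid certificates only prompts once per certificate.
pin_sets = defaultdict(set)

def accept(host, fp):
    # User approved this certificate for this host.
    pin_sets[host].add(fp)

def check(host, fp):
    if not pin_sets[host]:
        return "unpinned"       # first contact: prompt the user
    return "ok" if fp in pin_sets[host] else "unknown-cert"
```

With this, p.twimg.com toggling between two certs would trigger two prompts total, not one per rotation.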

(I also VPN this browser through a fairly trusted datacenter, so I'm not
TOFUing over the local WLAN of course.)

It's fairly helpful to use SSL errors as a firewall to help me avoid
accidentally loading sites whose TOS I refuse to accept, such as G+.

It also functions as a primitive adblock for some sites since you don't
have to accept the certificates for doubleclick.net et al.