Author Archives: Duane

Pushing Ahead

I think one of the key things that will help push things forward is, and always has been, our web of trust. On the surface it seems like just a way to unlock features on the website, such as increasing the length of time certificates are valid, or getting your name or company details onto certificates. However, in my opinion the big advantage of our web of trust over most other internet identity schemes, currently largely unrealised, is that it is technology neutral. This gives us an edge over things like the PGP web of trust, which is locked to a specific technology. If at any time in the future the world moves away from PKI and X.509 certificates, we can simply integrate the new technology into our system and keep going like we always have.

Also, an insightful comment submitted on the previous post suggested that if we managed to convert 10% of self-signed websites, we'd be well on our way to having a lot more mind share. I guess this is where things like inclusion in the Debian ca-certificates package start to become significant, because that can then be leveraged to make certificate chaining easier once we start offering the option to have certificates signed by a higher-trust certificate for those with 50 points or more, which hopefully will happen within the next 24 hours, all going well!

So I guess this begs the question: does anyone have contacts at any distros or other high-profile sites using self-signed certificates that we can talk to? If you're able to convince them to start getting their certs signed by us, it would push things along that little bit further, that little bit quicker. And of course getting assured will improve the overall trust of the network, especially the more intertwined it becomes.

The battle for inclusion heats up

While our progress with the Mozilla Foundation is currently (and has always been, for that matter) in go-slow mode, we have been making progress on other fronts. Of late a number of distributions have either included us (such as Knoppix) or are putting considerable thought into including us (such as Ubuntu), with Debian the latest distribution to include us.

While this may not seem like much to the naysayers (I still get told no one will ever use CAcert because it's not included, which I find strange), it shows that we are getting more widely accepted and, more to the point, gaining credibility.

Conference – MilLANium 2005

http://www.millanium.at/ May 20-22, Vienna, Austria. Some Assurers will also be at MilLANium, a LAN party held in Vienna's Gasometer. Please send a mail beforehand to cyberkov at quintessenz.org.

Unassured Certificates Updates

I really don't think a consensus was reached over the issue of what to do with unassured people issuing certificates. Long term, my thoughts were to disable most functions for anyone not assured, but that requires some kind of critical mass which I just don't think we have at present.

I think intent plays a big part here. By that I mean that if you are using CAcert certificates to secure email, or even just IRC, you're more likely to stop using CAcert and go back to self-signing if you're not assured. Whereas those pushing to stop unassured people issuing certificates tend to fall into one of two camps: those already assured, or those who think SSL can in some way greatly increase security, and I guess to an extent those already assured are perhaps leaning towards the latter as well.

So I guess here is my challenge to those wanting a more secure internet: get out there and start assuring more people, because that leads to the critical mass which allows us to make decisions based on merit rather than knee-jerk reactions.

Conflicts of interest

What seems like a relatively simple thing in the SSH world, a warning when a host's fingerprint changes, would, you'd think, be invaluable in the SSL world as well. The reason this is important comes down to a simple problem: Verisign has the potential to do a lot of things. After all, they're the ones left guarding a couple of the root name servers (which control the entire DNS system), they also have direct control over several TLD zones (.com/.net), and to top things off they have their root certificates in every browser out there.

However, that's just the tip of the iceberg. Verisign is also one of the largest providers of snoop services for the US Government, and for anyone else able to afford their services, and this is where a warning would come in handy.

If Verisign decided to redirect traffic away from your website, by making their name servers authoritative for your domain and pointing your host name at their proxy server, which in turn relays the traffic on to your real website, they could. But wait, you're using SSL on your webmail so no one can listen in on your passwords and/or emails, right? WRONG! This is where my earlier point about Verisign having their root certificates in browsers comes in: they can issue a new certificate for their proxy server, say for "*" (a wildcard matching any domain/host name/everything), and if you connect over HTTPS the browser will never warn you that you are hitting a different certificate, and that all your traffic is now being proxied, captured, prodded, poked and anything else you can think of.

What's that I hear you say, it's illegal to intercept traffic like that? Think again: the NSA is doing it to all traffic crossing US international borders and has said that any and all of it is fair game. Not to mention that the US has some interesting laws under which they could force Verisign into doing it, then throw in a gag order for good measure!

So back to my original statement: if browsers cached certificates and checked that they haven't changed, any interception like this would warn the user actively, rather than relying on the passive method of the user going and viewing who issued a certificate.
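To make the idea concrete, here's a minimal sketch in Python of what that SSH-style check could look like. The cache structure, host names and trust-on-first-use behaviour are my own assumptions for illustration, not anything browsers actually implement:

```python
import hashlib


def fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(cert_der).hexdigest()


def check_certificate(host: str, cert_der: bytes, cache: dict) -> bool:
    """Return False if the certificate seen for a host has changed.

    Mirrors SSH's known_hosts behaviour: remember the first certificate
    seen for each host, and flag any later mismatch for the user.
    """
    seen = cache.get(host)
    if seen is None:
        cache[host] = fingerprint(cert_der)  # trust on first use
        return True
    return seen == fingerprint(cert_der)


# A wildcard certificate chained to a trusted root would still validate,
# but the cached fingerprint would no longer match:
cache = {}
check_certificate("mail.example.org", b"original certificate", cache)
print(check_certificate("mail.example.org", b"proxy certificate", cache))  # False
```

The point isn't the hashing, it's the cache: validation against a root store answers "is this certificate signed by someone trusted?", while the cache answers the much more useful "is this the same certificate I saw yesterday?".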

This then leads to an interesting dilemma for browser developers: if on one hand you state that all traffic going via SSL/TLS is encrypted and can't be read by others, and on the other hand it really can, doesn't that make you liable for making misleading statements?

The guys working on Mozilla software don't seem to think so, nor do they seem to think it's a good idea to actually provide this security for their users; they'd rather you lived under a rock and thought commercial certificate authorities were perfect entities. So far I've been met by nothing but a wall of silence on their newsgroups when the topic comes up, and when I filed a bug report, it was hastily marked as invalid.

I've said it before and I'll say it again: I trust SSL in most browsers with my credit card number at most, and I certainly wouldn't be trusting it if I were a Chinese dissident!

CAcert now has a fully functioning OCSP setup

CAcert now has an operational, RFC-compliant OpenCA OCSP responder. All certificates issued after 2005-05-16 should have the OCSP service URL automatically included, and your OCSP client should check periodically for certificate status. The responder issues signed responses over HTTP (the OCSP address is not a normal website; you can only connect to it with an OCSP client, such as OpenSSL). Once your client is running, you can tell it to connect to http://ocsp.cacert.org or http://ocsp.cacert.org:2560.
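As a quick way to see what an OCSP exchange looks like with nothing but the OpenSSL command line, the sketch below builds a throwaway CA and certificate (purely for illustration; in practice you'd use your own server certificate and the CAcert root that signed it) and then constructs an OCSP request for it offline:

```shell
# Build a throwaway CA and a certificate it signed:
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=Demo CA" \
    -keyout ca.key -out ca.pem -days 1
openssl req -newkey rsa:2048 -nodes -subj "/CN=demo.example.org" \
    -keyout demo.key -out demo.csr
openssl x509 -req -in demo.csr -CA ca.pem -CAkey ca.key \
    -set_serial 1 -out demo.pem -days 1

# Construct an OCSP request for that certificate and inspect it:
openssl ocsp -issuer ca.pem -cert demo.pem -reqout request.der
openssl ocsp -reqin request.der -req_text

# Against a live responder you would instead send the query directly:
#   openssl ocsp -issuer ca.pem -cert demo.pem -url http://ocsp.cacert.org
```

The responder looks up the serial number named in the request and returns a signed good/revoked/unknown answer, which is why the client needs the issuer certificate as well as the certificate being checked.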

To activate OCSP in Firefox, use the settings below.

  • Click on the Tools menu, then select Options.
  • After the Options window appears, select Advanced.
  • Scroll down until you get to the Validation section, OCSP will be the last option.
  • By default “Do not use OCSP for certificate Validation” is selected. Change to the second option, “Use OCSP to validate only certificates that specify an OCSP service URL”.
  • Click OK to close the Options window.

We run our OCSP responder on port 2560 (the OpenCA default), and we also make it available as a vhost in Apache on port 80, which will be important for anyone stuck behind a firewall and unable to connect to ports other than 80 or 443.

We now issue all certificates with the OCSP responder URI listed as http://ocsp.cacert.org, and we plan to distribute servers around the world via round-robin DNS. If/when load or bandwidth become a problem in the future, we can simply add more OCSP responders in a similar fashion to adding secondary name servers (DNS), and it seems things could easily be made highly distributed with our current configuration.

Due to the threat model used in developing the OCSP RFC, high availability will be a key issue in running/maintaining OCSP services: any client with OCSP turned on will fail to connect to any site whose certificate doesn't have a valid OCSP response. As far as I'm aware, Mozilla products currently do not have any form of OCSP caching, so reading signed/encrypted email on a plane in Thunderbird could be difficult at this point in time. Microsoft is apparently developing an OCSP client for its next version of Windows/MSIE that does some caching, although it will be interesting to see how well that works.

Domains with CAcert in them.

A question came up the other day: what policy do we have about third parties registering domains that contain the word "CAcert"? In the past few days I've learned of a few more domains either registered or applied for, and while I can assume (hopefully correctly) that most people will do the right thing, applying for them either to point back to our main website or to host localised content and support, there is of course, like everything else in life, the potential for abuse.

Up until now there has been reasonably small risk of anything bad occurring, but as things progress this is something that should be taken seriously, as abuse would undermine our efforts to market ourselves: unless labelled correctly, these sites could be mistaken for official offshoots, when in actual fact the main website is the only site we officially operate.

The Mozilla guys have recently published their trademark policy, and it includes a clause on domain names that contain Mozilla trademarks:

If you want to include all or part of a Mozilla trademark in a domain name, you have to receive written permission from Mozilla. People naturally associate domain names with organizations whose names sound similar. Almost any use of a Mozilla trademark in a domain name is likely to confuse consumers, thus running afoul of the overarching requirement that any use of a Mozilla trademark be non-confusing. If you would like to build a Mozilla, Firefox Internet browser or Thunderbird e-mail client promotional site for your region, we encourage you to join an existing official localization project.

To receive written permission, please download and follow the directions as outlined in the Domain Name License.

There are a lot of examples out there of other community projects spawning domain names based on the original project for localisation/regionalisation purposes, and in at least one case, the Plone Foundation requires all domains to be handed over to them.

The simple solution might be to offer <country code>.CAcert.org (such as br.CAcert.org) or the equivalent under one of our other domains (.com/.net), and to ask everyone nicely to refrain from purchasing confusing domains and to request a sub-domain from us instead.

I’m guessing we need to start thinking about official policies on other things as well that might be used against the spirit in which they were originally created.

All in the name of marketing

So you're a commercial certificate authority looking for an edge over the other companies doing the same thing; why not offer some kind of insurance!

Well, that's exactly what GoDaddy has done: they're offering a US$1000 warranty, but it's the same snake-oil warranty that most other CAs offer.

So anyway, Gerv from the Mozilla Foundation made a nice little post to the Mozilla newsgroup today about how he couldn't find out from their website exactly what the warranty covers, so he decided to phone them up and ask.

Long story short, the comments made by the sales representative say it all:

“Well say, for example, I own www.happycompany.com and I have a Verisign certificate. Then, a fraudster registers www.happy-company.com, gets a certificate from you and rips off my customers. Is that situation covered? Would you pay out?”

“Well, no. You see, we’re not securing you, we’re securing the other guy. You have to be registered with us.”

and this;

“Have you ever paid out under the warranty program?”

“No. It’s really there just to reassure you that it’s a true 128-bit certificate, and to make you feel better about purchasing it.”

I'm really not all that surprised by this, I guess, as it's the same snake oil that's been pushed and marketed all along.

Do users care about pop-up warnings?

I've seen a couple of interesting things lately. First, a post on one of the Mozilla newsgroups explaining how little people really know about security in general, and pop-up warnings more specifically, and how much time and effort it takes to get people to actually think before they act when a warning pops up asking them for a decision.

The story went along these lines: a user gets their Windows computer infected; a computer-literate friend reformats it, installs ZoneAlarm et al., and tells the user to only click OK on warning messages that pop up directly after running a program. The user gets re-infected, and the friend asks, "Didn't you use ZoneAlarm correctly?", to which the user replies, "Yes, I clicked OK every time a warning came up".

The moral of this story is that a little education can go a long way; alternatively, just use a Mac or Linux and the problem is solved.

Next up, a link just sent to me about an online banking server (within a server farm) in New Zealand that was serving an expired certificate for about 11 hours. After trawling through their logs they found that, of the 300 users who potentially received pop-up warnings, only 1 refused to continue using the website. The bank tried to play down the incident, saying that most people presumably saw that the warning was only for an out-of-date certificate and correctly assumed very little was wrong. I think the paper running the article should have really gone to town berating both the bank for letting this happen and the end users who, while correct this time, simply clicked through a warning. With all the phishing scams around, and people being ripped off left, right and centre, you'd think the rest of society would have gained a clue by now, but that just doesn't seem likely any time soon with all the manually user-installed viruses doing the rounds.

In reality this is nothing new. The people who get infected time and time again generally don't care, and that will continue until they're forced to care, usually when they lose their bank or credit card information to some scammer; then they'll be screaming blue murder about how they weren't protected, when in actual fact they're not doing enough to protect themselves. People proactively protect themselves from mugging in their day-to-day lives (i.e. not walking down a dark alley in the middle of the night); it's just a pity the analogy doesn't quite transfer. Actually, the internet equivalent here is having a policeman standing in the alley saying "I wouldn't go any further if I were you", and people keep going anyway.

Now properly supporting subjectAltName

I'm pleased to announce that we finally worked out the correct way not only to issue certificate requests with subjectAltName (SAN) extensions, but to have certificates issued with the correct SAN extensions, and this is important for a number of reasons. The reason it took so long to get this implemented correctly was the poor and misleading OpenSSL documentation on the topic, as well as the fact that the proper extension name in the RFC is dNSName, while OpenSSL's configuration calls it DNS instead.
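For anyone fighting the same documentation, here's a minimal sketch of the OpenSSL configuration and commands involved; the host names and file names are examples only, not our actual setup:

```shell
# Minimal OpenSSL config for a CSR carrying subjectAltName entries.
# Note the config keyword is "DNS", not the RFC's "dNSName".
cat > san.cnf <<'EOF'
[ req ]
prompt             = no
distinguished_name = req_dn
req_extensions     = v3_san

[ req_dn ]
CN = example.com

[ v3_san ]
subjectAltName = DNS:example.com, DNS:www.example.com, DNS:mail.example.org
EOF

# Generate a key and a certificate request using that config:
openssl genrsa -out server.key 2048
openssl req -new -key server.key -config san.cnf -out server.csr

# Verify the request really carries the alternative names:
openssl req -in server.csr -noout -text | grep -A1 'Subject Alternative Name'
```

The last command is worth making a habit of; whether the SAN extension actually made it into the request is exactly the part the documentation leads you astray on.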

At first glance, both MSIE and Firefox appear to support SANs correctly, which means you can have multiple host names, even from completely separate domains, and it will simply just work. We're still working on setting up hosts for a full complement of browser testing, but you'd have to assume other browsers should, for the most part, also support this feature.

The other thing of note is that multiple commonNames are ignored on certificates; only the first one is accepted and used, so if you want to do anything other than wildcards this is a tad limiting. Someone also sent me a short Perl script that can be used to easily generate valid certificate requests with SAN extensions. I'm contemplating ignoring multiple commonNames and just issuing certificates for the primary commonName. Note that the commonName is ignored if any valid SAN extensions are present on the certificate, so if you want to cover the host named in the commonName, it also has to be listed as a SAN (and I'm sure this will catch a few people out).

While this is useful for a single website with SSL, we're not sure whether Apache or other server software will allow multiple vhost entries to share the same certificate and do all the handshaking properly, and neither Apache nor the Mozilla browsers implement the TLS handshake's ability to use multiple certificates across multiple vhost entries. It's also highly useful for mail servers known by multiple host names, such as a server with an interface on the inside of a corporate network as well as an external one.

We’ll keep our wiki page on this up to date as we learn more, or as new code gets added to browsers etc…