I’ve spoken to some very influential people of late, one of whom happened to be Mark Shuttleworth, today at the Ubuntu Down Under conference. He had one particular concern about domain-control certificates: he doesn’t feel comfortable including our current root certificate until we either stop issuing them under our present root (i.e. set up another root certificate for assured certificates, or start issuing unassured certificates from a new root), or alternatively stop issuing them to unassured people altogether.
Also worth mentioning that at one point WebTrust certification came up, but he wasn’t really that concerned about it; he was more worried about the security (or insecurity) of domain-control-type certificates.
This isn’t the first time it’s been suggested that we alter how many root certificates we operate and under what conditions people are allowed to issue from each one. At this point it’s a difficult decision to make, and we’re looking to the community for feedback on the issue (as this will affect a lot of people no matter what happens) and on the best course of action to take.
Comments on this are important!
* One possible solution might be to issue a new root certificate signed by the current root (since this issue only affects server certificates); that way it should work with the least impact on most, if not all, people.
The radical (complete?) solution is to start with one root CA, properly identified and named as THE CAcert Root, on a file system secured from outside influence, then generate a number of secondary CAs, with at least one per category of user and site. This would allow any site to drop authorization for the secondary-CA categories it is reluctant to trust. Without this, or some other means of differentiating the various levels of users and domains, there will always be questions of trust. It would require change across the entire space, but could be activated for new certificates immediately, with the old unassured domains expiring naturally after their six-month life.
The simple solution is to establish a set of secondary CAs for ALL unassured certificates immediately and allow the currently issued unassured domains to expire naturally after their six-month life.
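For what it’s worth, the “new root signed by the current root” idea is just a subordinate (intermediate) CA. A rough sketch of how such a hierarchy could be set up with openssl — all file names and subject names below are illustrative, not CAcert’s actual setup:

```shell
# Generate a root CA key and a self-signed root certificate (illustrative names)
openssl genrsa -out root.key 2048
openssl req -x509 -new -key root.key -subj "/CN=Example Root CA" \
    -days 3650 -out root.crt

# Generate a key and signing request for the subordinate (secondary) CA
openssl genrsa -out sub.key 2048
openssl req -new -key sub.key -subj "/CN=Example Unassured CA" -out sub.csr

# Sign the subordinate CA certificate with the root, marking it as a CA
printf "basicConstraints=CA:TRUE\n" > ca_ext.cnf
openssl x509 -req -in sub.csr -CA root.crt -CAkey root.key -CAcreateserial \
    -days 1825 -extfile ca_ext.cnf -out sub.crt

# A relying party that imports only root.crt can verify the subordinate,
# and anything issued under it, provided the intermediate is sent in the chain
openssl verify -CAfile root.crt sub.crt
```

A distribution that imports only the subordinate, rather than the root, trusts nothing issued from the root’s other branches, which is exactly the kind of differentiation between assured and unassured certificates described above.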
Actually, the Mozilla Foundation’s reaction to multiple root certificates recently was to be against them, although our initial problem was more specific to us. It’s difficult to get others to import one root certificate, let alone many of them. Without intermediary certificates already present in the software, things like client certificates start to be a royal pain, because the software needs to build up the complete chain to sign emails with.
In any case, having a single root and then issuing sub-roots for different purposes wouldn’t help in this specific situation either, because only one of the sub-roots would be imported, not the main root: importing the main root would still allow unassured certificates to chain up to it and be trusted.
I say that the value of inclusion in Firefox/Ubuntu exceeds the value of the currently issued domain certificates. My suggestion would be to revoke them all and put the effort into the “assured” process. If, by revoking them all, the current root certificate would become acceptable, then I suggest this would be the best solution. I assume that through the process of getting points, under WebTrust, someone would be assured to a level that would be acceptable for issuing domain certificates.
Most users do not understand all the warning messages about “expired” or “untrusted” certificates, so this will likely cause no more confusion than they already experience.
Let’s make sure we don’t do something to make Ubuntu happy if it’s going to anger Mozilla: mozilla > ubuntu. Other than that, since unassured certs can’t be “upgraded” (they have to be replaced once you have enough points), I would have no problem issuing unassured certs from a lesser root. The unassured certs should clearly say something like: “Low Trust (class 1) Certificate, learn more at http://cacert.org”. As I keep pointing out (not just to annoy Duane), the US Federal Govt. would consider our assured certs to be class 4 (face to face with 2 IDs, no background check required).
I don’t know much about SSL certificates at this time, except that they’re “supposed” to cost extra money.
But then how come some places say they’re “free”?
But then I think the real question to ask is something like “to be or not to be”.
WHY in the world should SSL certificates cost extra money? Isn’t that something that is SUPPOSED to clearly be a right people have from the start? Right? A “SECURE” web site! Oh my, isn’t that a stretch of imagination! And at a premium cost, too!
I think for now that it’s a scam benefiting those companies with insider knowledge of how to do it. Is it that hard to learn how to set it up yourself? It must be secret knowledge, like how to make an A-bomb, to make your own SSL certificates.
Surely there must be another FREE way just to make a secure SSL connection between two computers!
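There is, in fact: the encryption itself costs nothing. A self-signed certificate secures the connection for free; what a CA-issued certificate adds is third-party verification of who you are talking to. A hedged sketch with openssl — the file names and the CN are made up for illustration:

```shell
# Create a free self-signed key and certificate in one step.
# Browsers will warn that the issuer is untrusted, but the TLS
# connection it secures is encrypted all the same.
openssl req -x509 -newkey rsa:2048 -nodes -keyout server.key \
    -subj "/CN=myserver.example" -days 365 -out server.crt

# Inspect what was generated
openssl x509 -in server.crt -noout -subject
```

The warning dialogs users see with self-signed certificates are exactly the trust question the rest of this thread is about: encryption is free, while attestation of identity is what CAs (paid or community-assured) provide.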
And finally, the bottom line of what I think is that SSL certificates are a scam and they should be free for everybody. It is the people’s common right to have secure web sites! A Boston Tea Party-like revolution against paid certificates should be next!
Personally, I think the assurance part for server certs should be removed anyway, and a system used like call-back and voice authentication, the way most certificates are issued.
Check the domain owner and use some sort of authentication other than forcing a user to find a way to get assured. Myself, I just don’t have the time to do that here. I know I’m not the only one in this situation.
It’s just a thought, so don’t go flaming me here for it.
Reaction to tirekick: In an ideal world there is no crime, and then no police, with their related costs, are necessary. To have secure sites we need secure certificates, and thus a secure form of authentication is necessary, which costs (somebody) money.