The DARPA-funded security auditing project was done in by its own obscurity and by some misconceptions about what security researchers really want.
Developers who discover vulnerabilities in software are most interested in fixing the problem and merging the new code into the latest revision.
The Sardonix project was born from the successes and eventual failure of the Linux Security Auditing Project (LSAP). Through its design, Sardonix encouraged an OpenBSD-style software auditing process, in which researchers audit software packages file by file, looking for basic programming errors that may or may not have security implications. When one researcher completes an audit, the next researcher audits the same software using the same process.
Sardonix's innovation was to create a hall of fame for security researchers, acting as a long-lasting and credible forum from which members could prove that they do in fact possess security auditing skills. The proof would come in the form of a rating system that gives an auditor a higher rating if subsequent audits confirmed he or she had located all the bugs in the code reviewed, and a lower rating if later audits turned up bugs the researcher had overlooked.
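The rating mechanism described above can be sketched roughly as follows. This is a hypothetical illustration only: the class name, method names, and scoring weights are all invented here, and Sardonix's actual formula was surely more involved.

```python
class AuditorRating:
    """Hypothetical sketch of a Sardonix-style auditor rating.

    The score rises when a later audit confirms the auditor's work
    was thorough, and falls when a later audit finds a bug the
    auditor overlooked. All names and weights are assumptions.
    """

    def __init__(self):
        self.score = 0

    def record_confirmed_audit(self):
        # A subsequent audit found no bugs the auditor had missed.
        self.score += 1

    def record_missed_bug(self):
        # A subsequent audit found a bug the auditor overlooked;
        # weighting misses more heavily is an assumption of this sketch.
        self.score -= 2


rating = AuditorRating()
rating.record_confirmed_audit()
rating.record_confirmed_audit()
rating.record_missed_bug()
print(rating.score)  # 1 + 1 - 2 = 0
```

The key design point is that an auditor's reputation is adjusted retroactively, by the findings of whoever audits the same code next, rather than by self-reporting.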
The project aimed to finally make good on the nebulous claim that open-source software is more secure than closed-source software because anybody can audit a program's source code for security issues.
As I've said before, open-source software has an advantage in that anybody can audit the code. However, auditing for security issues happens infrequently at best. When software is developed, the focus is on just making sure it actually builds and runs. If security were a priority, production software would never have vulnerabilities, and we'd never worry about holes in software packages like Apache, Sendmail, or basically anything written in PHP.
Doomed by Obscurity
But even with some of the brightest minds in the security community managing Sardonix, there were a number of important ingredients altogether missing from the project. The first is sex appeal.
Sardonix wasn't exactly spoken of with awe and excitement in security and developer circles. While the concept of tracking the security audits of software and creating a history of discovered vulnerabilities is good, the reward mechanism contemplated by Sardonix was ill-conceived. More often than not, when developers discover a vulnerability they're not interested in receiving "props" for it -- they may not have even been looking for security holes to begin with.
Developers who discover vulnerabilities in software are more interested in fixing the problem and merging the new code into the latest software revision, through a version-control or project-hosting system like CVS or SourceForge. Sardonix didn't integrate with these systems, which gave it the sex appeal of a pig wearing fishnet stockings.
But the biggest hurdle for Sardonix wasn't its minor technological shortcomings; it was the community.
In discussing Sardonix's fall, Crispin Cowan, the project's leader, was harsh in his appraisal of the open-source community: he asserted that security researchers were only interested in finding splashy bugs and posting them to security mailing lists. I think that's oversimplifying the matter.
Members of the security community tend to audit software either for a business interest, or for their own private use. Some of them do, indeed, disclose the issues they discover to mailing lists, winning reputations as Kung Fu masters. But many sit on their findings, because kudos on a mailing list or a software auditing website can never compare to the reward of unauthorized access to a high-profile system.
Sardonix had nothing to offer either variety of auditor.
And any chance the project might have had at recruiting a third kind of researcher was thwarted by its own obscurity: the project leaders simply did not do enough to get the Sardonix name out and advertise to the security community.
The success of Sardonix would have validated a key argument that open-source advocates have used since time immemorial to lend legitimacy to the cause: that open-source software is more secure because the source is available for the world to audit. The project's failure is a reminder that the claim remains a myth.