Safety standards and civil liability made automobiles safe. The same approach can work for software.
Imagine every person having to install their own airbags and seatbelts in their cars.
To reflect the current level of security in most shipped systems, the fictional GM press release should have added that the car would have no locks on the doors, windows that never roll up, and a defective alarm system -- but the hubcaps would be really shiny.
Cars used to be unsafe, unreliable and dangerous to use. When automobiles first came out, many states required that someone walk ahead of a moving car to warn pedestrians. For years, many cars lacked basic safety devices like seat belts and structural designs to keep them from rolling over, exploding, and so on. This changed when consumer activists such as Ralph Nader started advocating for better, more reliable cars and using lawsuits to force manufacturers to redesign them.
Today, we are in a similar situation with computers and their security protections. New systems are regularly released with countless bugs and holes in them, many known before release. The burden is on users to constantly track every known bug, figure out the unknown bugs, and fix them before they are exploited.
It is like Ford designing a car that a twelve-year-old can cause to crash by remote control from his garage using paper clips and an old AM radio.
Thus far, solving the problem has been left to the manufacturers, who are generally absolved of liability for their failure to write good code through license agreements. This gives the companies only limited incentive to make fundamental improvements. There is more pressure from stockholders and financial analysts to rush a new product to market and worry about patches later than to make sure it is secure in the first place.
Sure, there are some market forces that favor security: the companies' PR people have to dodge arrows for a while when yet another worm, virus or Trojan hits and takes out thousands of computers, Yahoo gets shut down by another DDoS, or NASA gets hacked again by another teenager living in a tent, using starving dogs on a treadmill to power his laptop. But nothing seems to change.
It is time to start considering imposing some legal liability when companies release products that have gaping security holes in them.
If the companies are more concerned about getting the product to market than about making sure it is a good, reliable and secure product, they should have to pay for the damage that their lapses cause. Why is software, which is now essential for everyday living, not held to the same standard as cars and children's toys? A few lawsuits might get the insurance companies, possibly the only organizations in the U.S. scarier than the CIA and Microsoft, to force software makers to try a bit harder.
The insurance companies are already working on the user end, writing security clauses into their policies that may end up influencing which systems companies use. Already, one insurance company charges fifteen percent higher premiums in its e-commerce policy for using IIS rather than Apache. As Peter Cassidy, a researcher at ActuariNet, an MIT research project, told me: "For very large users, the cost of the insurance will be factored into decisions about acquiring and using technologies, if the underwriters indirectly punish insecure technologies by applying higher premiums."
That same policy could be applied to companies using Outlook. Sooner or later, those insurance companies are going to want to recoup their losses resulting from bad code. Why should they pay for buggy code that causes them to lose money?
The computer industry has continued insisting that users are completely responsible for the buggy software they purchase by promoting click-wrap contracts and UCITA, a law which absolves the industry of liability.
Some courts have refused to enforce click-wraps, and UCITA is law in only two states because of opposition from consumer groups and state Attorneys General. But the industry's power is immense, and it continues to push liability limits forward. The White House seems unwilling to oppose it: when I asked Richard Clarke, the White House special adviser for cyberspace security, about it last week at a forum at MIT, he shrugged his shoulders and cited UCITA as law, clearly uninterested in using his office to demand changes to the status quo.
Now we see efforts by Microsoft to limit dissemination of bug information, which seems more designed to improve corporate PR than security.
Imposing liability and setting minimum standards have greatly increased auto safety over the last thirty years. In 1966, there were 5.5 fatalities per 100 million miles traveled by the American public, according to the consumer watchdog group Public Citizen. By 1999, that ratio had dropped to 1.5 deaths per 100 million miles.
Now, I don't expect legal liability to solve all of the problems. As Public Citizen's auto safety division notes, more than 40,000 Americans still die on the nation's highways every year. Users still have to be responsible, in the same way that drivers are. And plenty of manufacturers still put profits ahead of safety, making cold calculations that weigh deaths against the cost of a fix, as we saw with Ford's "exploding-gas-tank" Pinto and the current investigations into Ford and Firestone over the Ford Explorer's tendency to roll over when it hits a leaf on the road.
But it is time to slay this sacred cow, and start sharing the burden with those who are responsible for it.