It’s clear that Knight’s software was deployed without adequate verification. With a deadline that could not be extended, Knight had to choose between two alternatives: delaying their new system until they had a high degree of confidence in its reliability (possibly resulting in a loss of business to competitors in the interim), or deploying an incompletely verified system and hoping that any bugs would be minor. They did not choose wisely.
via “Wall Street and the Mismanagement of Software,” Dr. Dobb’s.
What is needed is a change in the way such critical software is developed and deployed. Safety-critical domains such as commercial avionics, where a software failure could directly cause or contribute to the loss of human life, have understood this for decades. These industries have produced software certification standards that heavily emphasize appropriate “life cycle” processes for development, verification, and quality assurance. A “safety culture” has permeated the entire industry, with hazard and safety analysis a key part of the overall process. Until the software has been certified as compliant with the standard, the plane does not fly. The result is an impressive record in practice: no human fatality on a commercial aircraft has been attributed to a software error.