These are a few salient quotes collected around 1970, during the "mainframe" computing era. It is amazing how prescient observant practitioners can be about their craft!
- BRIAN RANDELL, University of Newcastle upon Tyne, U.K.
We must inculcate in programmers a disgust for complexity and a pride in simplicity.
- DAVID L. PARNAS, Technische Hochschule, Darmstadt, Germany
The assumptions which software modules are allowed to make about each other define the interfaces between them. A bad structure is one in which interface assumptions are likely to be false. Modules should be designed to be mutually suspicious.
- FREDERICK P. BROOKS, Univ. of North Carolina
One reason software reliability is a problem is that as we gain new skills, we attempt more difficult tasks. We will always be attempting things just within or just beyond our grasp, and therefore we will always be on the hairy edge of complexity, courting disaster, sometimes failing and sometimes succeeding.
- HARLAN D. MILLS, IBM Corporation
Redeeming social value will force development of reliable software, as people choose the reliable products even though they cost more.
(However see previous quote :-)
- WILLIAM A. WULF, Carnegie-Mellon University
In a related area, telephone engineers have learned that effort put into improving coverage (the ability to recover from a component failure) has a much greater payoff in improved system reliability than the same effort put into improving component reliability. This point cannot be overstressed: it's much more important to be able to recover from failures than to prevent them.
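Wulf's point can be made quantitative with a small sketch. The function and figures below are illustrative assumptions, not from the original text: replicas are taken to fail independently, and "coverage" is modeled as failing over to any surviving replica.

```python
def system_reliability(p_component, n_replicas):
    """Probability the system survives, assuming each replica fails
    independently with probability (1 - p_component) and the system
    recovers (has coverage) by failing over to any surviving replica."""
    return 1 - (1 - p_component) ** n_replicas

# The same engineering effort spent two ways (hypothetical numbers):
hardened_single = system_reliability(0.99, 1)  # one near-perfect component
modest_trio     = system_reliability(0.90, 3)  # three so-so replicas + recovery
# modest_trio (0.999) beats hardened_single (0.99): recovery out-earns prevention
```

Three mediocre components with failover survive more often than one heavily hardened component, which is exactly the payoff the telephone engineers observed.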
A correct program is one that does what the programmer intends it to do. A reliable program is one that does what the user intends it to do.
I have found that when I am having trouble proving a program correct, I am better off rewriting the program than struggling to complete the proof.
A program should be no more complex than the problem it solves.
- ANDREI P. ERSHOV, Academy of Sciences, USSR
We understand by "reliability" the combined satisfaction of the following conditions:
Conformity: The system does not require special restrictions and conditions which conflict with the modes of operation and the general throughput provided by the hardware configuration and operating system.
Correctness: Errors and operational failures in the system must be rare events which do not affect a user's continuing feeling of comfort and satisfaction.
Helpfulness: The system reaction to a user's mistake must contain some suggestions of its nature and, when possible, hints about its correction.
Responsiveness: The information required by the previous paragraph has to be available as a reaction to any physically possible input to the system; each interaction has to advance the user along the path to ultimate success.
Convenience: In typical situations, when the user is certain and confident of his goal, the system does not require unnecessary, unnatural, "ritual" actions on his part.
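Ershov's "Helpfulness" and "Responsiveness" conditions can be sketched in a few lines. The command set and dispatcher below are hypothetical; the point is only that a reaction to a user's mistake names the mistake and, when possible, hints at its correction.

```python
import difflib

COMMANDS = {"compile", "run", "list", "delete"}  # hypothetical command set

def dispatch(cmd):
    """React helpfully to any input: valid commands proceed; an unknown
    command gets a diagnosis of its nature and, when a near-match
    exists, a hint about its correction."""
    if cmd in COMMANDS:
        return f"ok: {cmd}"
    close = difflib.get_close_matches(cmd, COMMANDS, n=1)
    hint = f"; did you mean '{close[0]}'?" if close else ""
    return f"unknown command '{cmd}'" + hint
```

A mistyped `complie` thus yields a suggestion of `compile` rather than a bare rejection, advancing the user "along the path to ultimate success."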
- GERALD M. WEINBERG, Consultant
Once upon a time there was a utility billing program in which a rate appeared twice, a few pages apart, as a program literal. When the company was granted a rate increase, a maintenance programmer changed only one of the occurrences of the rate literal. After 14 months of observing that revenues had not gone up as much as predicted, the error was found; about $600,000 of revenue had been lost. The company fired the maintenance programmer. Of course, they should have fired the programmer who wrote the program that way in the first place, but by now he was the manager of programming.
Moral: programs should be designed so that one functional change results in one and only one program change.