by William A Dembski. Cambridge (UK): Cambridge University Press. 243 p.
Reviewed by Wesley R Elsberry, Department of Wildlife & Fisheries Sciences, Texas A&M University.
In an article appearing in the October 1998 First Things, William A Dembski announced the existence of rigorous and reliable means of detecting the action of an intelligent agent. Its description and justification, said Dembski, would be found in the pages of his new book, The Design Inference (TDI). Dembski made a special point of applying this criterion, which he called complexity-specification, to biological phenomena, with the claim that biologists must now admit design into their science. Dembski's TDI is a slim and scholarly volume, as one expects from a distinguished academic press. Dembski employs clear writing, illustrative examples, and cogent argumentation. The work, though, is motivated and informed by an anti-evolutionary impulse, and its flaws appear to follow from the need to achieve an anti-evolutionary aim. The anti-evolutionary bent is not as overt here, though, as it is in other works by Dembski and fellow Discovery Institute "Center for the Renewal of Science and Culture" colleagues Phillip Johnson, Michael Behe, Paul Nelson, and Stephen Meyer. The closest that Dembski comes within the pages of TDI to explicitly staking out a position on evolutionary issues is in Section 2.3, where a "case study" is made of "the creation-evolution controversy". In it, Dembski accuses evolutionary biologists of rejecting one or more premises of his design inference in order to avoid coming to a conclusion of design for biological phenomena.
The "design inference" of the book's title is an argument to establish that certain events are due to and must be explained with reference to design. Dembski crafts his argument as a process of elimination. From the set of all possible explanations, he first eliminates the explanatory categories of regularity and chance; then whatever is left is by definition design. Since all three categories complete the set, design is the set-theoretical complement of regularity and chance.
Dembski's book and its major concept share a name, The Design Inference. The Design Inference is an argument which leads to a conclusion of design for an event. Dembski deploys a large number of terms and phrases in making his argument that design must be recognized as a necessary mode of explanation in science. Fortunately, Dembski is also scrupulous in making clear what each term means, even when it has a common or casual usage. Design is one of those terms, and it becomes a category defined by the elimination of events that can be attributed to regularity or to chance. Complexity-specification is a term used by Dembski in other works to describe the diagnostic attribute of design in an event. It derives from Dembski's earlier use of the phrase Complex Specified Information. The idea behind complexity-specification is that the jointly held attributes of complexity, as small probability, and specification, as conformance to an independently given pattern, reveal the presence of design in an event. Complexity excludes high- and intermediate-probability events, and specification excludes chance events. Since regularity comprises events marked by high probability, complexity-specification then yields those events that fall into the exclusionary category of design as Dembski uses the term.
The Design Inference is a deductive argument which can lead to the recognition of complexity-specification, and thus design, for a particular event.
Dembski often talks about unpacking terms, and the "unpacked" forms of the 3 categories at issue need to be kept clear. Regularity is simply any event with high probability. Chance is any event with intermediate or small probability, but for which no specification exists. And design is any event with both a small probability and a specification. These are the defining criteria of Dembski's categories. A conclusion of design for an event means only that the event is not of high probability, not of intermediate probability, and not of small probability without a specification. Dembski's usage of the phrase "due to" is also somewhat different from standard usage (contrast with the footnote on p 48). Unpacking "E is due to design" results in: "The proper mode of explanation for E is the negation of currently known regularity and chance."
The elimination of chance as a category of explanation comprises most of the book. Chance is acceptable to Dembski as an explanation for all events of intermediate probability and also for certain events of small probability. However, chance is excluded for events which both have small probability and conform to a pattern that can be given independently of the event. Such an independently stated pattern is called a specification by Dembski.
Dembski illustrates the difference between specifications, or "good patterns", which indicate that we can reject chance explanations, and fabrications, or "bad patterns", which do not distinguish chance events from those due to other explanations. If an archer fires an arrow at a wall and plants it in a previously painted bull's eye, the bull's eye represents a specification, and the event of the arrow's hitting the target tells us that the archer has a high level of skill. If another archer fires an arrow at the wall, then takes a bucket of paint and draws a bull's eye around his already-implanted arrow, that bull's eye represents a fabrication, an ad hoc and after-the-fact pattern that gives no information about the event with which it is associated.
Another claim of Dembski's is also problematic: the claim that his Explanatory Filter encapsulates the process by which humans ordinarily detect design, whether that design is attributed to humans, other animals, or extra-terrestrial intelligences. Complete with flowchart (p 37), the explanatory filter has 3 decision nodes. At the first, if an event is deemed to have high probability, it is classified as due to a regularity, or rather, the proper explanatory mode for the event in question is regularity or law-like physical processes. An as-yet unclassified event then moves on to the second decision node. If it has intermediate probability, it is classified as due to chance. Events that are still not classified then move on to the third decision node. If the event both has a small probability and also conforms to a specification, it is classified as due to design; otherwise it is classified as due to chance.
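The three decision nodes can be rendered as a short procedure. This is only an illustrative sketch of the flowchart described above; the function name and the numeric cutoffs for "high" and "intermediate" probability are my own placeholders, since Dembski leaves those thresholds context-dependent.

```python
def dembski_filter(probability, has_specification,
                   high=0.5, intermediate=1e-6):
    """Sketch of the Explanatory Filter (TDI p 37).

    Thresholds are illustrative placeholders, not Dembski's values.
    """
    # Node 1: high-probability events are ascribed to regularity.
    if probability >= high:
        return "regularity"
    # Node 2: intermediate-probability events are ascribed to chance.
    if probability >= intermediate:
        return "chance"
    # Node 3: small probability plus an independently given pattern
    # yields design; small probability alone defaults to chance.
    if has_specification:
        return "design"
    return "chance"
```

Note that design is reached only by falling through the other two nodes, which is exactly the eliminative structure discussed above.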
Dembski makes various forms of the same argument, showing that deduction leads ineluctably and conclusively to certain events' being due to design. The catch is that Dembski is using his own definition of design, where design is simply the explanation that remains after chance and regularity are eliminated. This is touted by Dembski as an advantage for the purposes of his argumentation, since he avoids attributing either causal stories or the intervention of intelligent agency a priori. In no fewer than 3 separate passages in TDI (p 8, 36, 226-7), Dembski assures the reader that the design of TDI does not imply agency.
Design and Designer
One may wonder what TDI was supposed to accomplish, if design no longer means what Paley meant by it and the attribution of agency no longer follows from finding design. But Dembski believes that finding design does imply agency, even though he has identified that implication as being unnecessary. In his view, because we often find design where an intelligent agent has acted, we can reliably infer that when we find design, we have also found evidence of the action of an intelligent agent. Section 2.4 gives Dembski's take on how we go from design to agency. Dembski invokes his explanatory filter as a critical piece of this justification.
Dembski believes that not only design but also agency is found by his argument. This is the message being spread by various and sundry of the "intelligent design" proponents and by Dembski himself in other writings. But is it a secure inference? In his First Things article, and to a lesser extent in his section 2.3 of TDI, Dembski takes biologists to task for avoiding the conclusion of design for biological phenomena. Dembski says that to avoid a design conclusion, biologists uniformly reject one or more of the premises of his argument. But Dembski does not exclude natural selection as a possible cause for events which can be classified as being due to design.
The apparent, but unstated, logic behind the move from design to agency can be given as follows: humans detect human agency by means of the explanatory filter; the explanatory filter therefore encapsulates our general method for detecting agency; and since TDI is equivalent to the explanatory filter, a conclusion of design amounts to a conclusion of agency.
It is an error to argue from the casual meanings of regularity, chance, and design when discussing causes for events classified by Dembski's explanatory filter or by TDI. Someone might seek to exclude natural selection from consideration as a source of events that meet the criteria of design by claiming that it is either a regularity or chance. But TDI classifies events, not causes. Dembski points this out himself when saying that the explanatory filter may not always conclude design for an event that we know is due to the action of an intelligent agent, for agents can mimic the results of regularity or chance.
The point is broader than Dembski admits. A causal class cannot be lumped into regularity or chance in advance without begging the question. Specifically, one cannot simply declare that natural selection is either regularity or chance. The events which are due to natural selection must be evaluated by their own properties to establish which category best describes those events. Just as intelligent agents can sometimes produce events which pass for regularity or chance rather than design, so too can natural selection be responsible for events in all 3 categories. It is insufficient to show that some examples of natural selection fall into either the regularity or the chance explanation category. Anyone arguing that no physical process can produce an event bearing the attribute of design must show that natural selection is incapable in principle of producing such events. Such a demonstration would have to address the application of natural selection both in biology and in computer science, where natural selection has been employed in solving very difficult optimization problems.
I've been thinking about this some since I wrote the review, and natural selection is but one of a general class of processes which would produce events meeting the requirements of Dembski's triad. In general, any iterative feedback system with error-correction will be capable of producing events with the attributes of Dembski's triad. Natural selection shows that the "error-correction" part of the process need not have an end solution state for comparison (see Dawkins' discussion of "distant ideal targets" in "The Blind Watchmaker"); comparison of merit of currently-existing instantiations is a sufficient basis for making progress in the parameter-space for many problems. If the parameters of the system are open to change, then at the end of the process those parameters will be closer to the optimal set of parameters than they were at the start. The problem constraints provide the specification of the optimal parameter settings. During the process, different parameter settings are actualized. The trajectory of the system through parameter-space will show that exclusion of other possibilities occurred. This indicates that a much broader class of processes meets Dembski's triad of criteria for the recognition of action of "intelligent agents".
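A minimal sketch of such an iterative feedback system with error-correction is the comparison-based search below. The toy problem and all names in it are my own assumptions, not anything from TDI. The fitness function encodes the problem constraints (the "specification"), but the search itself never consults a distant ideal target; it only compares the merit of currently-existing instantiations, as in the Dawkins discussion cited above.

```python
import random

def comparative_search(fitness, start, mutate, steps=1000, seed=1):
    """Improve a parameter by comparing current candidates only.

    No end solution state is supplied to the search; progress comes
    from relative merit (error-correction by comparison).
    """
    rng = random.Random(seed)
    current = start
    for _ in range(steps):
        candidate = mutate(current, rng)        # actualization: a new variant arises
        if fitness(candidate) > fitness(current):
            current = candidate                 # exclusion: lesser variants are dropped
    return current                              # the constraints have "specified" the survivor

# Toy problem: maximize f(x) = x * (4 - x); the optimum (x = 2) is
# implicit in the constraints, never handed to the search.
best = comparative_search(
    fitness=lambda x: x * (4 - x),
    start=0.0,
    mutate=lambda x, rng: x + rng.uniform(-1, 1),
)
```

At the end of the run the actualized parameter sits near the optimum, the trajectory through parameter-space records the exclusion of alternatives, and the problem constraints supplied the specification, which is the triad at issue.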
It is time to look more closely at Dembski's design inference, to find out if it does allow us to detect design by the elimination of alternative mechanisms. The design inference is a deductive argument based on the elimination of alternatives. Such arguments only work if the conclusion is the result of exhausting the available alternatives. Dembski assures that this is the case by defining design to be what is left after regularity and chance have been eliminated. Thus, what "design" means depends upon how regularity and chance get eliminated.
Process of Elimination
Dembski offers two somewhat different mechanisms for eliminating regularity. In the first, regularity is recognized if an event has a high probability of occurrence; this is part of his discussion of the explanatory filter. The second method asserts that an event conforms to relevant natural laws but is not constrained by them, and thus is not attributable to those laws; this method appears in Dembski's discussion of the design inference proper (p 53). It is not clear that these two methods classify the same set of events as not being due to regularity. This ambiguity increases our uncertainty concerning the residue that is left over to be split between chance and design.
There is a difficulty in discussing these concepts in that the meanings of the terms regularity, chance, and design can become confused with newer meanings which arise from the argument of The Design Inference. It is important to keep the casual meanings separate. Unfortunately, it is not clear that even Dembski manages to keep track of what the terms really mean. For example, even though Dembski clearly explains that design does not imply agency, Dembski offers as the 3 possible categories of explanation in his first example "Regularity", "Chance", and "Agency" (p 11).
According to Dembski, because humans identify human agency using the explanatory filter, the explanatory filter encapsulates our general method for detecting agency. Because TDI is equivalent to the explanatory filter, the conclusion of design in TDI is equivalent to concluding agency. Dembski specifies a triad of criteria — actualization-exclusion-specification — as sufficient for establishing that an intelligent agent has been at work, and finds that design as he uses it is congruent with these criteria.
However, Dembski's triad of criteria for recognition of intelligent agents is also satisfied quite adequately by natural selection. "Actualization" occurs as heritable variation arises. "Exclusion" results as some heritable variations lead to differential reproductive success. "Specification" occurs as environmental conditions specify which variations are preferred. By my reading, biologists can embrace a conclusion of design for an event of biological origin and still attribute that event to the agency of natural selection.
For comparison, I will propose an alternative explanatory filter and discuss various points of difference with Dembski's. My alternative explanatory filter works as follows. An event that cannot be statistically distinguished from a random event is classified as due to chance. An event that conforms to properties of known law-like physical processes is classified as due to regularity. An event that conforms to known properties of similar events that are due to intelligent agents is classified as due to design. Any event which has not yet been classified is then classified as being due to an unknown cause.
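The alternative filter can likewise be sketched as a procedure. The inputs are hypothetical stand-ins for the statistical and comparative tests described above (a p-value against a random-event null, and yes/no matches against known law-like and known agent-caused event properties); the points of the example are the order of elimination and the fourth outcome.

```python
def alternative_filter(chance_pvalue, matches_known_law,
                       matches_known_design, alpha=0.05):
    """Sketch of the review's alternative filter; inputs are stand-ins."""
    # 1. First test the chance null: an event statistically
    #    indistinguishable from a random event is due to chance.
    if chance_pvalue >= alpha:
        return "chance"
    # 2. Conforms to known law-like physical processes: regularity.
    if matches_known_law:
        return "regularity"
    # 3. Conforms to known properties of agent-caused events: design.
    if matches_known_design:
        return "design"
    # 4. Otherwise, withhold final classification.
    return "unknown"
```

Unlike Dembski's filter, nothing here forces every small-probability, unmatched event into a terminal category; it can await further information.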
My alternative explanatory filter differs in several critical ways. First, the ordering of decisions is different. Dembski justifies his choice of order with an explication of explanatory priority (p 38-40). But I find Dembski's arguments for arranging to eliminate regularity before eliminating chance to be unconvincing and not reflective of how people ordinarily proceed in finding explanations. Random events conform well to the null hypothesis (that is, that the event is due to chance and not to design or regularity) and should be eliminated first in consideration of causation. Dembski's own example of how regularity has explanatory priority over chance illustrates the fact that his filter has the order reversed.
He illustrates his arrangement of explanatory priority using the example of a pair of loaded dice. Because the loaded dice yield high probabilities for certain faces' coming up, Dembski explains that the explanation to be preferred is regularity. However, Dembski ignores the fact that in order to determine that regularity and not chance is at work with the loaded dice, we must compare the rolls of the dice to the expectation of "fair" dice. Only when chance is rejected can we then entertain the notion that the results for the particular loaded dice in question are due to a regularity. In point of fact, with sufficient testing and knowledge of the circumstances, the loaded dice example resolves into an instance of design, not regularity. This does not mean that design then has explanatory priority. Rather, it illustrates the superior explanatory power of the alternative filter in which the other explanatory classes of causation, chance and regularity, had to be considered and rejected first before design could be concluded.
Second, my explanatory filter has one more alternative classification than Dembski's, that of unknown causation. This alternative recognizes that the set of knowledge used to make a classification can alter the classification. By allowing an event to be classified as due to unknown causation, I simultaneously reduce the number of false classifications that will later be overturned due to the availability of additional information and also identify those events whose circumstances require further study in order to resolve a causative factor. The use of unknown causation as a category is common in those day-to-day operations of humans looking for design in events, such as forensics. Forcing final classification of events under limited knowledge ensures that mistakes in classification will be made in Dembski's explanatory filter.
Third, my alternative explanatory filter retains the common meaning of design as a reliable indicator of agency. We recognize design in our day-to-day life because of prior experience with objects and events designed or caused by intelligent agents. It is important to recognize that there is a difference between a reliable classifier and an oracular design detector. Dembski utilizes the Search for Extra-Terrestrial Intelligence (SETI) project as an example of the detection of design in the absence of particular knowledge of a designer. But SETI does not support the notion that novel design/designer relationships can be detected. SETI is only capable of detecting signals that conform to certain properties of signals known from prior experience of humans communicating via radio wavelengths. SETI works to find events that conform to our prior experience of how intelligent agents utilize radio wavelengths for the purpose of communication. ETI that communicate in ways for which humans have no experience will be completely invisible to, and undetected by, SETI.
In summary, the process of detecting design, as it is done by humans in day-to-day activities, is not accurately captured by Dembski's Explanatory Filter. The order in which classes of causes are eliminated makes a difference. Humans attempting to explain phenomena can and often do find insufficient evidence to make a final determination of either design or any other explanatory category. And when humans use the word design, they typically mean it to carry a real implication of being due to an agent, or designer.
Dembski utilizes the Explanatory Filter and equivalent logical arguments in order to place his criterion of design on a deductive footing. That criterion, complexity-specification, does not help us to identify a cause, or an agent, of an event. Its sole purpose is to detect design as Dembski employs the term. The step from detection of design to implication of an intelligent agent is made via an inductive argument, and shares in the problems of all conclusions drawn from an inductive basis. Dembski argues that a triad of criteria reliably diagnoses the action of an intelligent agent, yet this same triad of criteria fails to exclude natural selection as a possible cause of events that have the attribute of complexity-specification. Somehow, I doubt that natural selection is what Dembski had in mind for the agent of biological design.
The Design Inference is a work with great significance for the group of anti-evolutionists who have embraced "intelligent design" as their organizing principle. TDI is supposed to establish the theoretical foundation for all the rest of the movement. My judgment is that it fails to lay a solid foundation. There are flaws and cracks that can admit the entry of naturalistic causes into the pool of "designed" events. It is unfortunate that Dembski's focus is the establishment of "intelligent design" as an anti-evolutionary alternative, for his insights into elimination of chance hypotheses would appear to have legitimate application to various outstanding research questions, such as resolving certain issues in animal cognition and intelligence. Despite Dembski's commentary in his First Things article, there appears to be no justification for the claim that biologists must now admit design (in its old, agency-laden sense) into biological explanation to any greater degree than it is already used.
Dembski WA. Science and design. First Things 1998 Oct; 86:21-2.
<http://www.firstthings.com/ftissues/ft9810/dembski.html>. Accessed April 8, 1999.
Dembski WA. The Explanatory Filter: A three-part filter for understanding how to separate and identify cause from intelligent design. <http://www.origins.org/real/ri9602/dembski.html>. Accessed March 8, 1999.
Dembski WA. Intelligent design as a theory of information. Conference on Naturalism, Theism, and the Scientific Enterprise (Austin, Texas). <http://www.dla.utexas.edu/depts/philosophy/faculty/koons/ntse/papers/Dembski.html>. Accessed March 8, 1999.
Thanks to Bob Schadewald and others who gave helpful commentary on drafts of this review.
[Wesley R. Elsberry is a Ph.D. student in the Department of Wildlife and Fisheries Sciences at Texas A&M University. His research involves computational analysis and modelling of cetacean biosonar and cognition.]