Specified complexity
Specified complexity is a creationist argument introduced by William Dembski, used by advocates to promote the pseudoscience of intelligent design. According to Dembski, the concept can formalize a property that singles out patterns that are both ''specified'' and ''complex'': in Dembski's terminology, a ''specified'' pattern is one that admits a short description, whereas a ''complex'' pattern is one that is unlikely to occur by chance. Proponents of intelligent design use specified complexity as one of their two main arguments, alongside irreducible complexity. Dembski argues that it is impossible for specified complexity to exist in patterns displayed by configurations formed by unguided processes. Therefore, Dembski argues, the fact that specified complex patterns can be found in living things indicates some kind of guidance in their formation, which is indicative of intelligence. Dembski further argues that one can rigorously show, by applying no-free-lunch theorems, that evolutionary algorithms are unable to select or generate configurations of high specified complexity. Dembski states that specified complexity is a reliable marker of design by an intelligent agent, a central tenet of intelligent design, which Dembski argues for in opposition to modern evolutionary theory. Specified complexity is what Dembski terms an "explanatory filter": one can recognize design by detecting "complex specified information" (CSI). Dembski argues that the unguided emergence of CSI solely according to known physical laws and chance is highly improbable.

The concept of specified complexity is widely regarded as mathematically unsound and has not been the basis for further independent work in information theory, in the theory of complex systems, or in biology. A study by Wesley Elsberry and Jeffrey Shallit states: "Dembski's work is riddled with inconsistencies, equivocation, flawed use of mathematics, poor scholarship, and misrepresentation of others' results." Another objection concerns Dembski's calculation of probabilities. According to Martin Nowak, a Harvard professor of mathematics and evolutionary biology, "We cannot calculate the probability that an eye came about. We don't have the information to make the calculation." (Wallis, Claudia, ''Time'', 15 August 2005, p. 32.)


Definition


Orgel's terminology

The term "specified complexity" was originally coined by origin-of-life researcher Leslie Orgel in his 1973 book ''The Origins of Life: Molecules and Natural Selection'', which proposed that RNA could have evolved through Darwinian natural selection. Orgel used the phrase in discussing the differences between life and non-living structures:
In brief, living organisms are distinguished by their ''specified'' complexity. Crystals are usually taken as the prototypes of simple well-specified structures, because they consist of a very large number of identical molecules packed together in a uniform way. Lumps of granite or random mixtures of polymers are examples of structures that are complex but not specified. The crystals fail to qualify as living because they lack complexity; the mixtures of polymers fail to qualify because they lack specificity.
The phrase was taken up by the creationists Charles Thaxton and Walter L. Bradley in a chapter they contributed to the 1994 book ''The Creation Hypothesis'', where they discussed "design detection" and redefined "specified complexity" as a way of measuring information. Another contribution to the book was written by William A. Dembski, who took this up as the basis of his subsequent work. The term was later employed by physicist Paul Davies to qualify the complexity of living organisms:
Living organisms are mysterious not for their complexity per se, but for their tightly specified complexity


Dembski's definition

Whereas Orgel used the term for biological features which are considered in science to have arisen through a process of evolution, Dembski says that it describes features which cannot form through "undirected" evolution, and concludes that it allows one to infer intelligent design. While Orgel employed the concept in a qualitative way, Dembski's use is intended to be quantitative. Dembski's use of the concept dates to his 1998 monograph ''The Design Inference''. Specified complexity is fundamental to his approach to intelligent design, and each of his subsequent books has also dealt significantly with the concept. He has stated that, in his opinion, "if there is a way to detect design, specified complexity is it". Dembski asserts that specified complexity is present in a configuration when it can be described by a pattern that displays a large amount of independently specified information and is also complex, which he defines as having a low probability of occurrence. He provides the following examples to demonstrate the concept: "A single letter of the alphabet is specified without being complex. A long sentence of random letters is complex without being specified. A Shakespearean sonnet is both complex and specified." In his earlier papers Dembski defined ''complex specified information'' (CSI) as being present in a specified event whose probability did not exceed 1 in 10^150, which he calls the universal probability bound. In that context, "specified" meant what in later work he called "pre-specified", that is, specified by the unnamed designer before any information about the outcome is known. The value of the universal probability bound corresponds to the inverse of the upper limit of "the total number of [possible] specified events throughout cosmic history", as calculated by Dembski. Anything below this bound has CSI. The terms "specified complexity" and "complex specified information" are used interchangeably.
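Dembski's universal probability bound of 1 in 10^150 is usually motivated by a rough dimensional estimate (roughly 10^80 elementary particles in the observable universe, at most about 10^45 state changes per second per particle, over at most about 10^25 seconds). A minimal sketch of that arithmetic, with the figures taken as assumptions:

```python
# Sketch of the arithmetic behind Dembski's universal probability bound.
# The three figures below are the commonly cited rough estimates (assumptions).
PARTICLES = 10**80            # elementary particles in the observable universe
TRANSITIONS_PER_SEC = 10**45  # max state changes per second (Planck-time scale)
SECONDS = 10**25              # generous upper bound on cosmic history

# Inverse of the maximum number of "specified events throughout cosmic history".
universal_bound = 1 / (PARTICLES * TRANSITIONS_PER_SEC * SECONDS)  # ~1e-150

def below_bound(probability: float) -> bool:
    """Dembski counts an event as exhibiting CSI when its probability
    under the chance hypothesis falls below the universal bound."""
    return probability < universal_bound
```

This only reproduces the bookkeeping behind the number; it says nothing about whether the probabilities being compared to it can actually be computed, which is the point critics press.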
In more recent papers Dembski has redefined the universal probability bound with reference to another number, corresponding to the total number of bit operations that could possibly have been performed in the entire history of the universe. Dembski asserts that CSI exists in numerous features of living things, such as in DNA and in other functional biological molecules, and argues that it cannot be generated by the only known natural mechanisms of physical law and chance, or by their combination. He argues that this is so because laws can only shift around or lose information, but do not produce it, and because chance can produce complex unspecified information, or simple specified information, but not CSI; he provides a mathematical analysis that he claims demonstrates that law and chance working together cannot generate CSI, either. Moreover, he claims that CSI is holistic, with the whole being greater than the sum of the parts, and that this decisively eliminates Darwinian evolution as a possible means of its "creation". Dembski maintains that, by process of elimination, CSI is best explained as being due to intelligence, and is therefore a reliable indicator of design.


Law of conservation of information

Dembski formulates and proposes a law of conservation of information as follows:
This strong proscriptive claim, that natural causes can only transmit CSI but never originate it, I call the Law of Conservation of Information.
Immediate corollaries of the proposed law are the following:
# The specified complexity in a closed system of natural causes remains constant or decreases.
# The specified complexity cannot be generated spontaneously, originate endogenously or organize itself (as these terms are used in origins-of-life research).
# The specified complexity in a closed system of natural causes either has been in the system eternally or was at some point added exogenously (implying that the system, though now closed, was not always closed).
# In particular, any closed system of natural causes that is also of finite duration received whatever specified complexity it contains before it became a closed system. (William A. Dembski, 1998, "Intelligent Design as a Theory of Information".)
Dembski notes that the term "Law of Conservation of Information" was previously used by Peter Medawar in his book ''The Limits of Science'' (1984) "to describe the weaker claim that deterministic laws cannot produce novel information." The actual validity and utility of Dembski's proposed law are uncertain; it is neither widely used by the scientific community nor cited in mainstream scientific literature. A 2002 essay by Erik Tellgren provided a mathematical rebuttal of Dembski's law and concluded that it is "mathematically unsubstantiated."


Specificity

In a more recent paper, Dembski provides an account which he claims is simpler and adheres more closely to the theory of statistical hypothesis testing as formulated by Ronald Fisher. In general terms, Dembski proposes to view design inference as a statistical test to reject a chance hypothesis P on a space of outcomes Ω. Dembski's proposed test is based on the Kolmogorov complexity of a pattern ''T'' that is exhibited by an event ''E'' that has occurred. Mathematically, ''E'' is a subset of Ω, the pattern ''T'' specifies a set of outcomes in Ω, and ''E'' is a subset of ''T''. Quoting Dembski:
Thus, the event ''E'' might be a die toss that lands six and ''T'' might be the composite event consisting of all die tosses that land on an even face.
Kolmogorov complexity provides a measure of the computational resources needed to specify a pattern (such as a DNA sequence or a sequence of alphabetic characters). Given a pattern ''T'', the number of other patterns that have Kolmogorov complexity no larger than that of ''T'' is denoted by φ(''T''). The number φ(''T'') thus provides a ranking of patterns from the simplest to the most complex. For example, for a pattern ''T'' which describes the bacterial flagellum, Dembski claims to obtain the upper bound φ(''T'') ≤ 10^20. Dembski defines the specified complexity of the pattern ''T'' under the chance hypothesis P as
: \sigma = -\log_2\left[ R \times \varphi(T) \times \operatorname{P}(T) \right]
where P(''T'') is the probability of observing the pattern ''T'' and ''R'' is the number of "replicational resources" available "to witnessing agents". ''R'' corresponds roughly to repeated attempts to create and discern a pattern. Dembski then asserts that ''R'' can be bounded by 10^120. This number is supposedly justified by a result of Seth Lloyd, who determined that the number of elementary logic operations that can have been performed in the universe over its entire history cannot exceed 10^120 operations on 10^90 bits. Dembski's main claim is that the following test can be used to infer design for a configuration: there is a target pattern ''T'' that applies to the configuration and whose specified complexity exceeds 1. This condition can be restated as the inequality
: 10^{120} \times \varphi(T) \times \operatorname{P}(T) < \frac{1}{2}.


Dembski's explanation of specified complexity

Dembski's expression σ is unrelated to any known concept in information theory, though he claims he can justify its relevance as follows: an intelligent agent ''S'' witnesses an event ''E'' and assigns it to some reference class of events Ω, and within this reference class considers it as satisfying a specification ''T''. Now consider the quantity φ(''T'') × P(''T'') (where P is the "chance" hypothesis):
Think of S as trying to determine whether an archer, who has just shot an arrow at a large wall, happened to hit a tiny target on that wall by chance. The arrow, let us say, is indeed sticking squarely in this tiny target. The problem, however, is that there are lots of other tiny targets on the wall. Once all those other targets are factored in, is it still unlikely that the archer could have hit any of them by chance? In addition, we need to factor in what I call the replicational resources associated with ''T'', that is, all the opportunities to bring about an event of ''T'''s descriptive complexity and improbability by multiple agents witnessing multiple events.
According to Dembski, the number of such "replicational resources" can be bounded by "the maximal number of bit operations that the known, observable universe could have performed throughout its entire multi-billion year history", which according to Lloyd is 10^120. However, according to Elsberry and Shallit, "[specified complexity] has not been defined formally in any reputable peer-reviewed mathematical journal, nor (to the best of our knowledge) adopted by any researcher in information theory."


Calculation of specified complexity

Thus far, Dembski's only attempt at calculating the specified complexity of a naturally occurring biological structure is in his book ''No Free Lunch'', for the bacterial flagellum of ''E. coli''. This structure can be described by the pattern "bidirectional rotary motor-driven propeller". Dembski estimates that there are at most 10^20 patterns described by four basic concepts or fewer, and so his test for design will apply if
: \operatorname{P}(T) < \frac{1}{2} \times 10^{-140}.
However, Dembski says that the precise calculation of the relevant probability "has yet to be done", although he also claims that some methods for calculating these probabilities "are now in place". These methods assume that all of the constituent parts of the flagellum must have been generated completely at random, a scenario that biologists do not seriously consider. He justifies this approach by appealing to Michael Behe's concept of "irreducible complexity" (IC), which leads him to assume that the flagellum could not come about by any gradual or step-wise process. The validity of Dembski's particular calculation is thus wholly dependent on Behe's IC concept, and therefore susceptible to its criticisms, of which there are many. To arrive at the ranking upper bound of 10^20 patterns, Dembski considers a specification pattern for the flagellum defined by the (natural-language) predicate "bidirectional rotary motor-driven propeller", which he regards as being determined by four independently chosen basic concepts. He furthermore assumes that English has the capability to express at most 10^5 basic concepts (an upper bound on the size of a dictionary). Dembski then claims that we can obtain the rough upper bound of
: 10^{20} = 10^5 \times 10^5 \times 10^5 \times 10^5
for the set of patterns described by four basic concepts or fewer. From the standpoint of Kolmogorov complexity theory, this calculation is problematic. Quoting Elsberry and Shallit: "Natural language specification without restriction, as Dembski tacitly permits, seems problematic. For one thing, it results in the Berry paradox". These authors add: "We have no objection to natural language specifications per se, provided there is some evident way to translate them to Dembski's formal framework. But what, precisely, is the space of events Ω here?"
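The two numerical steps in Dembski's flagellum argument, the 10^20 ranking bound and the resulting probability threshold, reduce to a few lines of arithmetic:

```python
# Dembski's ranking bound for the flagellum specification
# "bidirectional rotary motor-driven propeller": four basic concepts,
# each drawn from an assumed vocabulary of at most 10^5 English concepts.
VOCABULARY = 10**5
CONCEPTS = 4
phi_T_bound = VOCABULARY ** CONCEPTS      # 10^20 patterns of this rank or lower

# The design test sigma > 1 then demands 10^120 * 10^20 * P(T) < 1/2,
# i.e. a probability threshold of (1/2) * 10^-140.
R = 10**120
p_threshold = 0.5 / (R * phi_T_bound)
```

Note that the code only chains the stated bounds together; the contentious quantity, P(T) for the flagellum, is exactly the number Dembski concedes "has yet to be" calculated.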


Criticism

The soundness of Dembski's concept of specified complexity and the validity of arguments based on this concept are widely disputed. A frequent criticism (see Elsberry and Shallit) is that Dembski has used the terms "complexity", "information" and "improbability" interchangeably. These numbers measure properties of things of different types: complexity measures how hard it is to describe an object (such as a bitstring), information measures how much the uncertainty about the state of an object is reduced by knowing the state of another object or system, and improbability measures how unlikely an event is given a probability distribution.

On page 150 of ''No Free Lunch'' Dembski claims he can demonstrate his thesis mathematically: "In this section I will present an in-principle mathematical argument for why natural causes are incapable of generating complex specified information." When Tellgren investigated Dembski's "Law of Conservation of Information" using a more formal approach, he concluded it is mathematically unsubstantiated. Dembski responded in part that he is not "in the business of offering a strict mathematical proof for the inability of material mechanisms to generate specified complexity". Jeffrey Shallit states that Dembski's mathematical argument has multiple problems; for example, a crucial calculation on page 297 of ''No Free Lunch'' is off by a factor of approximately 10^65. (Jeffrey Shallit, 2002, "A review of Dembski's ''No Free Lunch''".)

Dembski's calculations show how a simple smooth function cannot gain information. He therefore concludes that there must be a designer to obtain CSI. However, natural selection has a branching mapping from one to many (replication), followed by a pruning mapping of the many back down to a few (selection). When information is replicated, some copies can be differently modified while others remain the same, allowing information to increase. These increasing and reductional mappings were not modeled by Dembski; in other words, Dembski's calculations do not model birth and death. This basic flaw in his modeling renders all of Dembski's subsequent calculations and reasoning in ''No Free Lunch'' irrelevant, because his basic model does not reflect reality. Since the basis of ''No Free Lunch'' relies on this flawed argument, the entire thesis of the book collapses.

According to Martin Nowak, a Harvard professor of mathematics and evolutionary biology, "We cannot calculate the probability that an eye came about. We don't have the information to make the calculation." Dembski's critics note that specified complexity, as originally defined by Leslie Orgel, is precisely what Darwinian evolution is supposed to create. Critics maintain that Dembski uses "complex" as most people would use "absurdly improbable". They also claim that his argument is circular: CSI cannot occur naturally because Dembski has defined it thus. They argue that to successfully demonstrate the existence of CSI, it would be necessary to show that some biological feature undoubtedly has an extremely low probability of occurring by any natural means whatsoever, something which Dembski and others have almost never attempted to do. Such calculations depend on the accurate assessment of numerous contributing probabilities, whose determination is often necessarily subjective. Hence, CSI can at most provide a "very high probability", but not absolute certainty.

Another criticism refers to the problem of "arbitrary but specific outcomes". For example, if a coin is tossed randomly 1000 times, the probability of any particular outcome occurring is roughly one in 10^300. For any particular specific outcome of the coin-tossing process, the ''a priori'' probability (probability measured before the event happens) that this pattern occurred is thus one in 10^300, which is astronomically smaller than Dembski's universal probability bound of one in 10^150. Yet we know that the ''post hoc'' probability (probability as observed after the event occurs) of its happening is exactly one, since we observed it happening. This is similar to the observation that it is unlikely that any given person will win a lottery, but, eventually, a lottery will have a winner; to argue that it is very unlikely that any one player would win is not the same as proving that there is the same chance that no one will win. Similarly, it has been argued that "a space of possibilities is merely being explored, and we, as pattern-seeking animals, are merely imposing patterns, and therefore targets, after the fact."

Apart from such theoretical considerations, critics cite reports of evidence of the kind of evolutionary "spontaneous generation" that Dembski claims is too improbable to occur naturally. For example, in 1982, B.G.
Hall published research demonstrating that after removing a gene that allows sugar digestion in certain bacteria, those bacteria, when grown in media rich in sugar, rapidly evolve new sugar-digesting enzymes to replace those removed. Another widely cited example is the discovery of nylon-eating bacteria that produce enzymes only useful for digesting synthetic materials that did not exist prior to the invention of nylon in 1935. Other commentators have noted that evolution through selection is frequently used to design certain electronic, aeronautic and automotive systems which are considered problems too complex for human "intelligent designers". ("Evolutionary algorithms now surpass human designers", ''New Scientist'', 28 July 2007.)
This contradicts the argument that an intelligent designer is required for the most complex systems. Such evolutionary techniques can lead to designs that are difficult to understand or evaluate, since no human understands which trade-offs were made in the evolutionary process, something which mimics our poor understanding of biological systems. Dembski's book ''No Free Lunch'' was criticised for not addressing the work of researchers who use computer simulations to investigate artificial life. According to Shallit:
The field of artificial life evidently poses a significant challenge to Dembski's claims about the failure of evolutionary algorithms to generate complexity. Indeed, artificial life researchers regularly find their simulations of evolution producing the sorts of novelties and increased complexity that Dembski claims are impossible.
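The replication-and-selection mapping that critics say Dembski's model omits is easy to demonstrate with a toy simulation (a sketch in the spirit of Dawkins-style "weasel" programs, not any specific researcher's code; all names and parameter values here are illustrative): many mutated copies of a string are produced each generation, and the copy best matching a target survives, so the match to the "specified" target increases over time.

```python
import random

def evolve(target: str, alphabet: str = "ABCDEFGHIJKLMNOPQRSTUVWXYZ",
           population: int = 100, mutation_rate: float = 0.05,
           seed: int = 0) -> tuple:
    """Toy replication/selection loop.  Returns (final_string, generations).
    Capped at 5000 generations in case of non-convergence."""
    rng = random.Random(seed)
    parent = "".join(rng.choice(alphabet) for _ in target)  # random start
    generation = 0
    while parent != target and generation < 5000:
        generation += 1
        # Replication with variation: many imperfect copies of the parent.
        offspring = [
            "".join(c if rng.random() > mutation_rate else rng.choice(alphabet)
                    for c in parent)
            for _ in range(population)
        ]
        # Selection: keep only the copy that best matches the target.
        parent = max(offspring,
                     key=lambda s: sum(a == b for a, b in zip(s, target)))
    return parent, generation
```

The point of the sketch is structural, not biological: the one-to-many replication step followed by many-to-one selection is precisely the pair of mappings that, per the criticism above, Dembski's smooth-function model of information flow never represents.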


See also

* List of topics characterized as pseudoscience
* Teleological argument
* Texas sharpshooter fallacy


Notes and references


External links


* "Not a Free Lunch But a Box of Chocolates" - a critique of William Dembski's book ''No Free Lunch'', by Richard Wein, from TalkOrigins
* by Rich Baldwin, from Information Theory and Creationism, compiled by Ian Musgrave and Rich Baldwin
* from the Boston Review
* by Thomas D. Schneider
* "William Dembski's treatment of the No Free Lunch theorems is written in jello" - by No Free Lunch theorems co-founder David Wolpert
* The Evolution List - "Genetic ID and the Explanatory Filter", by Allen MacNeill
* Design Inference Website - the writings of William A. Dembski
* - Victor J. Stenger
* Darwin@Home Web site - open-source software that demonstrates evolution in artificial life, written by Gerald de Jong

{{DEFAULTSORT:Specified Complexity}}
Categories: Intelligent design, Creationist objections to evolution, Denialism, Pseudoscience