In cybernetics, the term variety denotes the total number of distinguishable elements of a set, most often the set of states, inputs, or outputs of a finite-state machine or transformation, or the binary logarithm of the same quantity. Variety is used in cybernetics as an information-theoretic measure that is easily related to deterministic finite automata, and less formally as a conceptual tool for thinking about organization, regulation, and stability. It is an early theory of complexity in automata, complex systems, and operations research.


Overview

The term "variety" was introduced by W. Ross Ashby to extend his analysis of machines to their set of possible behaviors. Ashby says:
The word variety, in relation to a set of distinguishable elements, will be used to mean either (i) the number of distinct elements, or (ii) the logarithm to the base 2 of the number, the context indicating the sense used.
In the second case, variety is measured in bits. For example, a machine with states \{a, b, c, d\} has a variety of four states or two bits. The variety of a sequence or multiset is the number of distinct symbols in it. For example, the sequence a, b, c, c, c, d has a variety of four. As a measure of uncertainty, variety is directly related to information: a gain in information corresponds to an equal reduction in variety, \text{gain of information} = -\Delta(\text{variety}). Since the number of distinguishable elements depends on both the observer and the set, "the observer and his powers of discrimination may have to be specified if the variety is to be well defined".
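
The two senses of the definition can be made concrete with a short sketch. The following Python fragment (illustrative only; the helper names are not Ashby's) computes variety both as a count of distinct elements and in bits for the examples above.

```python
from math import log2

def variety(elements):
    """Variety in sense (i): the number of distinct elements."""
    return len(set(elements))

def variety_bits(elements):
    """Variety in sense (ii): the base-2 logarithm of the number of distinct elements."""
    return log2(variety(elements))

machine_states = ["a", "b", "c", "d"]
print(variety(machine_states), variety_bits(machine_states))  # 4 states, 2.0 bits

sequence = ["a", "b", "c", "c", "c", "d"]
print(variety(sequence))  # 4 distinct symbols
```
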
Gordon Pask distinguished between the variety of the chosen reference frame and the variety of the system the observer builds up within the reference frame. The reference frame consists of a state space and the set of measurements available to the observer, which have total variety \log_2(n), where n is the number of states in the state space. The system the observer builds up begins with the full variety \log_2(n), which is reduced as the observer loses uncertainty about the state by learning to predict the system. If the observer can perceive the system as a deterministic machine in the given reference frame, observation may reduce the variety to zero as the machine becomes completely predictable.

Laws of nature constrain the variety of phenomena by disallowing certain behavior. Ashby made two observations he considered laws of nature: the law of experience and the law of requisite variety. The law of experience holds that machines under input tend to lose information about their original state, and the law of requisite variety states a necessary, though not sufficient, condition for a regulator to exert anticipatory control by responding to its current input (rather than the previous output, as in error-controlled regulation).


Law of experience

The ''law of experience'' refers to the observation that the variety of states exhibited by a deterministic machine in isolation cannot increase, and that a set of identical machines fed the same inputs cannot exhibit increasing variety of states, but tend to synchronize instead.
Some name is necessary by which this phenomenon can be referred to. I shall call it the law of Experience. It can be described more vividly by the statement that information put in by change at a parameter tends to destroy and replace information about the system's initial state.
This is a consequence of the ''decay of variety'': a deterministic transformation cannot increase the variety of a set. As a result, an observer's uncertainty about the state of the machine either remains constant or decreases with time. Ashby shows that this holds for machines with inputs as well. Under any constant input P_1 the machines' states move toward any attractors that exist in the corresponding transformation, and some may synchronize at these points. If the input changes to some other input P_2 and the machines' behavior enacts a different transformation, more than one of these attractors may sit in the same basin of attraction under P_2. States which arrived and possibly synchronized at those attractors under P_1 then synchronize further under P_2. "In other words," Ashby says, "changes at the input of a transducer tend to make the system's state (at a given moment) less dependent on the transducer's individual initial state and more dependent on the particular sequence of parameter-values used as input." While there is a law of non-increase, there is only a tendency to decrease, since the variety can hold steady without decreasing if the set undergoes a one-to-one transformation, or if the states have synchronized into a subset for which this is the case. In the formal language analysis of finite machines, an input sequence that synchronizes identical machines (no matter the variety of their initial states) is called a synchronizing word.
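
This behaviour can be checked directly on a small deterministic machine. The sketch below (Python; the four-state machine is the Černý automaton, used here purely for illustration) starts one copy of the machine in every possible state, feeds all copies the same input word, and reports the variety of their states, which never increases and collapses to one after a synchronizing word.

```python
# Transition table of a four-state machine (Cerny automaton C_4):
# input 'a' cycles the states, input 'b' maps state 0 to 1 and fixes the rest.
DELTA = {
    "a": {0: 1, 1: 2, 2: 3, 3: 0},
    "b": {0: 1, 1: 1, 2: 2, 3: 3},
}

def run(states, word):
    """Apply an input word to identical machines, one per starting state,
    reporting the variety (number of distinct states) after each symbol."""
    current = set(states)
    print(f"start: variety = {len(current)}")
    for symbol in word:
        current = {DELTA[symbol][s] for s in current}
        print(f"after {symbol!r}: variety = {len(current)}")
    return current

# "baaabaaab" is a synchronizing (reset) word for this machine:
# whatever the initial states, every copy ends in the same state.
final = run({0, 1, 2, 3}, "baaabaaab")
print("synchronized to:", final)  # {1}
```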


Law of requisite variety

Ashby used variety to analyze the problem of regulation by considering a two-player game, where one player, D, supplies disturbances which another player, R, must regulate to ensure acceptable outcomes. D and R each have a set of available moves, which choose the outcome from a table with as many rows as D has moves and as many columns as R has moves. R is allowed full knowledge of D's move, and must pick moves in response so that the outcome is acceptable. Since many games pose no difficulty for R, the table is chosen so that no outcome is repeated in any column, which ensures that in the corresponding game any change in D's move means a change in outcome, unless R has a move to keep the outcome from changing. With this restriction, if R never changes moves, the outcome fully depends on D's choice, while if multiple moves are available to R it can reduce the variety of outcomes, if the table allows it, dividing by as much as its own variety of moves.

\begin{array}{cc|ccc}
 & & & R & \\
 & & \alpha & \beta & \gamma \\
\hline
D & 1 & \mathbf{b} & f & d \\
 & 2 & \mathbf{a} & e & c \\
 & 3 & c & d & \mathbf{a} \\
 & 4 & d & c & \mathbf{b} \\
 & 5 & e & \mathbf{a} & f \\
 & 6 & f & \mathbf{b} & e \\
\end{array}

The ''law of requisite variety'' is that a deterministic strategy for R can at best limit the variety in outcomes to \tfrac{V_D}{V_R}, and only adding variety in R's moves can reduce the variety of outcomes: "only variety can destroy variety". For example, in the table above, R has a strategy (shown in bold) to reduce the variety in outcomes to |\{a, b\}| = 2 = \tfrac{6}{3}, which is \tfrac{V_D}{V_R} in this case. It is not possible for R to reduce the outcomes any further and still respond to all potential moves from D, but it is possible that another table of the same shape would not allow R to do so well. Requisite variety is necessary, but not sufficient, to control the outcomes. If R and D are machines, they cannot possibly choose more moves than they have states. Thus, a perfect regulator must have at least as many distinguishable states as the phenomenon it is intended to regulate (the table must be square, or wider). Stated in bits, the law is V_O \ge V_D - V_R.

In Shannon's information theory, D, R, and E are information sources. The condition that, if R never changes moves, the uncertainty in outcomes is no less than the uncertainty in D's move is expressed as H(E \mid R) \ge H(D \mid R), and since R's strategy is a deterministic function of D, set H(R \mid D) = 0. With the rules of the game expressed this way, it can be shown that H(E) \ge H(D) - H(R). Ashby described the law of requisite variety as related to the tenth theorem in Shannon's ''Mathematical Theory of Communication'' (1948):
This law (of which Shannon's theorem 10 relating to the suppression of noise is a special case) says that if a certain quantity of disturbance is prevented by a regulator from reaching some essential variables, then that regulator must be capable of exerting at least that quantity of selection.
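
The bound can be verified by brute force on the example table (a minimal sketch in Python; the letters in the bold cells follow the reconstruction above, and any consistent assignment gives the same count). The code enumerates every deterministic strategy for R, i.e. every assignment of a column to each of D's rows, and finds the smallest achievable variety of outcomes.

```python
from itertools import product
from math import log2

# Outcome table: rows are D's moves 1..6, columns are R's moves (alpha, beta, gamma).
TABLE = [
    ["b", "f", "d"],
    ["a", "e", "c"],
    ["c", "d", "a"],
    ["d", "c", "b"],
    ["e", "a", "f"],
    ["f", "b", "e"],
]
n_rows, n_cols = len(TABLE), len(TABLE[0])

# A deterministic strategy for R picks one column for each of D's rows.
best = min(
    len({TABLE[row][strategy[row]] for row in range(n_rows)})
    for strategy in product(range(n_cols), repeat=n_rows)
)
print("minimum outcome variety:", best)  # 2, i.e. V_D / V_R = 6 / 3

# Stated in bits, V_O >= V_D - V_R; here the bound is met with equality.
print(f"V_O = {log2(best):.3f} bits, V_D - V_R = {log2(n_rows) - log2(n_cols):.3f} bits")
```
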
Ashby saw this law as relevant to problems in biology such as homeostasis, and to a "wealth of possible applications". Later, in 1970, Conant, working with Ashby, produced the good regulator theorem, which requires autonomous systems to acquire an internal model of their environment in order to persist and achieve stability (e.g. the Nyquist stability criterion) or dynamic equilibrium.

Boisot and McKelvey updated this law to the "law of requisite complexity", which holds that, in order to be efficaciously adaptive, the internal complexity of a system must match the external complexity it confronts. A further practical application of this law is the view that information systems (IS) alignment is a continuous coevolutionary process that reconciles top-down "rational designs" and bottom-up "emergent processes" of consciously and coherently interrelating all components of the Business/IS relationships in order to contribute to an organization's performance over time. The application of the law of requisite complexity in project management is the model of positive, appropriate and negative complexity proposed by Stefan Morcov.


Applications

Applications to organization and management were immediately apparent to Ashby. One implication is that individuals have a finite capacity for processing information, and beyond this limit what matters is the organization between individuals.
Thus the limitation which holds over a team of ''n'' men may be much higher, perhaps ''n'' times as high, as the limitation holding over the individual man. To make use of the higher limit, however, the team must be efficiently organized; and until recently our understanding of organization has been pitifully small.
Stafford Beer took up this analysis in his writings on management cybernetics. Beer defines variety as "the total number of ''possible'' states of a system, or of an element of a system" (Beer 1981). Beer restates the Law of Requisite Variety as "Variety absorbs variety." Stated more simply, the logarithmic measure of variety represents the minimum number of choices (by binary chop) needed to resolve uncertainty.
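
As an illustration of this logarithmic reading (a minimal sketch, not drawn from Beer's texts), binary chop resolves uncertainty over N equally likely alternatives in about log2 N yes/no questions.

```python
from math import ceil, log2

def questions_needed(n_alternatives: int) -> int:
    """Minimum number of yes/no (binary chop) questions needed to single out
    one of n equally likely alternatives."""
    return ceil(log2(n_alternatives))

def binary_chop(candidates, target):
    """Repeatedly halve the candidate list, counting the questions asked."""
    questions = 0
    low, high = 0, len(candidates) - 1
    while low < high:
        mid = (low + high) // 2
        questions += 1
        if target <= candidates[mid]:   # "is it in the lower half?"
            high = mid
        else:
            low = mid + 1
    return candidates[low], questions

print(questions_needed(1000))               # 10 questions for 1000 alternatives
print(binary_chop(list(range(1000)), 637))  # (637, 10)
```
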
Beer used this to allocate the management resources necessary to maintain process viability. The cybernetician Frank George discussed the variety of teams competing in games like football or rugby to produce goals or tries. A winning chess player might be said to have more variety than his losing opponent; here a simple ordering is implied. The attenuation and amplification of variety were major themes in Stafford Beer's work in management (the profession of control, as he called it). The number of staff needed to answer telephones, control crowds or tend to patients are clear examples.

The application of natural and analogue signals to variety analysis requires an estimate of Ashby's "powers of discrimination" (see the quote above). Given the butterfly effect of dynamical systems, care must be taken before quantitative measures can be produced: small quantities, which might be overlooked, can have big effects. In his ''Designing Freedom'' Stafford Beer discusses the patient in a hospital with a temperature denoting fever. Action must be taken immediately to isolate the patient. Here no amount of variety recording the ''patients' average temperature'' would detect this small signal, which might have a big effect. Monitoring is required on individuals, thus amplifying variety (see ''Algedonic alerts'' in the viable system model or VSM). Beer's work in management cybernetics and the VSM is largely based on variety engineering.

Further applications involving Ashby's view of state counting include the analysis of digital bandwidth requirements, redundancy and software bloat, the bit representation of data types and indexes, analogue-to-digital conversion, the bounds on finite state machines, and data compression. See also, e.g., Excited state, State (computer science), State pattern, State (controls) and Cellular automaton.
Requisite variety can be seen in Chaitin's algorithmic information theory, where a longer, higher-variety program or finite state machine produces incompressible output with more variety or information content. In general, a description of the required inputs and outputs is established and then encoded with the minimum variety necessary. The mapping of input bits to output bits can then produce an estimate of the minimum hardware or software components necessary to produce the desired control behaviour, for example in a piece of computer software or computer hardware.
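
A sketch of this state-counting estimate is shown below (Python; the control scenario and all names are hypothetical, chosen only to illustrate the idea). The minimum variety a controller needs can be read off a description of the required input-to-output mapping: it must discriminate at least as many input situations as demand different responses.

```python
from math import ceil, log2

# Hypothetical requirement: desired response for each distinguishable input situation.
REQUIRED_RESPONSE = {
    "temp_low":      "heat_on",
    "temp_ok":       "idle",
    "temp_high":     "cool_on",
    "temp_critical": "shutdown",
    "sensor_fault":  "shutdown",
}

input_variety = len(REQUIRED_RESPONSE)                 # distinguishable situations
output_variety = len(set(REQUIRED_RESPONSE.values()))  # distinct responses needed

print(f"input variety:  {input_variety} states, {ceil(log2(input_variety))} bits")
print(f"output variety: {output_variety} states, {ceil(log2(output_variety))} bits")
# A memoryless controller implementing this mapping must discriminate at least
# ceil(log2(input_variety)) bits of input and produce at least
# ceil(log2(output_variety)) bits of output.
```
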
Variety is one of nine requisites that are required by an ethical regulator (M. Ashby, "Ethical Regulators and Super-Ethical Systems", 2017).


See also

* Cardinality
* Complexity
* Degrees of freedom
* Power set
* Practopoiesis
* Waterbed theory
* Good regulator
* Ethical regulator
* State (computer science)
* Myhill–Nerode theorem
* Space complexity
* Project complexity


Further reading

* Ashby, W. R. 1956, ''An Introduction to Cybernetics'', Chapman & Hall (also available in electronic form as a PDF from ''Principia Cybernetica'').
* Ashby, W. R. 1958, "Requisite Variety and its implications for the control of complex systems", ''Cybernetica'' (Namur), Vol. 1, No. 2.
* Ashby, W. R. 1960, ''Design for a Brain: The Origin of Adaptive Behavior'', 2nd ed. Electronic versions on the Internet Archive.
* Beer, S. 1974, ''Designing Freedom'', CBC Learning Systems, Toronto, 1974; and John Wiley, London and New York, 1975. Translated into Spanish and Japanese.
* Beer, S. 1975, ''Platform for Change'', John Wiley, London and New York. Reprinted with corrections 1978.
* Beer, S. 1979, ''The Heart of Enterprise'', John Wiley, London and New York. Reprinted with corrections 1988.
* Beer, S. 1981, ''Brain of the Firm'', 2nd ed. (much extended), John Wiley, London and New York. Reprinted 1986, 1988. Translated into Russian.
* Beer, S. 1985, ''Diagnosing the System for Organisations'', John Wiley, London and New York. Translated into Italian and Japanese. Reprinted 1988, 1990, 1991.
* Conant, R. 1981, ''Mechanisms of Intelligence: Ross Ashby's Papers and Writings'', Intersystems Publications.


External links



* Entry on variety in the ''Principia Cybernetica Web'', 2001.
* "Systems concepts and 9/11": Allenna Leonard on requisite variety.
* All references to the Law of Requisite Variety in Ross Ashby's journal, 1953–1961.
* "Management Cybernetics: The Law of Requisite Variety": short introductory videos by Livas on YouTube.
* Practopoiesis: how biological systems get their variety.
* The 1973 CBC Massey Lectures, ''Designing Freedom''.