The von Neumann architecture, also known as the von Neumann model or Princeton architecture, is a computer architecture based on a 1945 description by John von Neumann, and by others, in the ''First Draft of a Report on the EDVAC''. The document describes a design architecture for an electronic digital computer with these components:

* A processing unit with both an arithmetic logic unit and processor registers
* A control unit that includes an instruction register and a program counter
* Memory that stores data and instructions
* External mass storage
* Input and output mechanisms

The term "von Neumann architecture" has evolved to refer to any stored-program computer in which an instruction fetch and a data operation cannot occur at the same time (since they share a common bus). This is referred to as the von Neumann bottleneck, which often limits the performance of the corresponding system.

The design of a von Neumann architecture machine is simpler than that of a Harvard architecture machine, which is also a stored-program system, yet has one dedicated set of address and data buses for reading and writing to memory, and another set of address and data buses to fetch instructions.

A stored-program digital computer keeps both program instructions and data in read–write, random-access memory (RAM). Stored-program computers were an advancement over the program-controlled computers of the 1940s, such as the Colossus and the ENIAC. Those were programmed by setting switches and inserting patch cables to route data and control signals between various functional units. The vast majority of modern computers use the same memory for both data and program instructions, but have caches between the CPU and memory, and, for the caches closest to the CPU, have separate caches for instructions and data, so that most instruction and data fetches use separate buses (split cache architecture).
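The shared-memory principle can be made concrete with a small simulation. The following C sketch uses a toy, hypothetical instruction set invented purely for illustration (not any historical machine's): a tiny program and its data sit in one array, and a fetch-decode-execute loop runs over it, so instruction fetches and data accesses go through the same store, just as they share one bus in the architecture described above.

    /* Minimal sketch of a von Neumann-style machine (hypothetical toy ISA,
     * not any real instruction set): one memory array holds both the
     * program and its data, and a single fetch path serves both. */
    #include <stdio.h>

    enum { LOAD, ADD, STORE, HALT };           /* toy opcodes                  */

    int main(void) {
        /* One shared memory: cells 0-7 hold the program, 8-10 hold data.   */
        int mem[16] = {
            LOAD, 8,   ADD, 9,   STORE, 10,   HALT, 0,   /* program         */
            2, 3, 0                                        /* data            */
        };
        int pc = 0, acc = 0;                   /* program counter, accumulator */

        for (;;) {                             /* fetch-decode-execute cycle   */
            int op  = mem[pc++];               /* instruction fetch ...        */
            int arg = mem[pc++];               /* ... uses the same memory     */
            if      (op == LOAD)  acc = mem[arg];
            else if (op == ADD)   acc += mem[arg];
            else if (op == STORE) mem[arg] = acc;
            else break;                        /* HALT                         */
        }
        printf("mem[10] = %d\n", mem[10]);     /* prints 5                     */
        return 0;
    }

Because the program and its operands live in the same store, the same loop could just as easily read, copy, or overwrite the instructions themselves, which is the property that much of the rest of this article turns on.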


History

The earliest computing machines had fixed programs. Some very simple computers still use this design, either for simplicity or training purposes. For example, a desk calculator (in principle) is a fixed-program computer. It can do basic mathematics, but it cannot run a word processor or games. Changing the program of a fixed-program machine requires rewiring, restructuring, or redesigning the machine. The earliest computers were not so much "programmed" as "designed" for a particular task. "Reprogramming" – when possible at all – was a laborious process that started with flowcharts and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically rewiring and rebuilding the machine. It could take three weeks to set up and debug a program on ENIAC.

With the proposal of the stored-program computer, this changed. A stored-program computer includes, by design, an instruction set, and can store in memory a set of instructions (a program) that details the computation. A stored-program design also allows for self-modifying code. One early motivation for such a facility was the need for a program to increment or otherwise modify the address portion of instructions, which operators had to do manually in early designs. This became less important when index registers and indirect addressing became usual features of machine architecture. Another use was to embed frequently used data in the instruction stream using immediate addressing. Self-modifying code has largely fallen out of favor, since it is usually hard to understand and debug, as well as being inefficient under modern processor pipelining and caching schemes.
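As a concrete illustration of the address-modification trick mentioned above, the following C sketch (again a hypothetical toy instruction set, not a real one) sums an array by rewriting the operand field of one of its own ADD instructions on each pass, roughly the way early programs stepped through memory before index registers existed.

    /* Toy illustration (hypothetical ISA) of the historical trick described
     * above: the program steps through an array by rewriting the address
     * field of one of its own instructions. */
    #include <stdio.h>

    enum { LOAD, ADD, SUB, STORE, JNZ, HALT };

    int main(void) {
        int mem[32] = {
            LOAD, 27,  ADD, 24,   STORE, 27,       /* sum += array element       */
            LOAD, 3,   ADD, 29,   STORE, 3,        /* increment the ADD operand  */
            LOAD, 28,  SUB, 29,   STORE, 28,       /* decrement the loop counter */
            JNZ, 0,    HALT, 0,                    /* repeat until counter == 0  */
            0, 0,                                  /* 22-23: unused padding      */
            10, 20, 30,                            /* 24-26: array to be summed  */
            0,                                     /* 27: running sum            */
            3,                                     /* 28: loop counter           */
            1                                      /* 29: the constant 1         */
        };
        int pc = 0, acc = 0;

        for (;;) {
            int op = mem[pc++], a = mem[pc++];
            if      (op == LOAD)  acc = mem[a];
            else if (op == ADD)   acc += mem[a];
            else if (op == SUB)   acc -= mem[a];
            else if (op == STORE) mem[a] = acc;
            else if (op == JNZ)   { if (acc != 0) pc = a; }
            else break;                            /* HALT                       */
        }
        printf("sum = %d\n", mem[27]);             /* prints 60                  */
        return 0;
    }

An index register makes the rewrite unnecessary: the address in the instruction stays fixed, and the register supplies the changing offset.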


Capabilities

On a large scale, the ability to treat instructions as data is what makes assemblers, compilers, linkers, loaders, and other automated programming tools possible. It makes "programs that write programs" possible. This has allowed a sophisticated self-hosting computing ecosystem to flourish around von Neumann architecture machines. Some high-level languages leverage the von Neumann architecture by providing an abstract, machine-independent way to manipulate executable code at runtime (e.g., LISP), or by using runtime information to tune just-in-time compilation (e.g. languages hosted on the Java virtual machine, or languages embedded in web browsers).

On a smaller scale, some repetitive operations such as BITBLT or pixel and vertex shaders can be accelerated on general-purpose processors with just-in-time compilation techniques. This is one use of self-modifying code that has remained popular.
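A minimal sketch of "a program that writes a program", using the same hypothetical toy machine as above: because instructions are only numbers in memory, ordinary C code can assemble a routine into the shared memory array at run time and then execute it. Real assemblers, compilers and JIT systems are of course far more elaborate; this only shows the underlying possibility.

    /* Sketch of code generation on a von Neumann-style machine: instructions
     * are just data, so this program assembles a tiny toy-machine routine at
     * run time and then executes it. The instruction set is hypothetical. */
    #include <stdio.h>

    enum { LOAD, ADD, STORE, HALT };

    static int mem[64];
    static int here = 0;                       /* next free code cell           */

    static void emit(int op, int arg) {        /* the "assembler": writes an    */
        mem[here++] = op;                      /* instruction into ordinary     */
        mem[here++] = arg;                     /* data memory                   */
    }

    int main(void) {
        /* Generate code that sums the values stored at cells 40..42 into 43. */
        emit(LOAD, 40);
        emit(ADD, 41);
        emit(ADD, 42);
        emit(STORE, 43);
        emit(HALT, 0);

        mem[40] = 1; mem[41] = 2; mem[42] = 3; /* data used by generated code   */

        int pc = 0, acc = 0;                   /* now run what we just wrote    */
        for (;;) {
            int op = mem[pc++], a = mem[pc++];
            if      (op == LOAD)  acc = mem[a];
            else if (op == ADD)   acc += mem[a];
            else if (op == STORE) mem[a] = acc;
            else break;
        }
        printf("generated program computed %d\n", mem[43]);  /* prints 6       */
        return 0;
    }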


Development of the stored-program concept

The mathematician Alan Turing, who had been alerted to a problem of mathematical logic by the lectures of Max Newman at the University of Cambridge, wrote a paper in 1936 entitled ''On Computable Numbers, with an Application to the Entscheidungsproblem'', which was published in the ''Proceedings of the London Mathematical Society''. In it he described a hypothetical machine he called a ''universal computing machine'', now known as the "universal Turing machine". The hypothetical machine had an infinite store (memory in today's terminology) that contained both instructions and data. John von Neumann became acquainted with Turing while he was a visiting professor at Cambridge in 1935, and also during Turing's PhD year at the Institute for Advanced Study in Princeton, New Jersey, during 1936–1937. Whether he knew of Turing's 1936 paper at that time is not clear.

In 1936, Konrad Zuse also anticipated, in two patent applications, that machine instructions could be stored in the same storage used for data. Independently, J. Presper Eckert and John Mauchly, who were developing the ENIAC at the Moore School of Electrical Engineering of the University of Pennsylvania, wrote about the stored-program concept in December 1943. In planning a new machine, EDVAC, Eckert wrote in January 1944 that they would store data and programs in a new addressable memory device, a mercury delay-line memory. This was the first time the construction of a practical stored-program machine was proposed. At that time, he and Mauchly were not aware of Turing's work.

Von Neumann was involved in the Manhattan Project at the Los Alamos National Laboratory. The project required huge amounts of calculation, and this drew him to the ENIAC project during the summer of 1944. There he joined the ongoing discussions on the design of this stored-program computer, the EDVAC. As part of that group, he wrote up a description titled ''First Draft of a Report on the EDVAC'', based on the work of Eckert and Mauchly. It was unfinished when his colleague Herman Goldstine circulated it, and it bore only von Neumann's name (to the consternation of Eckert and Mauchly). The paper was read by dozens of von Neumann's colleagues in America and Europe, and it influenced the next round of computer designs.
Jack Copeland considers that it is "historically inappropriate to refer to electronic stored-program digital computers as 'von Neumann machines'". Von Neumann's Los Alamos colleague Stan Frankel recalled that von Neumann emphasized that the fundamental conception was owed to Turing.

At the time that the "First Draft" report was circulated, Turing was producing a report entitled ''Proposed Electronic Calculator''. It described, in engineering and programming detail, his idea of a machine he called the ''Automatic Computing Engine (ACE)''. He presented this to the executive committee of the British National Physical Laboratory on February 19, 1946. Although Turing knew from his wartime experience at Bletchley Park that what he proposed was feasible, the secrecy surrounding Colossus, which was subsequently maintained for several decades, prevented him from saying so. Various successful implementations of the ACE design were produced.

Both von Neumann's and Turing's papers described stored-program computers, but von Neumann's earlier paper achieved greater circulation, and the computer architecture it outlined became known as the "von Neumann architecture". In the 1953 publication ''Faster than Thought: A Symposium on Digital Computing Machines'' (edited by B. V. Bowden), a section in the chapter on ''Computers in America'' reads as follows:
The Machine of the Institute For Advanced Studies, Princeton

In 1945, Professor J. von Neumann, who was then working at the Moore School of Engineering in Philadelphia, where the E.N.I.A.C. had been built, issued on behalf of a group of his co-workers a report on the logical design of digital computers. The report contained a detailed proposal for the design of the machine that has since become known as the E.D.V.A.C. (electronic discrete variable automatic computer). This machine has only recently been completed in America, but the von Neumann report inspired the construction of the E.D.S.A.C. (electronic delay-storage automatic calculator) in Cambridge (see page 130).

In 1947, Burks, Goldstine and von Neumann published another report that outlined the design of another type of machine (a parallel machine this time) that would be exceedingly fast, capable perhaps of 20,000 operations per second. They pointed out that the outstanding problem in constructing such a machine was the development of suitable memory with instantaneously accessible contents. At first they suggested using a special vacuum tube, called the "Selectron", which the Princeton Laboratories of RCA had invented. These tubes were expensive and difficult to make, so von Neumann subsequently decided to build a machine based on the Williams memory. This machine, completed in June 1952 in Princeton, has become popularly known as the Maniac. The design of this machine inspired at least half a dozen machines now being built in America, all known affectionately as "Johniacs".
In the same book, the first two paragraphs of a chapter on ACE read as follows:
Automatic Computation at the National Physical Laboratory

One of the most modern digital computers which embodies developments and improvements in the technique of automatic electronic computing was recently demonstrated at the National Physical Laboratory, Teddington, where it has been designed and built by a small team of mathematicians and electronics research engineers on the staff of the Laboratory, assisted by a number of production engineers from the English Electric Company, Limited. The equipment so far erected at the Laboratory is only the pilot model of a much larger installation which will be known as the Automatic Computing Engine, but although comparatively small in bulk and containing only about 800 thermionic valves, as can be judged from Plates XII, XIII and XIV, it is an extremely rapid and versatile calculating machine.

The basic concepts and abstract principles of computation by a machine were formulated by Dr. A. M. Turing, F.R.S., in a paper read before the London Mathematical Society in 1936, but work on such machines in Britain was delayed by the war. In 1945, however, an examination of the problems was made at the National Physical Laboratory by Mr. J. R. Womersley, then superintendent of the Mathematics Division of the Laboratory. He was joined by Dr. Turing and a small staff of specialists, and, by 1947, the preliminary planning was sufficiently advanced to warrant the establishment of the special group already mentioned. In April, 1948, the latter became the Electronics Section of the Laboratory, under the charge of Mr. F. M. Colebrook.


Early von Neumann-architecture computers

The ''First Draft'' described a design that was used by many universities and corporations to construct their computers. Among these various computers, only ILLIAC and ORDVAC had compatible instruction sets.

* ARC2 (Birkbeck, University of London) officially came online on May 12, 1948.
* Manchester Baby (Victoria University of Manchester, England) made its first successful run of a stored program on June 21, 1948.
* EDSAC (University of Cambridge, England) was the first practical stored-program electronic computer (May 1949)
* Manchester Mark 1 (University of Manchester, England) developed from the Baby (June 1949)
* CSIRAC (Council for Scientific and Industrial Research) Australia (November 1949)
* MESM in Kyiv, Ukraine (November 1950)
* EDVAC (Ballistic Research Laboratory, Computing Laboratory at Aberdeen Proving Ground, 1951)
* ORDVAC (University of Illinois) at Aberdeen Proving Ground, Maryland (completed November 1951)
* IAS machine at Princeton University (January 1952)
* MANIAC I at Los Alamos Scientific Laboratory (March 1952)
* ILLIAC at the University of Illinois (September 1952)
* BESM-1 in Moscow (1952)
* AVIDAC at Argonne National Laboratory (1953)
* ORACLE at Oak Ridge National Laboratory (June 1953)
* BESK in Stockholm (1953)
* JOHNNIAC at RAND Corporation (January 1954)
* DASK in Denmark (1955)
* WEIZAC at the Weizmann Institute of Science in Rehovot, Israel (1955)
* PERM in Munich (1956)
* SILLIAC in Sydney (1956)


Early stored-program computers

The date information in the following chronology is difficult to put into proper order. Some dates are for first running a test program, some dates are the first time the computer was demonstrated or completed, and some dates are for the first delivery or installation.

* The IBM SSEC had the ability to treat instructions as data, and was publicly demonstrated on January 27, 1948. This ability was claimed in a US patent. However, it was partially electromechanical, not fully electronic. In practice, instructions were read from paper tape due to its limited memory.
* The ARC2, developed by Andrew Booth and Kathleen Booth at Birkbeck, University of London, officially came online on May 12, 1948. It featured the first rotating drum storage device.
* The Manchester Baby was the first fully electronic computer to run a stored program. It ran a factoring program for 52 minutes on June 21, 1948, after running a simple division program and a program to show that two numbers were relatively prime.
* The ENIAC was modified to run as a primitive read-only stored-program computer (using the Function Tables for program ROM) and was demonstrated as such on September 16, 1948, running a program by Adele Goldstine for von Neumann.
* The BINAC ran some test programs in February, March, and April 1949, although it was not completed until September 1949.
* The Manchester Mark 1 developed from the Baby project. An intermediate version of the Mark 1 was available to run programs in April 1949, but it was not completed until October 1949.
* The EDSAC ran its first program on May 6, 1949.
* The EDVAC was delivered in August 1949, but it had problems that kept it from being put into regular operation until 1951.
* The CSIR Mk I ran its first program in November 1949.
* The SEAC was demonstrated in April 1950.
* The Pilot ACE ran its first program on May 10, 1950, and was demonstrated in December 1950.
* The SWAC was completed in July 1950.
* The Whirlwind was completed in December 1950 and was in actual use in April 1951.
* The first ERA Atlas (later the commercial ERA 1101/UNIVAC 1101) was installed in December 1950.


Evolution

Through the decades of the 1960s and 1970s computers generally became both smaller and faster, which led to evolutions in their architecture. For example, memory-mapped I/O lets input and output devices be treated the same as memory. A single system bus could be used to provide a modular system with lower cost. This is sometimes called a "streamlining" of the architecture. In subsequent decades, simple microcontrollers would sometimes omit features of the model to lower cost and size. Larger computers added features for higher performance.
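A hedged sketch of memory-mapped I/O in C: the CPU talks to a device by reading and writing ordinary memory addresses. The register layout, the "UART" device, and the address shown in the comment are invented for illustration; on real hardware the pointer would be set to an address taken from the device's datasheet, and here a plain struct stands in for the device so the example can run anywhere.

    /* Illustrative sketch of memory-mapped I/O: a device is driven with
     * ordinary loads and stores. The register layout is hypothetical. */
    #include <stdint.h>
    #include <stdio.h>

    struct uart_regs {                 /* hypothetical device register layout */
        volatile uint32_t status;      /* bit 0: transmitter ready            */
        volatile uint32_t tx_data;     /* writing here "sends" a byte         */
    };

    static struct uart_regs fake_device = { .status = 1 };  /* simulation only */

    /* On a microcontroller this might instead be something like:
     *   #define UART ((struct uart_regs *)0x40001000)   -- address from the
     * device's datasheet; the 0x40001000 here is purely illustrative.       */
    #define UART (&fake_device)

    static void uart_putc(char c) {
        while ((UART->status & 1u) == 0) { }   /* poll until transmitter ready */
        UART->tx_data = (uint32_t)c;           /* a plain store is the I/O     */
    }

    int main(void) {
        const char *msg = "hi";
        for (const char *p = msg; *p; p++) {
            uart_putc(*p);
            printf("wrote 0x%02x to tx_data\n", (unsigned)fake_device.tx_data);
        }
        return 0;
    }

The appeal of this approach is exactly the "streamlining" mentioned above: the device needs no special instructions or separate I/O bus, only an address range on the existing system bus.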


Design limitations


Von Neumann bottleneck

The shared bus between the program memory and data memory leads to the ''von Neumann bottleneck'': the limited throughput (data transfer rate) between the central processing unit (CPU) and memory compared to the amount of memory. Because the single bus can only access one of the two classes of memory at a time, throughput is lower than the rate at which the CPU can work. This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data. The CPU is continually forced to wait for needed data to move to or from memory. Since CPU speed and memory size have increased much faster than the throughput between them, the bottleneck has become more of a problem, a problem whose severity increases with every new generation of CPU. The von Neumann bottleneck was described by John Backus in his 1977 ACM Turing Award lecture. According to Backus:

Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it.
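A rough, machine-dependent illustration of the data-traffic side of the bottleneck: the C sketch below does one addition per word fetched from a large array, so its speed is governed by how quickly words can cross the CPU-memory path rather than by arithmetic. The array size and the reported bandwidth figure are illustrative only and will differ from machine to machine.

    /* Hedged sketch: summing a large array does almost no arithmetic per
     * element, so the loop is limited mainly by memory traffic, not by the
     * CPU's ability to add. Absolute numbers depend on the machine. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void) {
        size_t n = 1u << 25;                    /* ~33 million doubles, ~256 MiB */
        double *a = malloc(n * sizeof *a);
        if (!a) return 1;
        for (size_t i = 0; i < n; i++) a[i] = 1.0;

        clock_t t0 = clock();
        double sum = 0.0;
        for (size_t i = 0; i < n; i++) sum += a[i];   /* one add per word fetched */
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

        /* Report the effective traffic between memory and the CPU. */
        double gbytes = (double)(n * sizeof *a) / 1e9;
        printf("sum=%.0f, %.2f s, about %.1f GB/s of memory traffic\n",
               sum, secs, gbytes / secs);
        free(a);
        return 0;
    }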


Mitigations

There are several known methods for mitigating the von Neumann performance bottleneck. For example, the following can all improve performance:

* providing a cache between the CPU and the main memory (a sketch after this section illustrates the effect of cache locality)
* providing separate caches or separate access paths for data and instructions (the so-called modified Harvard architecture)
* using branch predictor algorithms and logic
* providing a limited CPU stack or other on-chip scratchpad memory to reduce memory accesses
* implementing the CPU and the memory hierarchy as a system on chip, providing greater locality of reference and thus reducing latency and increasing throughput between processor registers and main memory

The problem can also be sidestepped somewhat by using parallel computing, for example with the non-uniform memory access (NUMA) architecture; this approach is commonly employed by supercomputers.

It is less clear whether the ''intellectual bottleneck'' that Backus criticized has changed much since 1977. Backus's proposed solution has not had a major influence. Modern functional programming and object-oriented programming are much less geared towards "pushing vast numbers of words back and forth" than earlier languages like FORTRAN were, but internally, that is still what computers spend much of their time doing, even highly parallel supercomputers. As of 1996, a database benchmark study found that three out of four CPU cycles were spent waiting for memory. Researchers expect that increasing the number of simultaneous instruction streams with multithreading or single-chip multiprocessing will make this bottleneck even worse (Sites, Richard L.; Patt, Yale. "Architects Look to Processors of Future". Microprocessor Report, 1996). In the context of multi-core processors, additional overhead is required to maintain cache coherence between processors and threads.
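The effect of caches and locality of reference, the first and last mitigations listed above, can be seen with a small experiment. The C sketch below walks the same matrix twice: once along rows (consecutive addresses, so each cache line fetched from memory is fully used) and once along columns (large strides, so most of each fetched line is wasted). The timings are machine-dependent, but the row-major walk is typically several times faster.

    /* Hedged sketch of why caches and locality help: both loops touch the
     * same elements, but the row-major walk reuses each cache line while
     * the column-major walk keeps missing, causing far more bus traffic. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 4096                                 /* 4096 x 4096 ints, ~64 MiB */

    static int *m;

    static double walk(int by_rows) {
        clock_t t0 = clock();
        long long sum = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                sum += by_rows ? m[i * N + j]      /* consecutive addresses      */
                               : m[j * N + i];     /* stride of N ints           */
        printf("sum=%lld ", sum);                  /* keeps the work observable  */
        return (double)(clock() - t0) / CLOCKS_PER_SEC;
    }

    int main(void) {
        m = calloc((size_t)N * N, sizeof *m);
        if (!m) return 1;
        printf("row-major:    %.3f s\n", walk(1));
        printf("column-major: %.3f s\n", walk(0));
        free(m);
        return 0;
    }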


Self-modifying code

Aside from the von Neumann bottleneck, program modifications can be quite harmful, either by accident or design. In some simple stored-program computer designs, a malfunctioning program can damage itself, other programs, or the operating system, possibly leading to a computer crash. Memory protection and other forms of access control can usually protect against both accidental and malicious program changes.
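A hedged, Unix-specific sketch of memory protection in C: the operating system marks a page read-only with mprotect, after which a stray store into it is trapped (the process would receive SIGSEGV) rather than silently corrupting code or data. The offending write is left commented out so the example runs to completion.

    /* POSIX sketch of memory protection: make a page read-only so that an
     * accidental write would be caught by the operating system. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void) {
        long pagesz = sysconf(_SC_PAGESIZE);
        char *page = mmap(NULL, (size_t)pagesz, PROT_READ | PROT_WRITE,
                          MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (page == MAP_FAILED) return 1;

        strcpy(page, "constant data");             /* writable for now           */

        if (mprotect(page, (size_t)pagesz, PROT_READ) != 0)  /* now read-only    */
            return 1;

        printf("page says: %s\n", page);            /* reading is still allowed  */
        /* page[0] = 'X';    <- this store would now be trapped by the OS        */

        munmap(page, (size_t)pagesz);
        return 0;
    }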


See also

* CARDboard Illustrative Aid to Computation
* Interconnect bottleneck
* Little man computer
* Random-access machine
* Harvard architecture
* Turing machine
* Eckert architecture


References


Further reading

* Backus, John. ''Can Programming be Liberated from the von Neumann Style?''. 1977 ACM Turing Award Lecture. Communications of the ACM, August 1978, Volume 21 (see https://www.cs.tufts.edu/~nr/backus-lecture.html).
* Bell, C. Gordon; Newell, Allen (1971). ''Computer Structures: Readings and Examples''. McGraw-Hill Book Company, New York (668 pages).


External links


* Harvard vs von Neumann
* A tool that emulates the behavior of a von Neumann machine
* JOHNNY: A simple Open Source simulator of a von Neumann machine for educational purposes