Structured Program Theorem
The structured program theorem, also called the Böhm–Jacopini theorem, is a result in programming language theory. It states that a class of control-flow graphs (historically called flowcharts in this context) can compute any computable function if it combines subprograms in only three specific ways (control structures). These are:
1. Executing one subprogram, and then another subprogram (sequence)
2. Executing one of two subprograms according to the value of a boolean expression (selection)
3. Repeatedly executing a subprogram as long as a boolean expression is true (iteration)
The structured chart subject to these constraints, particularly the loop constraint implying a single exit (as described later in this article), may however use additional variables in the form of bits (stored in an extra integer variable in the original proof) in order to keep track of information that the original program represents by the program location. The construction was based on Böhm's programming language ...
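As a rough illustration of the theorem (a minimal Python sketch, not the original Böhm–Jacopini construction), the fragment below rewrites a small flowchart with arbitrary jumps using only the three structures above. The variable pc, introduced here purely for illustration, plays the role of the extra state mentioned above: it records the current program location as ordinary data.

    # Only sequence, selection (if/elif) and a single iteration (while) appear.
    # `pc` holds the current flowchart node; node 3 is the single exit.
    def gcd_flowchart(a, b):
        pc = 0
        while pc != 3:                   # one loop with a single exit
            if pc == 0:                  # node 0: test whether b == 0
                pc = 3 if b == 0 else 1
            elif pc == 1:                # node 1: a, b = b, a mod b
                a, b = b, a % b
                pc = 2
            else:                        # node 2: unconditional jump to node 0
                pc = 0
        return a

    print(gcd_flowchart(48, 18))         # prints 6

Collapsing node 2 into node 1 would of course be simpler; the point of keeping it is that any jump target of the original flowchart can be encoded the same way, at the cost of the extra variable.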
Dexter Kozen
Dexter Campbell Kozen (born December 20, 1951) is an American theoretical computer scientist. He is Professor Emeritus and Joseph Newton Pew, Jr. Professor in Engineering at Cornell University.

Career
Kozen received his BA in mathematics from Dartmouth College in 1974 and his PhD in computer science in 1977 from Cornell University, where he was advised by Juris Hartmanis on the thesis, Complexity of Finitely Presented Algebras. He is known for his work at the intersection of logic and complexity. He is one of the fathers of dynamic logic and developed the version of the modal μ-calculus most used today. His work on Kleene algebra with tests was recognized with an Alonzo Church Award in 2022. Moreover, he has written several textbooks on the theory of computation, automata theory, dynamic logic, and algorithms. Kozen was a guitarist, singer, and songwriter in the band "Harmful if Swallowed". He also holds the position of faculty advisor for Cornell's rugby football club ...
Harlan Mills
Harlan D. Mills (May 14, 1919 – January 8, 1996) was professor of computer science at the Florida Institute of Technology and founder of Software Engineering Technology, Inc. of Vero Beach, Florida (since acquired by Q-Labs). Mills' contributions to software engineering have had a profound and enduring effect on education and industrial practice. Since earning his Ph.D. in Mathematics at Iowa State University in 1952, Mills led a distinguished career. As an IBM research fellow, Mills adapted existing ideas from engineering and computer science to software development. These included automata theory, the structured programming theory of Edsger Dijkstra, Robert W. Floyd, and others, and Markov chain-driven software testing. His Cleanroom software development process emphasized top-down design and formal specification. Mills contributed his ideas to the profession in six books and over fifty refereed articles in technical journals. Mills was termed a "super-programmer", ...
Edsger Dijkstra
Edsger Wybe Dijkstra (11 May 1930 – 6 August 2002) was a Dutch computer scientist, programmer, software engineer, mathematician, and science essayist. Born in Rotterdam in the Netherlands, Dijkstra studied mathematics and physics and then theoretical physics at the University of Leiden. Adriaan van Wijngaarden offered him a job as the first computer programmer in the Netherlands at the Mathematical Centre in Amsterdam, where he worked from 1952 until 1962. He formulated and solved the shortest path problem in 1956, and in 1960 developed the first compiler for the programming language ALGOL 60 in conjunction with colleague Jaap A. Zonneveld. In 1962 he moved to Eindhoven, and later to Nuenen, where he became a professor in the Mathematics Department at the Technische Hogeschool Eindhoven. In the late 1960s he built the THE multiprogramming system, which influenced the designs of subsequent systems through its use of software-based paged virtual memory. Dijkstra joined ...
Reversible Computing
Reversible computing is any model of computation where every step of the process is time-reversible. This means that, given the output of a computation, it's possible to perfectly reconstruct the input. In systems that progress deterministically from one state to another, a key requirement for reversibility is a one-to-one correspondence between each state and its successor. Reversible computing is considered an unconventional approach to computation and is closely linked to quantum computing, where the principles of quantum mechanics inherently ensure reversibility (as long as quantum states are not measured or "collapsed").

Reversibility
There are two major, closely related types of reversibility that are of particular interest for this purpose: physical reversibility and logical reversibility. A process is said to be physically reversible if it results in no increase in physical entropy; it is isentropic. There is a style of circuit design ideally exhibiting this ...
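As a small illustration of logical reversibility (a sketch only, not tied to any particular hardware or to the circuit-design style mentioned above), the controlled-NOT gate below maps each of its four input states to a distinct output state, so every step can be undone; a gate such as AND merges distinct inputs and is therefore not logically reversible.

    # The controlled-NOT (CNOT) gate maps (a, b) -> (a, a XOR b) on bits.
    # It is a bijection on the four possible states, so the input can always
    # be reconstructed from the output; in fact it is its own inverse.
    def cnot(a, b):
        return a, a ^ b

    for a in (0, 1):
        for b in (0, 1):
            out = cnot(a, b)
            assert cnot(*out) == (a, b)   # applying CNOT twice recovers (a, b)

    # By contrast, AND is irreversible: (0, 1) and (1, 0) both map to 0,
    # so the input cannot be recovered from the output alone.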
Program Transformation
A program transformation is any operation that takes a computer program and generates another program. In many cases the transformed program is required to be semantically equivalent to the original, relative to a particular formal semantics, and in fewer cases the transformations result in programs that semantically differ from the original in predictable ways. While the transformations can be performed manually, it is often more practical to use a program transformation system that applies specifications of the required transformations. Program transformations may be specified as automated procedures that modify compiler data structures (e.g. abstract syntax trees) representing the program text, or may be specified more conveniently using patterns or templates representing parameterized source code fragments. A practical requirement for source code transformation systems is that they be able to ef ...
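As a concrete example (a minimal Python sketch; nothing above prescribes a particular language or system), the transformation below operates directly on the abstract syntax tree: it folds additions of numeric literals while leaving the program's meaning unchanged.

    import ast

    class ConstantFolder(ast.NodeTransformer):
        # Fold additions of literal constants: a semantics-preserving
        # rewrite applied to the abstract syntax tree.
        def visit_BinOp(self, node):
            self.generic_visit(node)          # transform the children first
            if (isinstance(node.op, ast.Add)
                    and isinstance(node.left, ast.Constant)
                    and isinstance(node.right, ast.Constant)):
                return ast.copy_location(
                    ast.Constant(node.left.value + node.right.value), node)
            return node

    tree = ast.parse("y = (1 + 2) + x")
    tree = ast.fix_missing_locations(ConstantFolder().visit(tree))
    print(ast.unparse(tree))                  # y = 3 + x   (requires Python 3.9+)

A pattern- or template-based system would express the same rewrite more declaratively, roughly as a rule "constant + constant -> their sum", rather than as tree-walking code.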
Subgraph Isomorphism Problem
In theoretical computer science, the subgraph isomorphism problem is a computational task in which two graphs G and H are given as input, and one must determine whether G contains a subgraph that is isomorphic to H. Subgraph isomorphism is a generalization of both the maximum clique problem and the problem of testing whether a graph contains a Hamiltonian cycle, and is therefore NP-complete. However certain other cases of subgraph isomorphism may be solved in polynomial time. Sometimes the name subgraph matching is also used for the same problem. This name puts emphasis on finding such a subgraph as opposed to the bare decision problem.

Decision problem and computational complexity
To prove subgraph isomorphism is NP-complete, it must be formulated as a decision problem. The input to the decision problem is a pair of graphs G and H. The answer to the problem is positive if H is isomorphic to a subgraph of G, and negative otherwise. Formal question: Let G=(V,E), ...
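A minimal sketch of the decision problem (exhaustive search written in Python for illustration; practical solvers are far more sophisticated): try every injective mapping of H's vertices into G's vertices and accept if some mapping sends every edge of H to an edge of G. The factorial running time is consistent with the problem being NP-complete in general.

    from itertools import permutations

    def subgraph_isomorphic(g_nodes, g_edges, h_nodes, h_edges):
        # Does G contain a (not necessarily induced) subgraph isomorphic to H?
        g_edge_set = {frozenset(e) for e in g_edges}
        for image in permutations(g_nodes, len(h_nodes)):
            f = dict(zip(h_nodes, image))          # candidate injection H -> G
            if all(frozenset((f[u], f[v])) in g_edge_set for u, v in h_edges):
                return True
        return False

    # A 5-cycle contains a path on three vertices but no triangle.
    G = ([0, 1, 2, 3, 4], [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)])
    path = (["a", "b", "c"], [("a", "b"), ("b", "c")])
    triangle = (["x", "y", "z"], [("x", "y"), ("y", "z"), ("z", "x")])
    print(subgraph_isomorphic(*G, *path))          # True
    print(subgraph_isomorphic(*G, *triangle))      # False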
Structural Induction
Structural induction is a proof method that is used in mathematical logic (e.g., in the proof of Łoś' theorem), computer science, graph theory, and some other mathematical fields. It is a generalization of mathematical induction over natural numbers and can be further generalized to arbitrary Noetherian induction. Structural recursion is a recursion method bearing the same relationship to structural induction as ordinary recursion bears to ordinary mathematical induction. Structural induction is used to prove that some proposition holds for all of some sort of recursively defined structure, such as formulas, lists, or trees. A well-founded partial order is defined on the structures ("subformula" for formulas, "sublist" for lists, and "subtree" for trees). The structural induction proof is a proof that the proposition holds for all the minimal structures and that if it holds for the immediate substructures of a certain structure, then it must hold for that structure also. (Forma ...
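For a concrete flavour (an illustrative Python sketch; the proof method itself is language-independent), the definitions below follow the recursive structure of a binary tree, and the commented claim is exactly the kind of proposition one establishes by structural induction: verify it for the minimal structures and show that each constructor preserves it.

    from dataclasses import dataclass
    from typing import Union

    # A recursively defined structure: a tree is either a Leaf or a Node
    # with exactly two subtrees.
    @dataclass
    class Leaf:
        pass

    @dataclass
    class Node:
        left: "Tree"
        right: "Tree"

    Tree = Union[Leaf, Node]

    # Structural recursion: the definitions mirror the shape of the structure.
    def leaves(t: Tree) -> int:
        return 1 if isinstance(t, Leaf) else leaves(t.left) + leaves(t.right)

    def internal(t: Tree) -> int:
        return 0 if isinstance(t, Leaf) else 1 + internal(t.left) + internal(t.right)

    # Claim, provable by structural induction: leaves(t) == internal(t) + 1.
    #   Base case: a Leaf has 1 leaf and 0 internal nodes.
    #   Step: if the claim holds for both subtrees, it holds for the Node,
    #   since (l + 1) + (r + 1) == (1 + l + r) + 1.
    t = Node(Node(Leaf(), Leaf()), Leaf())
    assert leaves(t) == internal(t) + 1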
Pseudocode
In computer science, pseudocode is a description of the steps in an algorithm using a mix of conventions of programming languages (like assignment operator, conditional operator, loop) with informal, usually self-explanatory, notation of actions and conditions. Although pseudocode shares features with regular programming languages, it is intended for human reading rather than machine control. Pseudocode typically omits details that are essential for machine implementation of the algorithm, meaning that pseudocode can only be verified by hand. The programming language is augmented with natural language description details, where convenient, or with compact mathematical notation. The reasons for using pseudocode are that it is easier for people to understand than conventional programming language code and that it is an efficient and environment-independent description of the key principles of an algorithm. It is commonly used in textbooks and scientific publications to document ...
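To make the contrast concrete (an invented example, not taken from any particular textbook), here is a fragment of pseudocode for binary search followed by one possible Python rendering that supplies the machine-level details the pseudocode deliberately omits.

    # Pseudocode, written for human readers and not directly executable:
    #
    #     function find(sorted list A, target x)
    #         repeatedly compare x with the middle element of the unsearched part
    #         discard the half that cannot contain x
    #         answer the position of x, or "not found"
    #
    # One possible Python implementation, with the index arithmetic and the
    # termination condition spelled out:
    def find(a, x):
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == x:
                return mid
            if a[mid] < x:
                lo = mid + 1
            else:
                hi = mid - 1
        return None                        # "not found"

    print(find([2, 3, 5, 7, 11, 13], 7))   # prints 3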
Donald Knuth
Donald Ervin Knuth (born January 10, 1938) is an American computer scientist and mathematician. He is a professor emeritus at Stanford University. He is the 1974 recipient of the ACM Turing Award, informally considered the Nobel Prize of computer science. Knuth has been called the "father of the analysis of algorithms". Knuth is the author of the multi-volume work The Art of Computer Programming. He contributed to the development of the rigorous analysis of the computational complexity of algorithms and systematized formal mathematical techniques for it. In the process, he also popularized the asymptotic notation. In addition to fundamental contributions in several branches of theoretical computer science, Knuth is the creator of the TeX computer typesetting system, the related METAFONT font definition language and rendering system, and the Computer Modern family of typefaces. As a writer and scholar, Knuth created the WEB and CWEB computer programming systems des ...
Kleene's T Predicate
In computability theory, the T predicate, first studied by mathematician Stephen Cole Kleene, is a particular set of triples of natural numbers that is used to represent computable functions within formal theories of arithmetic. Informally, the T predicate tells whether a particular computer program will halt when run with a particular input, and the corresponding U function is used to obtain the results of the computation if the program does halt. As with the smn theorem, the original notation used by Kleene has become standard terminology for the concept. The predicate described here was presented in (Kleene 1943) and (Kleene 1952), and this is what is usually called "Kleene's T predicate". (Kleene 1967) uses the letter T to describe a different predicate related to computable functions, but which cannot be used to obtain Kleene's normal form theorem.

Definition
The definition depends on a suitable Gödel numbering that assigns natural numbers to computable ...
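As a loose, toy analogue in Python (a sketch of the idea only; Kleene's actual T predicate is an arithmetical predicate defined over Gödel numbers of computation histories), the code below fixes an explicit numbering of programs for a tiny one-register machine, lets x be the list of successive machine configurations, defines T(e, i, x) to hold when x is a complete halting run of program e on input i, and lets U read the output off the final configuration.

    # Toy machine: a program is a list of instructions over one accumulator.
    # Instructions: ("INC",), ("DEC",), ("JZ", target), ("HALT",).
    # A configuration is a pair (pc, acc).

    def step(prog, conf):
        pc, acc = conf
        op = prog[pc]
        if op[0] == "INC":
            return pc + 1, acc + 1
        if op[0] == "DEC":
            return pc + 1, max(acc - 1, 0)
        if op[0] == "JZ":
            return (op[1], acc) if acc == 0 else (pc + 1, acc)
        return conf                        # HALT: the configuration is final

    PROGRAMS = {7: [("INC",), ("INC",), ("HALT",)]}   # explicit toy numbering

    def T(e, i, x):
        # Toy analogue of Kleene's T: is x a complete halting computation of
        # program e on input i?  (Assumes the program counters in x are in range.)
        prog = PROGRAMS[e]
        if not x or x[0] != (0, i):
            return False
        for a, b in zip(x, x[1:]):
            if prog[a[0]][0] == "HALT" or step(prog, a) != b:
                return False
        return prog[x[-1][0]][0] == "HALT"

    def U(x):
        # Toy analogue of the U function: read the output off the record.
        return x[-1][1]

    def run(e, i, max_steps=100):
        # Produce a candidate computation record by actually running e on i.
        prog = PROGRAMS[e]
        conf, trace = (0, i), [(0, i)]
        while prog[conf[0]][0] != "HALT" and len(trace) < max_steps:
            conf = step(prog, conf)
            trace.append(conf)
        return trace

    x = run(7, 5)
    print(T(7, 5, x), U(x))                # True 7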
Stephen Kleene
Stephen Cole Kleene (January 5, 1909 – January 25, 1994) was an American mathematician. One of the students of Alonzo Church, Kleene, along with Rózsa Péter, Alan Turing, Emil Post, and others, is best known as a founder of the branch of mathematical logic known as recursion theory, which subsequently helped to provide the foundations of theoretical computer science. Kleene's work grounds the study of computable functions. A number of mathematical concepts are named after him: Kleene hierarchy, Kleene algebra, the Kleene star (Kleene closure), Kleene's recursion theorem and the Kleene fixed-point theorem. He also invented regular expressions in 1951 to describe McCulloch-Pitts neural networks, and made significant contributions to the foundations of mathematical intuitionism.

Biography
Kleene was awarded a bachelor's degree from Amherst College in 1930. He was awarded a Ph.D. in mathematics from Princeton University in 1934, where his thesis, entitled A Theory of Po ...