Metacompilation
Metacompilation (also called metacomputation) is a computation which involves metasystem transitions (MST) from a computing machine ''M'' to a metamachine ''M' '' which controls, analyzes and imitates the work of ''M''. Semantics-based program transformation, such as partial evaluation and supercompilation (SCP), is a form of metacomputation. Metasystem transitions may be repeated, as when a program transformer is itself transformed; in this manner MST hierarchies of any height can be formed. One strain of this research was started in Russia with Valentin Turchin's REFAL system in the late 1960s and early 1970s and became known for the development of supercompilation as a distinct method of program transformation; later work in this line concentrates on results and problems where supercompilation is combined with repeated metasystem transitions.
See also: Metacompiler, Partial evaluation
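The machine/metamachine relationship can be pictured with a small sketch (the base machine, its two-instruction programs, and the trace format below are assumptions invented for illustration, not anything from the supercompilation literature): a metamachine runs a base machine step by step while imitating, recording, and thereby analyzing its behaviour.

# Toy sketch of a machine M and a metamachine M' that controls, analyzes
# and imitates M. The instruction set and trace format are hypothetical.
def base_machine_step(state, instr):
    # One step of the base machine M: a counter machine with two instructions.
    op, arg = instr
    if op == "add":
        return state + arg
    if op == "mul":
        return state * arg
    raise ValueError(f"unknown instruction {op!r}")

def metamachine(program, state=0):
    # M' controls M (decides when to step), imitates it (re-executes its
    # steps), and analyzes it (collects a trace of every transition).
    trace = []
    for instr in program:
        new_state = base_machine_step(state, instr)
        trace.append((instr, state, new_state))
        state = new_state
    return state, trace

result, trace = metamachine([("add", 2), ("mul", 3), ("add", 1)])
print(result)  # 7
# Repeating the move, i.e. putting metamachine itself under the control of a
# further observing machine, would give an MST hierarchy of height two.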


Metacompiler
In computer science, a compiler-compiler or compiler generator is a programming tool that creates a parser, interpreter, or compiler from some form of formal description of a programming language and machine. The most common type of compiler-compiler is called a parser generator; it handles only syntactic analysis. A formal description of a language is usually a grammar used as input to a parser generator. It often resembles Backus–Naur form (BNF) or extended Backus–Naur form (EBNF), or has its own syntax. Grammar files describe the syntax of the generated compiler's target programming language and the actions that should be taken for its specific constructs. Source code for a parser of the programming language is returned as the parser generator's output. This source code can then be compiled into a parser, which may be either standalone or embedded. The compiled parser then accepts the source code of the target programming language as input and performs an action or produces output ...
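To make the grammar-to-parser pipeline concrete, here is a hedged sketch of the kind of recursive-descent parser a generator might emit for a toy EBNF-style grammar of arithmetic expressions (the grammar, the token handling, and all names are invented for this illustration; no particular parser generator produces exactly this code):

# Sketch of generator-style output for the toy grammar
#   expr   := term (("+" | "-") term)*
#   term   := factor (("*" | "/") factor)*
#   factor := NUMBER | "(" expr ")"
import re

TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    tokens = []
    for number, other in TOKEN.findall(src):
        if number:
            tokens.append(("NUM", int(number)))
        elif other.strip():
            tokens.append(("OP", other))
    tokens.append(("EOF", None))
    return tokens

class Parser:
    def __init__(self, tokens):
        self.tokens, self.pos = tokens, 0

    def peek(self):
        return self.tokens[self.pos]

    def take(self):
        tok = self.tokens[self.pos]
        self.pos += 1
        return tok

    def expr(self):
        # expr := term (("+" | "-") term)*
        value = self.term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            _, op = self.take()
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):
        # term := factor (("*" | "/") factor)*
        value = self.factor()
        while self.peek() in (("OP", "*"), ("OP", "/")):
            _, op = self.take()
            rhs = self.factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def factor(self):
        # factor := NUMBER | "(" expr ")"
        kind, val = self.take()
        if kind == "NUM":
            return val
        if (kind, val) == ("OP", "("):
            value = self.expr()
            assert self.take() == ("OP", ")"), "expected ')'"
            return value
        raise SyntaxError(f"unexpected token {val!r}")

print(Parser(tokenize("2 * (3 + 4) - 5")).expr())  # 9

A real parser generator would additionally build a syntax tree or invoke the semantic actions attached to each grammar rule; here each rule simply evaluates its operands.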


Computation
A computation is any type of arithmetic or non-arithmetic calculation that is well-defined. Common examples of computation are mathematical equation solving and the execution of computer algorithms. Mechanical or electronic devices (or, historically, people) that perform computations are known as ''computers''. Computer science is an academic field that involves the study of computation.
Introduction
The notion that mathematical statements should be 'well-defined' had been argued by mathematicians since at least the 1600s, but agreement on a suitable definition proved elusive. A candidate definition was proposed independently by several mathematicians in the 1930s. The best-known variant was formalised by the mathematician Alan Turing, who defined a well-defined statement or calculation as any statement that could be expressed in terms of the initialisation parameters of a Turing machine. Other (mathematically equivalent) definitions include Alonzo Church's ''lambda definability'' ...
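As a hedged illustration of "expressed in terms of the initialisation parameters of a Turing machine" (the simulator and the particular machine below, which computes n+1 on a unary tape, are assumptions chosen for the sketch):

# Minimal Turing machine simulator; the machine itself (states, alphabet,
# transition table) is an illustrative choice, not a canonical example.
def run_turing_machine(transitions, tape, state="start", head=0, blank="_"):
    cells = dict(enumerate(tape))
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1))

# Successor machine: skip the existing 1s, append one more 1, halt.
successor = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run_turing_machine(successor, "111"))  # 1111

The computation 3 + 1 is thus fully specified by the initialisation parameters: the transition table, the initial tape contents, and the starting state and head position.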


Metasystem
A metasystem or meta-system is a "system about other systems", such as one that describes, generalizes, models, or analyzes the other system(s). It links the concept of a system (a group of interacting or interrelated elements that act according to a set of rules to form a unified whole) with the concept of meta.


Semantic
Semantics is the study of linguistic meaning. It examines what meaning is, how words get their meaning, and how the meaning of a complex expression depends on its parts. Part of this process involves the distinction between sense and reference. Sense is given by the ideas and concepts associated with an expression, while reference is the object to which an expression points. Semantics contrasts with syntax, which studies the rules that dictate how to create grammatically correct sentences, and pragmatics, which investigates how people use language in communication. Lexical semantics is the branch of semantics that studies word meaning. It examines whether words have one or several meanings and in what lexical relations they stand to one another. Phrasal semantics studies the meaning of sentences by exploring the phenomenon of compositionality, or how new meanings can be created by arranging words. Formal semantics relies o ...




Partial Evaluation
In computing, partial evaluation is a technique for several different types of program optimization by specialization. The most straightforward application is to produce new programs that run faster than the originals while being guaranteed to behave in the same way. A computer program ''prog'' is seen as a mapping of input data into output data: prog : I_\text{static} \times I_\text{dynamic} \to O, where I_\text{static}, the ''static data'', is the part of the input data known at compile time. The partial evaluator transforms \langle prog, I_\text{static} \rangle into prog^* : I_\text{dynamic} \to O by precomputing all static input at compile time. prog^* is called the "residual program" and should run more efficiently than the original program. The act of partial evaluation is said to "residualize" prog to prog^*.
Futamura projections
A particularly interesting example of the use of partial evaluation, first described in the 1970s by Yoshihiko Futamura, is when ''prog'' is an interpreter for a programming language ...
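A hedged sketch of the idea (the function names are invented, and a hand-written specializer for one fixed program stands in for a real partial evaluator, which would operate on arbitrary program text):

# Illustrative only: specializing power(base, exponent) to a known static
# exponent by unrolling the loop and emitting residual source code.
def power(base, exponent):
    # prog : I_static x I_dynamic -> O, with exponent static and base dynamic
    result = 1
    for _ in range(exponent):
        result *= base
    return result

def specialize_power(exponent):
    # Toy "partial evaluator" for power: precompute everything that depends
    # only on the static exponent, leaving a residual program over base.
    body = " * ".join(["base"] * exponent) if exponent > 0 else "1"
    source = f"def power_{exponent}(base):\n    return {body}\n"
    namespace = {}
    exec(source, namespace)                      # compile the residual program
    return namespace[f"power_{exponent}"], source

power_5, residual_source = specialize_power(5)
print(residual_source)          # the unrolled residual definition power_5
print(power_5(2), power(2, 5))  # 32 32

The first Futamura projection applies the same construction with prog taken to be an interpreter and the static input a source program; the residual program is then a compiled version of that source program.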


Russia
Russia, or the Russian Federation, is a country spanning Eastern Europe and North Asia. It is the largest country in the world, extends across eleven time zones, and shares land borders with fourteen countries. Russia is the most populous country in Europe and the ninth-most populous country in the world. It is a highly urbanised country, with sixteen of its urban areas having more than 1 million inhabitants. Moscow, the most populous metropolitan area in Europe, is the capital and largest city of Russia, while Saint Petersburg is its second-largest city and cultural centre. Human settlement on the territory of modern Russia dates back to the ...


Valentin Turchin
Valentin Fyodorovich Turchin (14 February 1931 – 7 April 2010) was a Soviet and American physicist, cybernetician, and computer scientist. He developed the Refal programming language, the theory of metasystem transitions and the notion of supercompilation. He was a pioneer in artificial intelligence and a proponent of the global brain hypothesis.
Biography
Turchin was born in 1931 in Podolsk, Soviet Union. In 1952, he graduated from Moscow University with a degree in theoretical physics and got his Ph.D. in 1957. After working on neutron and solid-state physics at the Institute for Physics of Energy in Obninsk, in 1964 he accepted a position at the Keldysh Institute of Applied Mathematics in Moscow. There he worked on statistical regularization methods and authored REFAL, one of the first AI languages and the AI language of choice in the Soviet Union. In the 1960s, Turchin became politically active. In the fall of 1968, he wrote the pamphlet ''The Inertia of Fear'', ...


REFAL
Refal ("Recursive functions algorithmic language"; ) "is a functional programming language oriented toward symbolic computations", including " string processing, language translation, ndartificial intelligence". It is one of the oldest members of this family, first conceived of in 1966 as a theoretical tool, with the first implementation appearing in 1968. Refal was intended to combine mathematical simplicity with practicality for writing large and sophisticated programs. One of the first functional programming languages to do so, and unlike Lisp of its time, Refal is based on pattern matching. Its pattern matching works in conjunction with term rewriting. The basic data structure of Lisp and Prolog is a linear list built by cons operation in a sequential manner, thus with ''O(n)'' access to list's ''n''th element. Refal's lists are built and scanned from both ends, with pattern matching working for nested lists as well as the top-level one. In effect, the basic data structu ...




Program Transformation
A program transformation is any operation that takes a computer program and generates another program. In many cases the transformed program is required to be semantically equivalent to the original, relative to a particular formal semantics; in fewer cases the transformations result in programs that semantically differ from the original in predictable ways. While the transformations can be performed manually, it is often more practical to use a program transformation system that applies specifications of the required transformations. Program transformations may be specified as automated procedures that modify compiler data structures (e.g. abstract syntax trees) representing the program text, or may be specified more conveniently using patterns or templates representing parameterized source code fragments. A practical requirement for source code transformation systems is that they be able to effectively ...
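A hedged example of a transformation specified as an automated procedure over abstract syntax trees (using Python's standard ast module; the particular constant-folding rule is just an illustration, and ast.unparse needs Python 3.9 or later):

# Semantics-preserving AST transformation: fold <int> + <int> into one constant.
import ast

class FoldIntAddition(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)        # transform children first (bottom-up)
        if (isinstance(node.op, ast.Add)
                and isinstance(node.left, ast.Constant) and isinstance(node.left.value, int)
                and isinstance(node.right, ast.Constant) and isinstance(node.right.value, int)):
            return ast.copy_location(ast.Constant(node.left.value + node.right.value), node)
        return node

tree = ast.parse("y = (1 + 2) + x")
tree = ast.fix_missing_locations(FoldIntAddition().visit(tree))
print(ast.unparse(tree))   # y = 3 + x

A template- or pattern-based system would express the same rewrite declaratively, roughly "e1 + e2 -> literal(e1 + e2) when e1 and e2 are integer literals", instead of walking the tree by hand.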

