Program Refinement
Refinement is a generic term of computer science that encompasses various approaches for producing correct computer programs and for simplifying existing programs to enable their formal verification. In formal methods, program refinement is the verifiable transformation of an ''abstract'' (high-level) formal specification into a ''concrete'' (low-level) executable program. ''Stepwise refinement'' allows this process to be done in stages. Logically, refinement normally involves implication, but there can be additional complications. The progressive just-in-time preparation of the product backlog (requirements list) in agile software development approaches, such as Scrum, is also commonly described as refinement. Data refinement is used to convert an abstract data model (in terms of sets, for example) into implementable data structures (such as arrays), while operation refinement converts a specification of an operation on a system into an implementable program.
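As a minimal sketch of data refinement (the registry example and every name below are invented for illustration, not drawn from any particular formal notation), an abstract model phrased in terms of a mathematical set is refined into a concrete implementation backed by a sorted list, and an abstraction relation states when the two agree.

```python
# A hedged sketch of data refinement: an abstract model (a Python set)
# is refined into a concrete, implementable structure (a sorted list).
from bisect import bisect_left, insort


class AbstractRegistry:
    """Abstract specification: membership in a mathematical set."""

    def __init__(self):
        self.members = set()

    def register(self, name: str) -> None:
        self.members.add(name)

    def is_registered(self, name: str) -> bool:
        return name in self.members


class ConcreteRegistry:
    """Concrete refinement: the set is represented by a sorted,
    duplicate-free list, which supports binary search directly."""

    def __init__(self):
        self.members = []

    def register(self, name: str) -> None:
        i = bisect_left(self.members, name)
        if i == len(self.members) or self.members[i] != name:
            insort(self.members, name)

    def is_registered(self, name: str) -> bool:
        i = bisect_left(self.members, name)
        return i < len(self.members) and self.members[i] == name


def retrieves(concrete: ConcreteRegistry, abstract: AbstractRegistry) -> bool:
    """Abstraction relation: the concrete list, viewed as a set of its
    elements, must equal the abstract set."""
    return set(concrete.members) == abstract.members


a, c = AbstractRegistry(), ConcreteRegistry()
for n in ("carol", "alice", "bob"):
    a.register(n)
    c.register(n)
assert retrieves(c, a)          # the refinement preserves the abstract model
```

The refinement is justified exactly when every concrete operation preserves the abstraction relation, which is the sense in which the concrete program "implements" the abstract specification.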
Correctness (computer science)
In theoretical computer science, an algorithm is correct with respect to a specification if it behaves as specified. Best explored is ''functional'' correctness, which refers to the input–output behavior of the algorithm: for each input it produces an output satisfying the specification. Within the latter notion, ''partial correctness'', requiring that ''if'' an answer is returned it will be correct, is distinguished from ''total correctness'', which additionally requires that an answer ''is'' eventually returned, i.e. the algorithm terminates. Correspondingly, to prove a program's total correctness, it is sufficient to prove its partial correctness and its termination. The latter kind of proof (termination proof) can never be fully automated, since the halting problem is undecidable. For example, successively searching through the integers 1, 2, 3, … for an instance of some phenomenon is partially correct (any value returned does exhibit the phenomenon), but it is totally correct only if such an instance actually exists, so that the search terminates.
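The distinction can be made concrete with a small sketch (purely illustrative; the function name and the chosen property are invented): the search below is partially correct because any answer it returns satisfies the property tested, while total correctness would additionally require showing that a witness exists so the loop terminates.

```python
from itertools import count


def first_witness(property_holds):
    """Search 1, 2, 3, ... for a number satisfying the given property.

    Partial correctness: *if* a value is returned, it satisfies the
    property (immediate from the 'if' guard below).  Total correctness
    would additionally require a proof that a witness exists, so that
    the loop terminates.
    """
    for n in count(1):
        if property_holds(n):
            return n


# Terminates, because a witness exists:
print(first_witness(lambda n: n % 7 == 0 and n % 5 == 0))   # -> 35

# For a property with no known witness, the same code is still partially
# correct but may never return.
```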
Nondeterministic Algorithm
In computer science and computer programming, a nondeterministic algorithm is an algorithm that, even for the same input, can exhibit different behaviors on different runs, as opposed to a deterministic algorithm. Different models of computation give rise to different reasons that an algorithm may be non-deterministic, and to different ways to evaluate its performance or correctness:
* A concurrent algorithm can perform differently on different runs due to a race condition, a situation in which the system's substantive behavior depends on the sequence or timing of other uncontrollable events and may therefore be unexpected or inconsistent. This can happen even with a single-threaded algorithm when it interacts with resources external to it. In general, such an algorithm is considered to perform correctly only when ''all'' possible runs produce the desired results (see the sketch below).
* A probabilistic algorithm's behavior depends on a source of random numbers, so repeated runs on the same input can follow different paths.
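As a hedged illustration of the concurrent case referenced above (the counter example is invented, not taken from the article), the sketch below has two threads performing an unsynchronized read-modify-write on a shared counter, so the final value can differ from run to run.

```python
import threading

# A deliberately unsynchronized counter: the read-modify-write below is
# not atomic, so the two threads can interleave and lose updates.
counter = 0


def bump(times: int) -> None:
    global counter
    for _ in range(times):
        value = counter          # read
        counter = value + 1      # write: another thread may have run in between


threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# A deterministic, correctly synchronized version would always print 200000;
# because of the race, different runs can print different (smaller) values.
print(counter)
```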
B-Method
The B method is a method of software development based on B, a tool-supported formal method built around an abstract machine notation and used in the development of computer software. B was originally developed in the 1980s by Jean-Raymond Abrial in France and the UK. B is related to the Z notation (also originated by Abrial) and supports development of programming language code from specifications. B has been used in major safety-critical system applications in Europe (such as the automatic Paris Métro lines 14 and 1 and the Ariane 5 rocket). It has robust, commercially available tool support for specification, design, proof and code generation. Compared to Z, B is slightly lower-level and more focused on refinement to code rather than just formal specification; hence it is easier to correctly implement a specification written in B than one in Z. In particular, there is good tool support for this. The same language is used in specification, design and programming.
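B's abstract machine notation is not reproduced here; as a loose, hedged analogy only (written in Python, not B, with an invented lift example), the sketch below mimics the shape of a B machine: state variables, an invariant, and operations guarded by preconditions, with the invariant checked after each operation.

```python
# A loose analogy of a B-style abstract machine: state, invariant,
# and operations with preconditions.  Illustrative only.
class Lift:
    MIN_FLOOR, MAX_FLOOR = 0, 10

    def __init__(self):
        self.floor = 0                     # state variable
        assert self.invariant()

    def invariant(self) -> bool:
        # INVARIANT: the lift stays within the building
        return self.MIN_FLOOR <= self.floor <= self.MAX_FLOOR

    def up(self) -> None:
        # PRE: not already at the top floor
        assert self.floor < self.MAX_FLOOR
        self.floor += 1
        assert self.invariant()

    def down(self) -> None:
        # PRE: not already at the bottom floor
        assert self.floor > self.MIN_FLOOR
        self.floor -= 1
        assert self.invariant()


lift = Lift()
lift.up()
lift.up()
lift.down()
print(lift.floor)    # -> 1
```

In B itself the invariant and preconditions are discharged by proof obligations rather than runtime assertions; the assertions here only mimic that discipline dynamically.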
FermaT Transformation System
FermaT is an industrial-strength program transformation system developed by Martin Ward and colleagues and maintained in connection with Software Migrations Ltd. It is based on WSL, the Wide Spectrum Language, which can express both abstract specifications and low-level implementations in a single notation, and it works by applying a catalogue of proven correctness-preserving transformations to programs expressed in WSL. Typical applications include program comprehension, reverse engineering, and the migration of legacy code (for example, assembler) to modern high-level languages, making it a practical counterpart to the refinement and transformation techniques described in the entries above.
Hoare Logic
Hoare logic (also known as Floyd–Hoare logic or Hoare rules) is a formal system with a set of logical rules for reasoning rigorously about the correctness of computer programs. It was proposed in 1969 by the British computer scientist and logician Tony Hoare, and subsequently refined by Hoare and other researchers. The original ideas were seeded by the work of Robert W. Floyd, who had published a similar system for flowcharts. The central feature of Hoare logic is the Hoare triple. A triple describes how the execution of a piece of code changes the state of the computation. A Hoare triple is of the form
: \{P\}\ C\ \{Q\}
where P and Q are ''assertions'' and C is a ''command''. (Hoare originally placed the braces around the command, writing "P\ \{C\}\ Q" rather than "\{P\}\ C\ \{Q\}".) P is named the ''precondition'' and Q the ''postcondition'': when the precondition is met, executing the command establishes the postcondition. Assertions are formulae in predicate logic. Hoare logic provides axioms and inference rules for all the constructs of a simple imperative programming language.
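As a small worked example (the assignment axiom is standard in Hoare logic; the particular instance chosen here is illustrative):

```latex
% The assignment axiom schema of Hoare logic:
%   \{Q[E/x]\}\ x := E\ \{Q\}
% Instantiating Q as (x = 43) and E as (y + 1) gives the valid triple
\{\, y + 1 = 43 \,\}\ \ x := y + 1\ \ \{\, x = 43 \,\}
% i.e. if y = 42 holds before the assignment, then x = 43 holds after it.
```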
Formal System
A formal system is an abstract structure and formalization of an axiomatic system used for deducing, by means of rules of inference, theorems from axioms. In 1921, David Hilbert proposed to use formal systems as the foundation of knowledge in mathematics. The term ''formalism'' is sometimes a rough synonym for ''formal system'', but it also refers to a given style of notation, for example, Paul Dirac's bra–ket notation. A formal system has the following:
* A formal language, which is a set of well-formed formulas: strings of symbols from an alphabet, formed according to a formal grammar (consisting of production rules or formation rules).
* A deductive system (deductive apparatus, or proof system), which has rules of inference that take axioms and infer theorems, both of which are part of the formal language.
A formal system is said to be recursive (i.e. effective) or recursively enumerable if the set of axioms and the set of inference rules are decidable sets or semidecidable sets, respectively.
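A toy example (entirely invented for illustration): the sketch below implements a miniature string-rewriting formal system with a single axiom and two inference rules, and enumerates the theorems derivable from the axiom within a bounded number of steps.

```python
# A toy formal system: strings over {'a', 'b'}, one axiom, two rules.
AXIOM = "a"


def rules(theorem: str):
    """Inference rules: from a theorem t, derive t + 'b' and 'a' + t."""
    yield theorem + "b"
    yield "a" + theorem


def theorems(max_depth: int) -> set[str]:
    """Enumerate every theorem derivable from the axiom in <= max_depth steps."""
    derived = {AXIOM}
    frontier = {AXIOM}
    for _ in range(max_depth):
        frontier = {t for s in frontier for t in rules(s)} - derived
        derived |= frontier
    return derived


print(sorted(theorems(3)))   # includes 'a', 'aa', 'ab', 'aab', 'abb', ...
```

Here the formal language is the set of strings over the alphabet, the axiom set is a single string, and the rules play the role of the deductive apparatus; nothing about meaning is involved, which is exactly the point of a formal system.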
Refinement Calculus
The refinement calculus is a formalized approach to stepwise refinement for program construction. The required behaviour of the final executable program is specified as an abstract and perhaps non-executable "program", which is then refined by a series of correctness-preserving transformations into an efficiently executable program. Proponents include Ralph-Johan Back, who originated the approach in his 1978 PhD thesis ''On the Correctness of Refinement Steps in Program Development'', and Carroll Morgan, especially with his book ''Programming from Specifications'' (Prentice Hall, 2nd edition, 1994). In the latter case, the motivation was to link Abrial's specification notation Z to executable code, via a rigorous relation of behaviour-preserving program refinement.
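In Morgan's notation, a specification statement w:[pre, post] names the variables w that may change, a precondition, and a postcondition, and refinement (written ⊑) replaces it with something at least as deterministic that satisfies the specification. A small worked instance (the particular example is chosen here only for illustration):

```latex
% A specification statement asking that x end up strictly greater than
% its initial value x_0 (Morgan's zero-subscript convention):
%   x : [\,\mathrm{true},\; x > x_0\,]
% is refined by the executable assignment
x : [\,\mathrm{true},\; x > x_0\,] \;\sqsubseteq\; x := x + 1
% Every behaviour of x := x + 1 establishes the postcondition, so the
% step is correctness-preserving.
```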
Abstraction (computer science)
In software engineering and computer science, abstraction is the process of generalizing concrete details, such as attributes, away from the study of objects and systems in order to focus attention on details of greater importance. Abstraction is a fundamental concept in computer science and software engineering, especially within the object-oriented programming paradigm. Examples include:
* the use of abstract data types to separate usage from the working representations of data within programs;
* the concept of functions or subroutines, which represent a specific way of implementing control flow;
* the process of reorganizing common behavior from groups of non-abstract classes into abstract classes using inheritance and sub-classes, as seen in object-oriented programming languages (sketched below).
Computing mostly operates independently of the concrete world. The hardware implements a model of computation that is interchangeable with others. The software is structured in architectures that let developers manage enormous systems by concentrating on a few issues at a time.
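As a small illustration of the third item above (the class names are invented for the example), common behavior is factored into an abstract class while concrete subclasses supply the representation-specific details:

```python
from abc import ABC, abstractmethod
import math


class Shape(ABC):
    @abstractmethod
    def area(self) -> float: ...

    def describe(self) -> str:
        # Shared behavior, written once against the abstract interface.
        return f"{type(self).__name__} with area {self.area():.2f}"


class Circle(Shape):
    def __init__(self, radius: float):
        self.radius = radius

    def area(self) -> float:
        return math.pi * self.radius ** 2


class Rectangle(Shape):
    def __init__(self, width: float, height: float):
        self.width, self.height = width, height

    def area(self) -> float:
        return self.width * self.height


# Callers depend only on the abstraction, not on any concrete representation.
print(Circle(1.0).describe())
print(Rectangle(2.0, 3.0).describe())
```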
Retrenchment (computing)
Retrenchment is a technique associated with formal methods (in computer science, mathematically rigorous techniques for the specification, development, analysis, and verification of software and computer hardware). It was introduced to address some of the perceived limitations of formal, model-based refinement, for situations in which refinement might be regarded as desirable in principle but turns out to be unusable, or nearly unusable, in practice. It was primarily developed at the School of Computer Science, University of Manchester. The most up-to-date perspective is the ACM TOSEM article cited below.
External links: The Retrenchment Homepage; R. Banach, "Graded Refinement, Retrenchment and Simulation", ACM Transactions on Software Engineering and Methodology 32, 1–69 (2023).
Cliff Jones (computer scientist)
Clifford "Cliff" B. Jones (born 1 June 1944) is a British computer scientist, specializing in research into formal methods. He undertook a late Doctor of Philosophy, DPhil at the Oxford University Computing Laboratory (now the Oxford University Department of Computer Science) under Tony Hoare, awarded in 1981. Jones' thesis proposed an extension to Hoare logic for handling concurrent programs, rely/guarantee. Prior to his DPhil, Jones worked for IBM, between the Hursley and IBM Laboratory Vienna, Vienna Laboratories. In Vienna, Jones worked with Peter Lucas (computer scientist), Peter Lucas, Dines Bjørner and others on the Vienna Development Method (VDM), originally as a method for specifying the formal semantics of programming languages, and subsequently for specifying and verifying programs. Cliff Jones was a professor at the Victoria University of Manchester in the 1980s and early 1990s, worked in industry at Harlequin for a period, and is now a Professor of Computing Scie ... [...More Info...] [...Related Items...] OR: [Wikipedia] [Google] [Baidu] |
Reification (computer science)
In computer science, reification is the process by which an abstract idea about a program is turned into an explicit data model or other object created in a programming language. A computable/addressable object (a ''resource'') is created in a system as a proxy for a non-computable/addressable object. By means of reification, something that was previously implicit, unexpressed, and possibly inexpressible is explicitly formulated and made available to conceptual (logical or computational) manipulation. Informally, reification is often referred to as "making something a first-class citizen" within the scope of a particular system. Some aspect of a system can be reified at ''language design time'', which is related to reflection in programming languages. It can also be applied as a stepwise refinement at ''system design time''. Reification is one of the most frequently used techniques of conceptual analysis and knowledge representation.
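A minimal sketch of reification at the program level (the domain and all names are invented): an implicit notion, "an amount of money in some currency", often carried around as a bare float, is reified into an explicit object that the program itself can inspect and manipulate.

```python
from dataclasses import dataclass


# Before reification, "an amount of money in euros" exists only in the
# programmer's head: prices are bare floats.  Reifying the idea creates
# an explicit, addressable, first-class value.
@dataclass(frozen=True)
class Money:
    amount: float
    currency: str

    def converted(self, target: str, rate: float) -> "Money":
        return Money(self.amount * rate, target)


price = Money(9.99, "EUR")               # the concept is now a first-class value
print(price.converted("USD", 1.08))      # and can be manipulated as such
```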
Empty Set
In mathematics, the empty set or void set is the unique set having no elements; its size or cardinality (the count of elements in a set) is zero. Some axiomatic set theories ensure that the empty set exists by including an axiom of empty set, while in other theories its existence can be deduced. Many possible properties of sets are vacuously true for the empty set. Any set other than the empty set is called ''non-empty''. In some textbooks and popularizations, the empty set is referred to as the "null set". However, null set is a distinct notion within the context of measure theory, in which it describes a set of measure zero (which is not necessarily empty). Common notations for the empty set include "\{\}", "\emptyset", and "∅". The latter two symbols were introduced by the Bourbaki group (specifically André Weil) in 1939, inspired by the letter Ø in the Danish and Norwegian alphabets.
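For example, the claim that a property holds of every element of the empty set is vacuously true, because no element is available to serve as a counterexample:

```latex
% Vacuous truth over the empty set: the universal statement
\forall x \in \varnothing .\; P(x)
% holds for every property P, since it abbreviates
\forall x .\; \bigl(x \in \varnothing \rightarrow P(x)\bigr)
% and the antecedent x \in \varnothing is always false.
```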