
Superoptimization is the process where a compiler automatically finds the optimal sequence of instructions to implement a given loop-free computation. Real-world compilers generally cannot produce genuinely ''optimal'' code: while standard compiler optimizations merely improve code, a superoptimizer's goal is to find the optimal sequence, the canonical form. Superoptimizers can also be used to improve conventional optimizers by highlighting missed opportunities, so that a human can write additional optimization rules.
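A classic illustration is the branch-free sign function: early superoptimizers famously discovered short instruction sequences computing signum without any comparison or branch. The Python sketch below is illustrative only (it emulates 32-bit two's-complement arithmetic and is not Massalin's original x86 sequence):

```python
def sign_naive(x):
    # straightforward, branching version of signum
    if x < 0:
        return -1
    if x > 0:
        return 1
    return 0

def sign_branchless(x):
    # branch-free signum over emulated 32-bit two's-complement values,
    # in the spirit of sequences found by superoptimizers
    MASK = 0xFFFFFFFF
    u = x & MASK
    neg = u >> 31                 # 1 iff x < 0
    pos = ((-u) & MASK) >> 31     # 1 iff x > 0
    return pos - neg

# the two versions agree on sample 32-bit inputs
assert all(sign_branchless(x) == sign_naive(x) for x in (-7, -1, 0, 1, 42))
```

A superoptimizer finds such a sequence not by clever reasoning but by searching the space of candidate programs until one provably matches the specification.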


History

The term superoptimization was first coined by Alexia Massalin in the 1987 paper ''Superoptimizer: A Look at the Smallest Program''. The label "program optimization" had been given to a field that does not aspire to true optimality but only to improvement; this misnomer led Massalin to call her system, which actually searches for an optimal program, a superoptimizer. In 1992, the GNU Superoptimizer (GSO) was developed for integration into the GNU Compiler Collection (GCC). Later work further developed and extended these ideas.


Techniques

Traditionally, superoptimization is performed by exhaustive brute-force search in the space of valid instruction sequences. This is costly and largely impractical for general-purpose compilers, yet it has been shown to be useful in optimizing performance-critical inner loops. It is also possible to attack the problem with an SMT solver, vastly improving search efficiency (although inputs more complex than a basic block remain out of reach). In 2001, goal-directed superoptimization was demonstrated in the Denali project at Compaq research. In 2006, answer set programming, a form of declarative programming, was applied to superoptimization in the ''Total Optimisation using Answer Set Technology'' (TOAST) project at the University of Bath. Superoptimization can also be used to automatically generate general-purpose peephole optimizers.
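As a concrete sketch of the brute-force approach, the toy Python superoptimizer below enumerates straight-line programs over a small hypothetical 8-bit instruction set (the operations and encoding are invented for illustration), in order of increasing length, and accepts the first program that matches the target function on every 8-bit input:

```python
from itertools import product

# Tiny straight-line IR: each instruction applies an op to previously
# computed values; all arithmetic is 8-bit so equivalence can be checked
# exhaustively over the whole input domain.
MASK = 0xFF

UNARY = {
    "neg": lambda a: (-a) & MASK,
    "not": lambda a: (~a) & MASK,
    "shr1": lambda a: a >> 1,
}
BINARY = {
    "add": lambda a, b: (a + b) & MASK,
    "and": lambda a, b: a & b,
    "xor": lambda a, b: a ^ b,
}

def run(prog, x):
    """Execute a straight-line program; value 0 is the input."""
    vals = [x]
    for op, *args in prog:
        if op in UNARY:
            vals.append(UNARY[op](vals[args[0]]))
        else:
            vals.append(BINARY[op](vals[args[0]], vals[args[1]]))
    return vals[-1]

def candidates(length):
    # Enumerate all programs of the given length; instruction i may
    # read any of the values 0..i computed so far.
    def instrs(i):
        for op in UNARY:
            for a in range(i + 1):
                yield (op, a)
        for op in BINARY:
            for a in range(i + 1):
                for b in range(i + 1):
                    yield (op, a, b)
    yield from product(*(instrs(i) for i in range(length)))

def superoptimize(target, max_len=3):
    """Brute-force search for the shortest program equal to `target`
    on every 8-bit input, in the spirit of Massalin's approach."""
    for length in range(1, max_len + 1):
        for prog in candidates(length):
            if all(run(prog, x) == target(x) for x in range(256)):
                return prog
    return None

# e.g. find a sequence computing x -> 2*x without a multiply:
print(superoptimize(lambda x: (2 * x) & MASK))  # → (('add', 0, 0),)
```

Real superoptimizers face the same combinatorial explosion this sketch hints at, which is why Massalin pruned with fast probabilistic tests before full verification, and why later systems replaced exhaustive testing with SMT-based equivalence checking or stochastic search.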


Publicly available superoptimizers

Several superoptimizers are available for free download.

* For the x86 family of instruction sets:
** GNU superoptimizer (superopt, GSO) (1992) – also supports many other ISAs
** STOKE, a stochastic optimizer for x86-64 assembly
* For ARM:
** Unbounded Superoptimizer, transforming LLVM IR into ARMv7-A assembly
* For embedded systems:
** PIC microcontroller SuperOptimizer (2003)
** A feasibility study by Embecosm (2014) for AVR, based on GSO
* For the JVM:
** A Clojure superoptimizer for the Java virtual machine (2012)
* For LLVM IR:
** Souper, a superoptimizer for programs in the LLVM intermediate language
* For WebAssembly:
** slumps, which provides superoptimization for WASM programs based on Souper


See also

* Peephole optimization
* Dead code elimination
* Metacompilation

