In data compression, a universal code for integers is a prefix code that maps the positive integers onto binary codewords, with the additional property that, whatever the true probability distribution on integers, as long as the distribution is monotonic (i.e., ''p''(''i'') ≥ ''p''(''i'' + 1) for all positive ''i''), the expected lengths of the codewords are within a constant factor of the expected lengths that the optimal code for that probability distribution would have assigned. A universal code is ''asymptotically optimal'' if the ratio between actual and optimal expected lengths is bounded by a function of the information entropy of the code that, in addition to being bounded, approaches 1 as entropy approaches infinity.
In general, most prefix codes for integers assign longer codewords to larger integers. Such a code can be used to efficiently communicate a message drawn from a set of possible messages, by simply ordering the set of messages by decreasing probability and then sending the index of the intended message. Universal codes are generally not used for precisely known probability distributions, and no universal code is known to be optimal for any distribution used in practice.
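As an illustration of such a code, a minimal sketch of Elias gamma coding (one of the universal codes listed below) is given here; it prefixes the binary representation of ''n'' with ⌊log₂ ''n''⌋ zeros, so larger integers get longer codewords:

```python
def elias_gamma_encode(n: int) -> str:
    """Encode a positive integer as an Elias gamma codeword (bit string)."""
    if n < 1:
        raise ValueError("Elias gamma codes only positive integers")
    binary = bin(n)[2:]                      # floor(log2 n) + 1 bits
    return "0" * (len(binary) - 1) + binary  # zero-prefix, then the value

def elias_gamma_decode(bits: str) -> int:
    """Decode a single Elias gamma codeword from the front of a bit string."""
    zeros = 0
    while bits[zeros] == "0":
        zeros += 1
    return int(bits[zeros:2 * zeros + 1], 2)

print(elias_gamma_encode(1))  # '1'
print(elias_gamma_encode(9))  # '0001001'
```

Because the code is a prefix code, codewords can be concatenated and still decoded unambiguously, which is what allows the "send the index of the intended message" scheme above.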
A universal code should not be confused with universal source coding, in which the data compression method need not be a fixed prefix code and the ratio between actual and optimal expected lengths must approach one. However, an asymptotically optimal universal code can be used on independent identically-distributed sources, by using increasingly large blocks, as a method of universal source coding.
Universal and non-universal codes
These are some universal codes for integers; an asterisk (*) indicates a code that can be trivially restated in lexicographical order, while a double dagger (‡) indicates a code that is asymptotically optimal:
* Elias gamma coding *
* Elias delta coding * ‡
* Elias omega coding * ‡
* Exp-Golomb coding *, which has Elias gamma coding as a special case. (Used in H.264/MPEG-4 AVC)
* Fibonacci coding
* Levenshtein coding * ‡, the original universal coding technique
* Byte coding, where a special bit pattern (with at least two bits) is used to mark the end of the code — for example, if an integer is encoded as a sequence of nibbles representing digits in base 15 instead of the more natural base 16, then the highest nibble value (i.e., a sequence of four ones in binary) can be used to indicate the end of the integer.
* Variable-length quantity
These are some non-universal codes:
* Unary coding, which is used in Elias codes
* Rice coding, which is used in the FLAC audio codec and which has unary coding as a special case
* Golomb coding, which has Rice coding and unary coding as special cases.
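The nesting of these non-universal codes can be sketched with a minimal Rice encoder (a Golomb code whose parameter is a power of two); setting ''k'' = 0 reduces it to plain unary coding:

```python
def rice_encode(n: int, k: int) -> str:
    """Rice code for n >= 0 with parameter k (a Golomb code with M = 2**k):
    the quotient n >> k in unary (ones terminated by a zero), followed by
    the low k bits of n. With k = 0 this reduces to plain unary coding."""
    q = n >> k
    remainder = format(n & ((1 << k) - 1), "b").zfill(k) if k > 0 else ""
    return "1" * q + "0" + remainder

print(rice_encode(3, 0))  # '1110' (unary: three ones, terminating zero)
print(rice_encode(9, 2))  # '11001' (quotient 2 in unary, remainder 01)
```

Larger ''k'' suits flatter geometric distributions; the codeword length still grows linearly in ''n'' for fixed ''k'', which is why these codes are not universal.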
Their nonuniversality can be observed by noticing that, if any of these are used to code the Gauss–Kuzmin distribution or the zeta distribution with parameter ''s'' = 2, the expected codeword length is infinite. For example, using unary coding on the zeta distribution yields an expected length of
:''E''(''l'') = Σ_{''k''=1..∞} ''k'' · 6/(π²''k''²) = (6/π²) · Σ_{''k''=1..∞} 1/''k'', which diverges (the harmonic series).
On the other hand, using the universal Elias gamma coding for the Gauss–Kuzmin distribution results in an expected codeword length (about 3.51 bits) near its entropy (about 3.43 bits).
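This contrast can be checked numerically with a small sketch (not from the original article): partial sums of the expected codeword length under the zeta distribution with ''s'' = 2 keep growing for unary coding but converge for Elias gamma coding:

```python
import math

def zeta2_pmf(k: int) -> float:
    """Zeta distribution with s = 2: P(k) = (6 / pi^2) / k^2."""
    return 6.0 / (math.pi ** 2 * k ** 2)

def expected_length(length_fn, n_terms: int) -> float:
    """Partial sum of the expected codeword length over k = 1..n_terms."""
    return sum(length_fn(k) * zeta2_pmf(k) for k in range(1, n_terms + 1))

unary_len = lambda k: k + 1                         # k ones plus a terminating zero
gamma_len = lambda k: 2 * (k.bit_length() - 1) + 1  # Elias gamma codeword length

for n in (10**2, 10**4, 10**6):
    print(n, round(expected_length(unary_len, n), 2),
             round(expected_length(gamma_len, n), 2))
# The unary column grows without bound (like (6/pi^2) ln N),
# while the Elias gamma column converges to a finite value.
```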
Relationship to practical compression
Huffman coding and arithmetic coding (when they can be used) give at least as good, and often better, compression than any universal code.
However, universal codes are useful when Huffman coding cannot be used — for example, when one does not know the exact probability of each message, but only knows the rankings of their probabilities.
Universal codes are also useful when Huffman codes are inconvenient. For example, when the transmitter but not the receiver knows the probabilities of the messages, Huffman coding requires an overhead of transmitting those probabilities to the receiver. Using a universal code does not have that overhead.
Each universal code, like each other self-delimiting (prefix) binary code, has its own "implied probability distribution" given by ''P''(''i'') = 2^(−''l''(''i'')), where ''l''(''i'') is the length of the ''i''th codeword and ''P''(''i'') is the corresponding symbol's probability. If the actual message probabilities are ''Q''(''i'') and the Kullback–Leibler divergence ''D''<sub>KL</sub>(''Q''‖''P'') is minimized by the code with ''P''(''i''), then the optimal Huffman code for that set of messages will be equivalent to that code. Likewise, how close a code is to optimal can be measured by this divergence. Since universal codes are simpler and faster to encode and decode than Huffman codes (which are, in turn, simpler and faster than arithmetic encoding), the universal code is preferable in cases where ''D''<sub>KL</sub>(''Q''‖''P'') is sufficiently small.
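The implied distribution and the divergence can be computed directly; this sketch (an illustration, with a hypothetical zeta-like source ''Q'' truncated to 1..1000) measures how far the Elias gamma code is from optimal for that source:

```python
import math

def gamma_length(n: int) -> int:
    """Elias gamma codeword length: 2*floor(log2 n) + 1 bits."""
    return 2 * (n.bit_length() - 1) + 1

N = 1000

# Implied (sub)probability of the Elias gamma code: P(i) = 2^(-l(i)),
# roughly the power law 1 / (2 i^2)
P = [2.0 ** -gamma_length(i) for i in range(1, N + 1)]

# A hypothetical source Q: zeta distribution with s = 2, truncated to 1..N
raw = [1.0 / i ** 2 for i in range(1, N + 1)]
Z = sum(raw)
Q = [q / Z for q in raw]

# D_KL(Q || P): extra bits per symbol paid for using this code
# instead of an optimal code for Q
d_kl = sum(q * math.log2(q / p) for q, p in zip(Q, P))
print(f"KL divergence: {d_kl:.3f} bits")
# Small, because the gamma code's implied power law is close to this source.
```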
For any geometric distribution (an exponential distribution on integers), a Golomb code is optimal. With universal codes, the implicit distribution is approximately a power law such as 1/''n''² (more precisely, a Zipf distribution).
For the Fibonacci code, the implicit distribution is approximately ''n''^(−''q''), with
:''q'' = 1/log₂(φ) ≈ 1.44,
where φ is the golden ratio. For the ternary comma code (i.e., encoding in base 3, represented with 2 bits per symbol), the implicit distribution is a power law with ''q'' = 2/log₂(3) ≈ 1.26. These distributions thus have near-optimal codes with their respective power laws.
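The Fibonacci exponent can be checked numerically; in this sketch (an illustration, not from the original article) the ratio of codeword length to log₂ ''n'' approaches 1/log₂(φ) ≈ 1.44, so the implied probability 2^(−length) behaves like ''n''^(−1.44):

```python
import math

def fib_code_length(n: int) -> int:
    """Bit length of the Fibonacci codeword for n >= 1: the Zeckendorf
    representation of n plus the terminating extra '1' bit."""
    fibs = [1, 2]
    while fibs[-1] < n:
        fibs.append(fibs[-1] + fibs[-2])
    while fibs[-1] > n:   # drop Fibonacci numbers larger than n
        fibs.pop()
    return len(fibs) + 1

# The ratio length / log2(n) slowly approaches 1 / log2(phi) ~ 1.44
for n in (100, 10_000, 1_000_000):
    print(n, fib_code_length(n), round(fib_code_length(n) / math.log2(n), 2))
```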
External links
* Data Compression, by Debra A. Lelewer and Daniel S. Hirschberg (University of California, Irvine)
* ''Information Theory, Inference, and Learning Algorithms'', by David MacKay, has a chapter on codes for integers, including an introduction to Elias codes.
* Кодирование целых чисел ("Coding of integers") has mostly English-language papers on universal and other integer codes.