In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused. This measure, first introduced by Körner in the 1970s, has since also proven useful in other settings, including combinatorics.


Definition

Let G = (V, E) be an undirected graph. The graph entropy of G, denoted H(G), is defined as

:H(G) = \min_{X, Y} I(X ; Y)

where X is chosen uniformly from V, Y ranges over independent sets of G, the joint distribution of X and Y is such that X \in Y with probability one, and I(X ; Y) is the mutual information of X and Y.[G. Simonyi, "Perfect graphs and graph entropy. An updated survey," Perfect Graphs, John Wiley and Sons (2001) pp. 293–328, Definition 2]

That is, if we let \mathcal{I} denote the family of independent vertex sets in G, we wish to find a joint distribution (X, Y) on V \times \mathcal{I} with the lowest mutual information such that (i) the marginal distribution of the first coordinate is uniform over V and (ii) in samples from the distribution, the second coordinate almost surely contains the first. The mutual information of X and Y is then called the entropy of G.
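Any single coupling satisfying the two constraints gives an upper bound on H(G), since the definition minimizes over all such couplings. The sketch below (an illustration, not a computation of the exact minimum) evaluates I(X ; Y) for one admissible coupling on the path graph 0–1–2, where the independent sets {0, 2} and {1} cover all vertices:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, given a joint distribution as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Path graph 0-1-2: X is uniform on {0, 1, 2}; Y is an independent
# set containing X. Pick Y = {0, 2} when X is 0 or 2, Y = {1} when X = 1.
joint = {
    (0, frozenset({0, 2})): 1 / 3,
    (2, frozenset({0, 2})): 1 / 3,
    (1, frozenset({1})):    1 / 3,
}
bound = mutual_information(joint)  # log2(3) - 2/3, an upper bound on H(P_3)
```

Here I(X ; Y) = H(X) − H(X | Y) = log₂ 3 − 2/3 ≈ 0.918 bits; the true graph entropy is the minimum of this quantity over all admissible couplings.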


Properties

* Monotonicity. If G_1 is a subgraph of G_2 on the same vertex set, then H(G_1) \leq H(G_2).
* Subadditivity. Given two graphs G_1 = (V, E_1) and G_2 = (V, E_2) on the same set of vertices, the graph union G_1 \cup G_2 = (V, E_1 \cup E_2) satisfies H(G_1 \cup G_2) \leq H(G_1) + H(G_2).
* Arithmetic mean of disjoint unions. Let G_1, G_2, \ldots, G_k be a sequence of graphs on disjoint sets of vertices, with n_1, n_2, \ldots, n_k vertices, respectively, and let n = \sum_{i=1}^{k} n_i. Then H(G_1 \cup G_2 \cup \cdots \cup G_k) = \tfrac{1}{n} \sum_{i=1}^{k} n_i H(G_i).

Additionally, simple formulas exist for certain families of graphs.
* Complete balanced k-partite graphs have entropy \log_2 k. In particular:
** Edge-less graphs have entropy 0.
** Complete graphs on n vertices have entropy \log_2 n.
** Complete balanced bipartite graphs have entropy 1.
* Complete bipartite graphs with n vertices in one part and m in the other have entropy H\left(\tfrac{n}{n+m}\right), where H is the binary entropy function.
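The closed forms above combine with the disjoint-union rule to give exact entropies for unions of complete graphs. A minimal sketch (the helper names are illustrative, not from any library):

```python
import math

def complete_graph_entropy(n):
    """H(K_n) = log2(n), per the closed form for complete graphs."""
    return math.log2(n)

def disjoint_union_entropy(parts):
    """Arithmetic-mean rule: H of a disjoint union is (1/n) * sum n_i * H(G_i),
    where parts is a list of (n_i, H(G_i)) pairs and n = sum n_i."""
    total = sum(n for n, _ in parts)
    return sum(n * h for n, h in parts) / total

# Disjoint union of K_2 and K_4 has 6 vertices and entropy
# (2 * log2(2) + 4 * log2(4)) / 6 = 10/6 bits.
h = disjoint_union_entropy([(2, complete_graph_entropy(2)),
                            (4, complete_graph_entropy(4))])
```

Note the weighting by vertex counts: the larger component K_4 pulls the union's entropy toward log₂ 4 = 2.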


Example

Here, we use properties of graph entropy to give a simple proof that a complete graph G on n vertices cannot be expressed as the union of fewer than \log_2 n bipartite graphs.

''Proof.'' By monotonicity, no bipartite graph has entropy greater than that of a complete bipartite graph, which is at most 1. Thus, by subadditivity, the union of k bipartite graphs has entropy at most k. Now let G = (V, E) be the complete graph on n vertices. By the properties listed above, H(G) = \log_2 n. Therefore, the union of fewer than \log_2 n bipartite graphs has entropy strictly less than \log_2 n, so G cannot be expressed as such a union. \blacksquare
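The bound is tight: \lceil \log_2 n \rceil bipartite graphs suffice, by assigning each vertex a binary label and, for each bit position, taking the bipartite graph that separates vertices whose bit differs. A sketch of this standard construction, with a check that the union covers every edge of K_n:

```python
import itertools

def bipartite_cover(n):
    """Cover the edges of K_n with ceil(log2 n) bipartite graphs.
    Graph j joins vertices whose binary labels differ in bit j."""
    k = max(1, (n - 1).bit_length())  # ceil(log2 n) for n >= 2
    covers = []
    for j in range(k):
        left = {v for v in range(n) if (v >> j) & 1 == 0}
        right = set(range(n)) - left
        # all edges between the two sides: a complete bipartite graph
        covers.append({frozenset((u, v)) for u in left for v in right})
    return covers

# Any two distinct vertices differ in some bit, so every edge is covered.
n = 10
covered = set().union(*bipartite_cover(n))
all_edges = {frozenset(e) for e in itertools.combinations(range(n), 2)}
```

For n = 10 this uses 4 = ⌈log₂ 10⌉ bipartite graphs, matching the entropy lower bound up to rounding.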


