Log-rank Conjecture
In theoretical computer science, the log-rank conjecture states that the deterministic communication complexity of a two-party Boolean function is polynomially related to the logarithm of the rank of its input matrix. Let D(f) denote the deterministic communication complexity of a function f, and let \operatorname{rank}(f) denote the rank of its input matrix M_f (over the reals). Since every protocol using up to c bits partitions M_f into at most 2^c monochromatic rectangles, and each of these has rank at most 1,
:D(f) \geq \log_2 \operatorname{rank}(f).
The log-rank conjecture states that D(f) is also ''upper-bounded'' by a polynomial in the log-rank: for some constant C,
:D(f) = O((\log \operatorname{rank}(f))^C).
The best known upper bound, due to ...
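As a small illustration (not part of the original article), the following Python sketch builds the communication matrix of the two-bit equality function and compares the log-rank lower bound with the cost of the trivial protocol. The choice of EQ, the function names, and the use of NumPy are illustrative assumptions, not the article's own construction.

```python
# Toy illustration of the log-rank lower bound: build the communication matrix
# M_f of the n-bit equality function EQ(x, y) = 1 iff x == y, and compare
# log2(rank) with the trivial protocol cost. Names here are illustrative.
import math
import numpy as np

n = 2  # input length in bits

def eq(x: int, y: int) -> int:
    """Equality function on n-bit inputs, viewed as integers 0..2^n - 1."""
    return 1 if x == y else 0

# Communication matrix: rows indexed by Alice's input x, columns by Bob's y.
M = np.array([[eq(x, y) for y in range(2 ** n)] for x in range(2 ** n)])

rank = np.linalg.matrix_rank(M)   # rank over the reals; M is the identity, so rank = 2^n
log_rank = math.log2(rank)        # lower bound on D(EQ)
trivial_cost = n                  # Bob sends his whole n-bit input to Alice

print(f"rank(M_EQ) = {rank}, log2 rank = {log_rank}, trivial cost = {trivial_cost} bits")
# Here log2 rank = n, so for equality the trivial protocol is essentially optimal.
```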
Theoretical Computer Science
Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on mathematical aspects of computer science such as the theory of computation, lambda calculus, and type theory. It is difficult to circumscribe the theoretical areas precisely. The Association for Computing Machinery's Special Interest Group on Algorithms and Computation Theory (SIGACT) provides a description of the field. History: While logical inference and mathematical proof had existed previously, in 1931 Kurt Gödel proved with his incompleteness theorem that there are fundamental limitations on what statements could be proved or disproved. Information theory was added to the field with Claude Shannon's 1948 ''A Mathematical Theory of Communication''. In the same decade, Donald Hebb introduced a mathematical model of learning in the brain (Hebbian learning). With mounting biological data supporting this hypothesis with some modification, the fields of n ...
Communication Complexity
In theoretical computer science, communication complexity studies the amount of communication required to solve a problem when the input to the problem is distributed among two or more parties. The study of communication complexity was first introduced by Andrew Yao in 1979, while studying the problem of computation distributed among several machines. The problem is usually stated as follows: two parties (traditionally called Alice and Bob) each receive a (potentially different) n-bit string, x and y respectively. The ''goal'' is for Alice to compute the value of a certain function, f(x, y), that depends on both x and y, with the least amount of communication between them. While Alice and Bob can always succeed by having Bob send his whole n-bit string to Alice (who then computes the function f), the idea here is to find clever ways of calculating ''f'' with fewer than n bits of communication. Note that, unlike in computational complexity theory, communication complexity is not concerned with th ...
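A minimal sketch (not from the original article) of the trivial protocol just described: Bob transmits his whole n-bit input, and Alice evaluates f locally. The example function `inner_product_mod2` and the way the transcript is modeled are illustrative assumptions.

```python
# Trivial protocol sketch: Bob sends all of y to Alice, who computes f(x, y).
from typing import Callable, List, Tuple

Bit = int  # 0 or 1

def trivial_protocol(f: Callable[[List[Bit], List[Bit]], Bit],
                     x: List[Bit], y: List[Bit]) -> Tuple[Bit, int]:
    """Run the trivial protocol; return (f(x, y), number of bits communicated)."""
    transcript = list(y)        # Bob transmits all n bits of y
    result = f(x, transcript)   # Alice computes f from her own input and the transcript
    return result, len(transcript)

def inner_product_mod2(x: List[Bit], y: List[Bit]) -> Bit:
    """Illustrative function: inner product of x and y modulo 2."""
    return sum(a * b for a, b in zip(x, y)) % 2

value, cost = trivial_protocol(inner_product_mod2, [1, 0, 1, 1], [1, 1, 0, 1])
print(value, cost)  # cost equals n = 4; the interesting question is doing better
```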
Boolean Function
In mathematics, a Boolean function is a function whose arguments and result assume values from a two-element set (usually \{0,1\} or \{\text{true},\text{false}\}). Alternative names are switching function, used especially in older computer science literature, and truth function (or logical function), used in logic. Boolean functions are the subject of Boolean algebra and switching theory. A Boolean function takes the form f:\{0,1\}^k \to \{0,1\}, where \{0,1\} is known as the Boolean domain and k is a non-negative integer called the arity of the function. In the case where k=0, the function is a constant element of \{0,1\}. A Boolean function with multiple outputs, f:\{0,1\}^k \to \{0,1\}^m with m>1, is a ''vectorial'' or ''vector-valued'' Boolean function (an S-box in symmetric cryptography). There are 2^{2^k} different Boolean functions with k arguments, equal to the number of different truth tables with 2^k entries. Every k-ary Boolean function can be expressed as a propositional formula in k variables x_1,...,x_k, and two propositional formul ...
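As a quick check of the 2^{2^k} count (an illustrative addition, not part of the original article), one can enumerate every truth table on k inputs for small k:

```python
# Count k-ary Boolean functions by enumerating their truth tables.
from itertools import product

def all_boolean_functions(k: int):
    """Yield each k-ary Boolean function as a truth table (tuple of 2^k output bits)."""
    return product((0, 1), repeat=2 ** k)

for k in range(4):
    count = sum(1 for _ in all_boolean_functions(k))
    assert count == 2 ** (2 ** k)   # matches the closed-form count
    print(f"k = {k}: {count} Boolean functions")
# k = 0: 2, k = 1: 4, k = 2: 16, k = 3: 256
```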
Rank (linear algebra)
In linear algebra, the rank of a matrix A is the dimension of the vector space generated (or spanned) by its columns. This corresponds to the maximal number of linearly independent columns of A. This, in turn, is identical to the dimension of the vector space spanned by its rows. Rank is thus a measure of the "nondegenerateness" of the system of linear equations and linear transformation encoded by A. There are multiple equivalent definitions of rank. A matrix's rank is one of its most fundamental characteristics. The rank is commonly denoted by \operatorname{rank}(A) or \operatorname{rk}(A); sometimes the parentheses are not written, as in \operatorname{rank} A. An alternative notation is \rho(A). Main definitions: In this section, we give some definitions of the rank of a matrix. Many definitions are possible; see Alternative definitions for several of these. The column rank of A is the dimension (linear alg ...
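A small numerical check (added here for illustration, not part of the original article) that column rank equals row rank, using NumPy's matrix_rank on an arbitrary example matrix:

```python
# Column rank vs. row rank on a 3x3 example with one dependent row.
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],    # = 2 * first row, so it adds no new dimension
              [1, 0, 1]])

col_rank = np.linalg.matrix_rank(A)      # rank of A (column space dimension)
row_rank = np.linalg.matrix_rank(A.T)    # rank of the transpose = row rank of A
print(col_rank, row_rank)                # both are 2
assert col_rank == row_rank
```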
Communication
Communication (from Latin ''communicare'', meaning "to share" or "to be in relation with") is usually defined as the transmission of information. The term may also refer to the message communicated through such transmissions or the field of inquiry studying them. There are many disagreements about its precise definition. John Peters argues that the difficulty of defining communication emerges from the fact that communication is both a universal phenomenon and a specific discipline of institutional academic study. One definitional strategy involves limiting what can be included in the category of communication (for example, requiring a "conscious intent" to persuade). By this logic, one possible definition of communication is the act of developing meaning among entities or groups through the use of sufficiently mutually understood signs, symbols, and semiotic conventions. An important distinction is between verbal communication, which happens through the use of a language, and ...
Computational Complexity Theory
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and quantifying their computational complexity, i.e., the amount of resources needed to solve them, such as time and storage. Other measures of complexity are also used, such as the amount of communication (used in communication complexity), the number of gates in a circuit (used in circuit complexity) and the number of processors (used in parallel computing). One of the roles of compu ...
Conjectures
In mathematics, a conjecture is a conclusion or a proposition that is proffered on a tentative basis without proof. Some conjectures, such as the Riemann hypothesis (still a conjecture) or Fermat's Last Theorem (a conjecture until proven in 1995 by Andrew Wiles), have shaped much of mathematical history as new areas of mathematics are developed in order to prove them. Important examples: Fermat's Last Theorem. In number theory, Fermat's Last Theorem (sometimes called Fermat's conjecture, especially in older texts) states that no three positive integers ''a'', ''b'', and ''c'' can satisfy the equation ''a^n + b^n = c^n'' for any integer value of ''n'' greater than two. This theorem was first conjectured by Pierre de Fermat in 1637 in the margin of a copy of ''Arithmetica'', where he claimed that he had a proof that was too large to fit in the margin. The first successful proof was released in 1994 by Andrew Wiles, and formally published in 1995, after 358 years of effort by mat ...
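A toy brute-force search (added for illustration; of course not a proof, and not part of the original article) over a small range, consistent with the statement of Fermat's Last Theorem. The limit and helper name are arbitrary choices.

```python
# Search for positive integers a <= b <= c <= LIMIT with a^n + b^n = c^n
# for a few exponents n > 2; no solutions are found in this range.
LIMIT = 50

def counterexamples(n: int, limit: int = LIMIT):
    """Return all (a, b, c) with a <= b <= c <= limit and a**n + b**n == c**n."""
    nth_powers = {c ** n: c for c in range(1, limit + 1)}  # map c^n -> c
    hits = []
    for a in range(1, limit + 1):
        for b in range(a, limit + 1):
            total = a ** n + b ** n
            if total in nth_powers:
                hits.append((a, b, nth_powers[total]))
    return hits

for n in (3, 4, 5):
    print(n, counterexamples(n))   # each list is empty, consistent with the theorem
```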