




Hartley Function
The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If a sample is picked uniformly at random from a finite set ''A'', the information revealed once the outcome is known is given by the Hartley function :H_0(A) := \log_b \vert A \vert, where \vert A \vert denotes the cardinality of ''A''. If the base of the logarithm is 2, then the unit of uncertainty is the shannon (more commonly known as the bit). If it is the natural logarithm, then the unit is the nat. Hartley used a base-ten logarithm, and with this base, the unit of information is called the hartley (also known as the ban or dit) in his honor. It is also known as the Hartley entropy or max-entropy. Hartley function, Shannon entropy, and Rényi entropy: The Hartley function coincides with the Shannon entropy (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is the special case of the Rényi entropy of order zero, since :H_0(X) = \frac{1}{1-0} \log \sum_{i=1}^{\vert X \vert} p_i^0 = \log \vert X \vert ...
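As a quick illustration of the order-0 special case above, here is a minimal Python sketch (not part of the source article; the function names are my own) that computes the Hartley function of a finite set and checks that it matches the Shannon entropy of the corresponding uniform distribution.

```python
import math

def hartley(cardinality, base=2.0):
    """Hartley function H_0 = log_b |A| of a finite set with |A| elements."""
    return math.log(cardinality, base)

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum p_i log_b p_i (zero-probability terms contribute 0)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A set with 8 equally likely outcomes: H_0 = log2(8) = 3 shannons,
# and the Shannon entropy of the uniform distribution agrees.
n = 8
print(hartley(n))                    # 3.0
print(shannon_entropy([1 / n] * n))  # 3.0 (up to floating-point rounding)
```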



Uncertainty
Uncertainty or incertitude refers to situations involving imperfect or unknown information. It applies to predictions of future events, to physical measurements that have already been made, or to the unknown, and is particularly relevant for decision-making. Uncertainty arises in partially observable or stochastic environments, as well as from ignorance, indolence, or both. It arises in any number of fields, including insurance, philosophy, physics, statistics, economics, finance, medicine, psychology, sociology, engineering, metrology, meteorology, ecology and information science. Concepts: Although the terms are used in various ways among the general public, many specialists in decision theory, statistics and other quantitative fields have defined uncertainty, risk, and their measurement as: Uncertainty: the lack of certainty, a state of limited knowledge where it is impossible to exactly describe the existing state, a future outcome, or more than one possible outcome. ...


Ralph Hartley
Ralph Vinton Lyon Hartley (November 30, 1888 – May 1, 1970) was an American electronics researcher. He invented the Hartley oscillator and the Hartley transform, and contributed to the foundations of information theory. His legacy includes the naming of the hartley, a unit of information equal to one decimal digit, after him. Biography: Hartley was born in Sprucemont, Nevada, and attended the University of Utah, receiving an A.B. degree in 1909. He became a Rhodes Scholar at St John's, Oxford University, in 1910 and received a B.A. degree in 1912 and a B.Sc. degree in 1913. He married Florence Vail of Brooklyn on March 21, 1916. The Hartleys had no children. He returned to the United States and was employed at the Research Laboratory of the Western Electric Company. In 1915 he was in charge of radio receiver development for the Bell System transatlantic radiotelephone tests. For this he developed the Hartley oscillator and also a neutralizing circuit to eliminate triode ...



Cardinality
In mathematics, the cardinality of a set is a measure of the number of elements of the set. For example, the set A = {2, 4, 6} contains 3 elements, and therefore has a cardinality of 3. For a finite set the cardinality is simply the count of its elements; the notion extends to infinite sets, two sets having the same cardinality exactly when their elements can be put in one-to-one correspondence (a bijection). The cardinality of a set ''A'' is commonly denoted \vert A \vert, with a vertical bar on each side; this is the quantity that appears in the Hartley function H_0(A) = \log_b \vert A \vert.



Base Of A Logarithm
In mathematics, the logarithm of a number is the exponent to which another fixed value, the base, must be raised to produce that number. For example, the logarithm of 1000 to base 10 is 3, because 1000 is 10 to the 3rd power: 1000 = 10 × 10 × 10 = 10^3. More generally, if x = b^y, then y is the logarithm of x to base b, written log_b x, so log_10 1000 = 3. As a single-variable function, the logarithm to base b is the inverse of exponentiation with base b. The logarithm base 10 is called the ''decimal'' or ''common'' logarithm and is commonly used in science and engineering. The ''natural'' logarithm has the number e ≈ 2.718 as its base; its use is widespread in mathematics and physics because of its very simple derivative. The ''binary'' logarithm uses base 2 and is widely used in computer science, information theory, music theory, and photography. When the base is unambiguous from the context or irrelevant it is often omitted, and the logarithm is written log x. Logarithms were introduced by John Napier in 1614 as a means of simplifying calculations. ...
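To make the inverse relationship concrete, here is a small Python sketch (mine, not part of the article) that evaluates a logarithm in an arbitrary base via the change-of-base identity and checks that exponentiation undoes it.

```python
import math

def log_base(x, base):
    """Logarithm of x to the given base, via the change-of-base identity."""
    return math.log(x) / math.log(base)

# log_10(1000) = 3, because 10**3 = 1000.
y = log_base(1000, 10)
print(y)        # 3 (up to floating-point rounding)
print(10 ** y)  # approximately 1000.0
```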


Shannon (unit)
The shannon (symbol: Sh) is a unit of information named after Claude Shannon, the founder of information theory. IEC 80000-13 defines the shannon as the information content associated with an event when the probability of the event occurring is 1/2. It is understood as such within the realm of information theory, and is conceptually distinct from the bit, a term used in data processing and storage to denote a single instance of a binary signal. A sequence of ''n'' binary symbols (such as contained in computer memory or a binary data transmission) is properly described as consisting of ''n'' bits, but the information content of those ''n'' symbols may be more or less than ''n'' shannons depending on the ''a priori'' probability of the actual sequence of symbols. The shannon also serves as a unit of the information entropy of an event, which is defined as the expected value of the information content of the event (i.e., the probability-weighted average of the information content of ...
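The point that ''n'' binary symbols need not carry ''n'' shannons can be illustrated numerically. The following Python sketch is my own, under the assumption of a memoryless biased source with independent symbols; it computes the information content -log2(p) of a few events.

```python
import math

def info_content_sh(probability):
    """Information content of an event in shannons: -log2(p)."""
    return -math.log2(probability)

# A single fair coin flip (p = 1/2) carries exactly 1 shannon.
print(info_content_sh(0.5))      # 1.0

# Assume a biased source emitting '0' with probability 0.9 and '1' with 0.1,
# symbols independent. The 4-bit sequence '0000' then carries far less than
# 4 shannons, while '1111' carries far more.
p0, p1 = 0.9, 0.1
print(info_content_sh(p0 ** 4))  # ~0.61 Sh for '0000'
print(info_content_sh(p1 ** 4))  # ~13.29 Sh for '1111'
```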



Natural Logarithm
The natural logarithm of a number is its logarithm to the base of the mathematical constant ''e'', an irrational and transcendental number approximately equal to 2.71828. The natural logarithm of ''x'' is generally written as ln ''x'', log_e ''x'', or sometimes, if the base is implicit, simply log ''x''. Parentheses are sometimes added for clarity, giving ln(''x''), log_e(''x''), or log(''x''). This is done particularly when the argument to the logarithm is not a single symbol, so as to prevent ambiguity. The natural logarithm of ''x'' is the power to which ''e'' would have to be raised to equal ''x''. The natural logarithm of ''e'' itself, ln ''e'', is 1, because e^1 = e, while the natural logarithm of 1 is 0, since e^0 = 1. The natural logarithm can be defined for any positive real number ''a'' as the area under the curve y = 1/x from 1 to ''a'' (with the area being negative when 0 < a < 1). The simplicity of this definition, which is matched in many other formulas ...
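As an illustration of the area definition mentioned above, here is a small Python sketch (my own) that approximates the signed area under y = 1/x from 1 to ''a'' with a simple midpoint rule and compares it with math.log.

```python
import math

def ln_by_area(a, steps=100_000):
    """Approximate ln(a) as the signed area under y = 1/x from 1 to a (midpoint rule)."""
    width = (a - 1.0) / steps
    return sum(width / (1.0 + (i + 0.5) * width) for i in range(steps))

for a in (2.0, 10.0, 0.5):
    print(a, ln_by_area(a), math.log(a))
# The approximation matches math.log closely, and is negative for 0 < a < 1.
```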


Nat (unit)
The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of ''e'' rather than the powers of 2 and base-2 logarithms that define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/''e''. One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart. History: Boulton and Chris Wallace used the term ''nit'' in conjunction with minimum message length, which was subsequently changed by the minimum description length community to ''nat'' to avoid confusion with the nit used as a unit of luminance. Alan Turing used the ''natural ban''. ...
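A quick Python check of the stated conversion factors (my own computation, not from the article):

```python
import math

# 1 nat expressed in shannons and hartleys.
nat_in_sh = 1 / math.log(2)     # = log2(e)
nat_in_hart = 1 / math.log(10)  # = log10(e)
print(nat_in_sh)    # ~1.4427 Sh
print(nat_in_hart)  # ~0.4343 Hart
```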



Base-ten Logarithm
In mathematics, the common logarithm (also known as the "standard logarithm") is the logarithm with base 10. It is also known as the decadic logarithm, the decimal logarithm and the Briggsian logarithm. The name "Briggsian logarithm" is in honor of the British mathematician Henry Briggs, who conceived of and developed the values for the "common logarithm". Historically, the "common logarithm" was known by its Latin name ''logarithmus decimalis'' or ''logarithmus decadis''. The mathematical notation for the common logarithm is log_10(x), log(x), or sometimes Log(x) with a capital L; on calculators, it is printed as "log", but mathematicians usually mean the natural logarithm (logarithm with base e ≈ 2.71828) rather than the common logarithm when writing "log". Before the early 1970s, handheld electronic calculators were not available, and mechanical calculators capable of multiplication were bulky, expensive and not widely available. Instead, tables of base-10 logarithms were used in science, engineering and navig ...
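The historical use of base-10 log tables rested on the identity log10(x·y) = log10(x) + log10(y): multiplication becomes addition of logarithms. A tiny Python sketch (mine) illustrating that trick:

```python
import math

x, y = 375.0, 2.4
# Multiply by adding common logarithms, then exponentiating back:
product = 10 ** (math.log10(x) + math.log10(y))
print(product)  # ~900.0, equal to x * y up to rounding
print(x * y)    # 900.0
```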




Hartley (unit)
The hartley (symbol Hart), also called a ban, or a dit (short for "decimal digit"), is a logarithmic unit that measures information or entropy, based on base-10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10. It is therefore equal to the information contained in one decimal digit (or dit), assuming ''a priori'' equiprobability of each possible value. It is named after Ralph Hartley. If base-2 logarithms and powers of 2 are used instead, then the unit of information is the shannon or bit, which is the information content of an event if the probability of that event occurring is 1/2. Natural logarithms and powers of ''e'' define the nat. One ban corresponds to ln(10) nat = log2(10) Sh, or approximately 2.303 nat, or 3.322 bit (3.322 Sh). A deciban is one tenth of a ban (or about 0.332 Sh); the name is formed from ''ban'' by the SI prefix ''deci-''. Though there is no associated SI unit, information ...
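A short Python check (my own) of the conversion factors quoted above for the ban and the deciban:

```python
import math

ban_in_nat = math.log(10)       # ~2.303 nat
ban_in_sh = math.log2(10)       # ~3.322 Sh
deciban_in_sh = ban_in_sh / 10  # ~0.332 Sh
print(ban_in_nat, ban_in_sh, deciban_in_sh)
```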


Shannon Entropy
In information theory, the Shannon entropy of a random variable quantifies the average amount of information, surprise, or uncertainty associated with the variable's possible outcomes. For a discrete random variable ''X'' taking values x_1, ..., x_n with probabilities p_1, ..., p_n, the entropy is H(X) = -\sum_{i=1}^{n} p_i \log_b p_i, where the base ''b'' of the logarithm determines the unit (shannons for base 2, nats for base ''e'', hartleys for base 10). The entropy is largest for the uniform distribution, in which case it equals the Hartley function \log_b n. The concept was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".
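For concreteness, a small Python sketch (mine) evaluating the entropy of a biased coin in the three units mentioned above:

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy H = -sum p_i log_b p_i; zero-probability terms contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

coin = [0.9, 0.1]  # a biased coin
print(entropy(coin, 2))       # ~0.469 shannons
print(entropy(coin, math.e))  # ~0.325 nats
print(entropy(coin, 10))      # ~0.141 hartleys
```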


Rényi Entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including the Hartley entropy, the Shannon entropy, the collision entropy, and the min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is important in ecology and statistics as an index of diversity. The Rényi entropy is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of its order α can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in ...
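The following Python sketch (my own) implements the Rényi entropy H_α(X) = (1/(1-α)) log2 Σ p_i^α and shows the named special cases: α = 0 gives the Hartley (max-)entropy, α → 1 the Shannon entropy, α = 2 the collision entropy, and α → ∞ the min-entropy.

```python
import math

def renyi_entropy(probs, alpha, base=2.0):
    """Rényi entropy of order alpha (unit determined by the log base)."""
    probs = [p for p in probs if p > 0]
    if alpha == 1.0:              # limit alpha -> 1: Shannon entropy
        return -sum(p * math.log(p, base) for p in probs)
    if math.isinf(alpha):         # limit alpha -> infinity: min-entropy
        return -math.log(max(probs), base)
    return math.log(sum(p ** alpha for p in probs), base) / (1.0 - alpha)

dist = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(dist, 0))         # 2.0    (Hartley / max-entropy: log2 of 4 outcomes)
print(renyi_entropy(dist, 1))         # 1.75   (Shannon entropy)
print(renyi_entropy(dist, 2))         # ~1.54  (collision entropy)
print(renyi_entropy(dist, math.inf))  # 1.0    (min-entropy, -log2 0.5)
```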


Min-entropy
The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the ''most likely'' outcome. The various Rényi entropies are all equal for a uniform distribution, but measure the unpredictability of a nonuniform distribution in different ways. The min-entropy is never greater than the ordinary or Shannon entropy (which measures the average unpredictability of the outcomes) and that in turn is never greater than the Hartley or max-entropy, defined as the logarithm of the ''number'' of outcomes with nonzero probability. As with the classical Shannon entropy and its quantum generalization, the von Neumann entropy, one can define a conditional version of min-entropy. The conditional quantum min-entropy is a one-shot, or conservative, analog of conditional quantum entropy. To interpret a conditional informat ...
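To illustrate the ordering described above, here is a brief Python check (my own) that, for an example nonuniform distribution, the min-entropy never exceeds the Shannon entropy, which in turn never exceeds the Hartley (max-)entropy; all three coincide for the uniform case.

```python
import math

def min_entropy(probs):
    """Min-entropy in shannons: -log2 of the most likely outcome's probability."""
    return -math.log2(max(probs))

def shannon(probs):
    """Shannon entropy in shannons."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def max_entropy(probs):
    """Hartley (max-)entropy: log2 of the number of outcomes with nonzero probability."""
    return math.log2(sum(1 for p in probs if p > 0))

skewed = [0.7, 0.2, 0.05, 0.05]
uniform = [0.25] * 4
print(min_entropy(skewed), shannon(skewed), max_entropy(skewed))    # ~0.51 <= ~1.26 <= 2.0
print(min_entropy(uniform), shannon(uniform), max_entropy(uniform)) # all 2.0
```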