Lookup3
The Jenkins hash functions are a family of non-cryptographic hash functions for multi-byte keys designed by Bob Jenkins. The first one was formally published in 1997.

The hash functions

one_at_a_time

Jenkins's one_at_a_time hash is adapted here from a WWW page by Bob Jenkins, which is an expanded version of his ''Dr. Dobb's'' article. It was originally created to fulfill certain requirements described by Colin Plumb, a cryptographer, but was ultimately not put to use.

uint32_t jenkins_one_at_a_time_hash(const uint8_t* key, size_t length)

Sample hash values for the one_at_a_time hash function:

    one_at_a_time("a", 1)                                             0xca2e9442
    one_at_a_time("The quick brown fox jumps over the lazy dog", 43)  0x519e91f5

The avalanche behavior of this hash was illustrated by a diagram that is not reproduced here: each of its 24 rows corresponds to a single bit in the 3-byte input key, and each of its 32 columns corresponds to a bit in the output hash. Colors are chosen by how well the input key bit affects the given output hash bit.
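The extraction preserved only the signature of jenkins_one_at_a_time_hash above; the body below is the widely circulated formulation of the hash (a reconstruction, worth checking against Jenkins's original article):

    #include <stdint.h>
    #include <stddef.h>

    uint32_t jenkins_one_at_a_time_hash(const uint8_t* key, size_t length)
    {
        size_t i = 0;
        uint32_t hash = 0;
        while (i != length) {
            hash += key[i++];
            hash += hash << 10;
            hash ^= hash >> 6;
        }
        /* Final avalanche: mix the accumulated state so that every
           input bit can affect every output bit. */
        hash += hash << 3;
        hash ^= hash >> 11;
        hash += hash << 15;
        return hash;
    }

Hashing the one-byte key "a" with this function should reproduce the sample value 0xca2e9442 given in the table above, which serves as a quick sanity check of the reconstruction.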
Non-cryptographic Hash Function
Non-cryptographic hash functions (NCHFs) are hash functions intended for applications that do not need the rigorous security requirements of cryptographic hash functions (e.g., preimage resistance) and can therefore be faster and less resource-intensive. Typical examples of CPU-optimized non-cryptographic hashes include FNV-1a and Murmur3. Some non-cryptographic hash functions are used in cryptographic applications (usually in combination with other cryptographic primitives); in this case they are described as universal hash functions.

Applications and requirements

Among the typical uses of non-cryptographic hash functions are Bloom filters, hash tables, and count sketches. These applications require, in addition to speed, a uniform distribution of outputs and good avalanche properties. Collision resistance is an additional feature that can be useful against hash flooding attacks; simple NCHFs, like the cyclic redundancy check (CRC), have essentially none.
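To illustrate how simple a CPU-optimized NCHF can be, here is a minimal sketch of the 32-bit FNV-1a hash mentioned above, using the published 32-bit FNV constants (the function name is ours):

    #include <stdint.h>
    #include <stddef.h>

    /* 32-bit FNV-1a: XOR each byte into the state, then multiply by
       the FNV prime. Offset basis 2166136261 and prime 16777619 are
       the standard 32-bit FNV parameters. */
    uint32_t fnv1a_32(const uint8_t* data, size_t len)
    {
        uint32_t hash = 2166136261u;
        for (size_t i = 0; i < len; i++) {
            hash ^= data[i];
            hash *= 16777619u;
        }
        return hash;
    }

The entire function is one XOR and one multiply per byte, which is why hashes of this family are attractive for hash tables and similar speed-sensitive uses.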
Hash Flooding
In cryptography, a collision attack on a cryptographic hash tries to find two inputs producing the same hash value, i.e. a hash collision. This is in contrast to a preimage attack, where a specific target hash value is specified. There are roughly two types of collision attacks:

Classical collision attack: find two different messages m1 and m2 such that hash(m1) = hash(m2).

More generally:

Chosen-prefix collision attack: given two different prefixes p1 and p2, find two suffixes s1 and s2 such that hash(p1 ∥ s1) = hash(p2 ∥ s2), where ∥ denotes the concatenation operation.

Classical collision attack

Much like symmetric-key ciphers are vulnerable to brute-force attacks, every cryptographic hash function is inherently vulnerable to collisions using a birthday attack. Due to the birthday problem, these attacks are much faster than brute force would be. A hash of n bits can be broken in 2^(n/2) time steps (evaluations of the hash function) …
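The birthday effect can be demonstrated on a deliberately tiny hash: truncating a hash to 16 bits, a collision is expected after roughly 2^8 = 256 random inputs rather than 2^16. The toy hash and driver below are our own illustration of the principle, not a published attack:

    #include <stdint.h>
    #include <stdio.h>

    /* Toy 16-bit hash: 32-bit FNV-1a truncated to its low 16 bits. */
    static uint16_t toy_hash(uint32_t x)
    {
        uint32_t h = 2166136261u;
        for (int i = 0; i < 4; i++) {
            h ^= (x >> (8 * i)) & 0xff;
            h *= 16777619u;
        }
        return (uint16_t)(h & 0xffff);
    }

    int main(void)
    {
        /* seen[h] holds 1 + the first input that hashed to h
           (0 = slot empty; static arrays are zero-initialized). */
        static uint32_t seen[1 << 16];

        for (uint32_t x = 0; ; x++) {
            uint16_t h = toy_hash(x);
            if (seen[h]) {
                printf("collision after %u inputs: %u and %u -> 0x%04x\n",
                       x + 1, seen[h] - 1, x, h);
                return 0;
            }
            seen[h] = x + 1;
        }
    }

Scaling this strategy to an n-bit hash requires on the order of 2^(n/2) evaluations and comparable storage, which is where the 2^(n/2) cost of the generic collision attack comes from.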
Fletcher's Checksum
The Fletcher checksum is an algorithm for computing a position-dependent checksum devised by John G. Fletcher (1934–2012) at Lawrence Livermore Labs in the late 1970s. The objective of the Fletcher checksum was to provide error-detection properties approaching those of a cyclic redundancy check but with the lower computational effort associated with summation techniques.

The algorithm

Review of simple checksums

As with simpler checksum algorithms, the Fletcher checksum involves dividing the binary data word to be protected from errors into short "blocks" of bits and computing the modular sum of those blocks. (Note that the terminology used in this domain can be confusing. The data to be protected, in its entirety, is referred to as a "word", and the pieces into which it is divided are referred to as "blocks".) As an example, the data may be a message to be transmitted consisting of 136 characters, each stored as an 8-bit byte, making a data word of 1088 bits in total. …
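A minimal sketch of the idea, assuming the common Fletcher-16 parameterization (8-bit blocks, two running sums modulo 255; the function name is ours):

    #include <stdint.h>
    #include <stddef.h>

    /* Fletcher-16: sum1 is a plain modular sum of the 8-bit blocks;
       sum2 sums the successive values of sum1, which is what makes
       the checksum position-dependent. */
    uint16_t fletcher16(const uint8_t* data, size_t len)
    {
        uint16_t sum1 = 0, sum2 = 0;
        for (size_t i = 0; i < len; i++) {
            sum1 = (sum1 + data[i]) % 255;
            sum2 = (sum2 + sum1) % 255;
        }
        return (uint16_t)((sum2 << 8) | sum1);
    }

A simple sum alone (sum1) cannot detect blocks being swapped; because sum2 effectively weights each block by how early it appears, the combined checksum catches such reorderings, approaching CRC-like error detection at summation cost.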
Computation Of Cyclic Redundancy Checks
Computation of a cyclic redundancy check is derived from the mathematics of polynomial division, modulo two. In practice, it resembles long division of the binary message string, with a fixed number of zeroes appended, by the "generator polynomial" string, except that exclusive-OR operations replace subtractions. Division of this type is efficiently realised in hardware by a modified shift register, and in software by a series of equivalent algorithms, starting with simple code close to the mathematics and becoming faster (and arguably more obfuscated) through byte-wise parallelism and space–time tradeoffs. Various CRC standards extend the polynomial division algorithm by specifying an initial shift register value, a final exclusive-OR step and, most critically, a bit ordering (endianness). As a result, the code seen in practice deviates confusingly from "pure" division, and the register may shift left or right.

Example

As an example of implementing polynomial division …
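As a concrete instance of "simple code close to the mathematics", here is a sketch of a bit-at-a-time CRC-32 in the common reflected (right-shifting) form, using the widely deployed parameterization (polynomial 0xEDB88320, initial register value and final XOR of 0xFFFFFFFF):

    #include <stdint.h>
    #include <stddef.h>

    /* Bitwise CRC-32, reflected form. Each input byte is XORed into
       the low end of the register; the conditional XOR with the
       generator polynomial plays the role of the subtraction step in
       long division modulo two. */
    uint32_t crc32_bitwise(const uint8_t* data, size_t len)
    {
        uint32_t crc = 0xFFFFFFFFu;          /* initial register value */
        for (size_t i = 0; i < len; i++) {
            crc ^= data[i];
            for (int bit = 0; bit < 8; bit++) {
                if (crc & 1)
                    crc = (crc >> 1) ^ 0xEDB88320u; /* reflected polynomial */
                else
                    crc >>= 1;
            }
        }
        return crc ^ 0xFFFFFFFFu;            /* final exclusive-OR step */
    }

With these parameters the standard check value applies: hashing the nine ASCII bytes "123456789" should yield 0xCBF43926, a conventional way to verify a CRC-32 implementation.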
Hierarchical Data Format
Hierarchical Data Format (HDF) is a set of file formats (HDF4, HDF5) designed to store and organize large amounts of data. Originally developed at the U.S. National Center for Supercomputing Applications, it is supported by The HDF Group, a non-profit corporation whose mission is to ensure continued development of HDF5 technologies and the continued accessibility of data stored in HDF. In keeping with this goal, the HDF libraries and associated tools are available under a liberal, BSD-like license for general use. HDF is supported by many commercial and non-commercial software platforms and programming languages. The freely available HDF distribution consists of the library, command-line utilities, test suite source, Java interface, and the Java-based HDF Viewer (HDFView). The current version, HDF5, differs significantly in design and API from the major legacy version HDF4.

Early history

The quest for a portable scientific data format, originally dubbed AEHOO (All Encompassing Hierarchical Object Oriented format) …
ICGA Journal
The ''ICGA Journal'' is a quarterly academic journal published by the International Computer Games Association. It was renamed in 2000; its previous name was the ''ICCA Journal'' of the International Computer Chess Association, which was founded in 1977. The journal covers computer analysis of two-player games, especially games with perfect information such as chess, checkers, and Go. It has been the primary outlet for publication of articles on solved games, including the development of endgame tablebases in chess and other games. For example, John W. Romein and Henri E. Bal reported in the journal in 2002 that they had solved Awari and, in 2015, David J. Wu reported his solution for the Arimaa Challenge (David J. Wu, "Designing a Winning Arimaa Program", ''ICGA Journal'' 38(1):19–40, 2015, doi:10.3233/ICG-2015-38104, https://icosahedral.net/downloads/djwu2015arimaa.pdf). From 1983 till 2015, the ''ICGA Journal'' …
Zobrist Hashing
Zobrist hashing (also referred to as Zobrist keys or Zobrist signatures) is a hash function construction used in computer programs that play abstract board games, such as chess and Go, to implement transposition tables, a special kind of hash table that is indexed by a board position and used to avoid analyzing the same position more than once. Zobrist hashing is named for its inventor, Albert Lindsey Zobrist. It has also been applied as a method for recognizing substitutional alloy configurations in simulations of crystalline materials. Zobrist hashing is the first known instance of the generally useful underlying technique called tabulation hashing.

Calculation of the hash value

Zobrist hashing starts by randomly generating bitstrings for each possible element of a board game, i.e. for each combination of a piece and a position (in the game of chess, that's 6 pieces × 2 colors × 64 board positions, with a constant number of additional bitstrings for castling rights …
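A minimal sketch of the construction for a chess-sized board (the names and the choice of PRNG are ours; any decent 64-bit random source works):

    #include <stdint.h>

    #define PIECE_KINDS 12   /* 6 piece types x 2 colors */
    #define SQUARES     64

    static uint64_t zobrist_table[PIECE_KINDS][SQUARES];

    /* Any reasonable 64-bit PRNG will do; splitmix64 shown here. */
    static uint64_t rng_state = 0x9E3779B97F4A7C15u;
    static uint64_t next_random(void)
    {
        uint64_t z = (rng_state += 0x9E3779B97F4A7C15u);
        z = (z ^ (z >> 30)) * 0xBF58476D1CE4E5B9u;
        z = (z ^ (z >> 27)) * 0x94D049BB133111EBu;
        return z ^ (z >> 31);
    }

    void zobrist_init(void)
    {
        for (int p = 0; p < PIECE_KINDS; p++)
            for (int s = 0; s < SQUARES; s++)
                zobrist_table[p][s] = next_random();
    }

    /* Full hash: XOR together the bitstrings of all occupied squares.
       board[s] is a piece index 0..11, or -1 for an empty square. */
    uint64_t zobrist_hash(const int board[SQUARES])
    {
        uint64_t h = 0;
        for (int s = 0; s < SQUARES; s++)
            if (board[s] >= 0)
                h ^= zobrist_table[board[s]][s];
        return h;
    }

    /* Incremental update: moving piece p from square a to square b
       only needs two XORs, not a full rehash of the position. */
    uint64_t zobrist_move(uint64_t h, int p, int a, int b)
    {
        return h ^ zobrist_table[p][a] ^ zobrist_table[p][b];
    }

The cheap incremental update is what makes the construction attractive for transposition tables: the hash can be maintained move by move during game-tree search.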
Kalah
Kalah is a modern variation in the ancient mancala family of games. The Kalah board was first patented and sold in the United States by William Julius Champion, Jr. in the 1950s. This game is sometimes also called "Kalahari", possibly by false etymology from the Kalahari Desert in Namibia. For most of its variations, Kalah is a solved game with a first-player win if both players play perfect games. The pie rule can be used to balance the first player's advantage.

Standard gameplay

The game provides a Kalah board and a number of ''seeds'' or counters. The board has 6 small pits, called houses, on each side, and a big pit, called an end zone or store, at each end. The object of the game is to capture more seeds than one's opponent.

1. At the beginning of the game, four seeds are placed in each house. This is the traditional method.
2. Each player controls the six houses and the seeds on their side of the board. The player's score is the number of seeds in the store to the right. …
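A small board-representation sketch of the setup and sowing mechanics described above; the counterclockwise direction and the skipping of the opponent's store follow standard Kalah play (an assumption beyond the excerpt), and the indices and names are ours:

    /* Pits 0..5: south's houses, 6: south's store;
       pits 7..12: north's houses, 13: north's store. */
    #define STORE_S 6
    #define STORE_N 13
    static int board[14];

    void kalah_init(void)
    {
        for (int i = 0; i < 14; i++)
            board[i] = (i == STORE_S || i == STORE_N) ? 0 : 4; /* 4 seeds per house */
    }

    /* Sow from `house`, dropping one seed per pit counterclockwise
       and skipping the opponent's store. Returns the last pit sown,
       which the full rules consult for captures and extra turns. */
    int kalah_sow(int house, int south_to_move)
    {
        int seeds = board[house];
        board[house] = 0;
        int pit = house;
        while (seeds > 0) {
            pit = (pit + 1) % 14;
            if (pit == (south_to_move ? STORE_N : STORE_S))
                continue;                 /* skip the opponent's store */
            board[pit]++;
            seeds--;
        }
        return pit;
    }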
Linux Kernel
The Linux kernel is a free and open source Unix-like kernel that is used in many computer systems worldwide. The kernel was created by Linus Torvalds in 1991 and was soon adopted as the kernel for the GNU operating system (OS), which was created to be a free replacement for Unix. Since the late 1990s, it has been included in many operating system distributions, many of which are called Linux. One such Linux kernel operating system is Android, which is used in many mobile and embedded devices. Most of the kernel code is written in C as supported by the GNU Compiler Collection (GCC), which has extensions beyond standard C. The code also contains assembly code for architecture-specific logic such as optimizing memory use and task execution. The kernel has a modular design such that modules can be integrated …
Byte
The byte is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer, and for this reason it is the smallest addressable unit of memory in many computer architectures. To disambiguate arbitrarily sized bytes from the common 8-bit definition, network protocol documents such as the Internet Protocol refer to an 8-bit byte as an octet. The bits in an octet are usually counted with numbering from 0 to 7 or 7 to 0 depending on the bit endianness.

The size of the byte has historically been hardware-dependent and no definitive standards existed that mandated the size. Sizes from 1 to 48 bits have been used. The six-bit character code was an often-used implementation in early encoding systems, and computers using six-bit and nine-bit bytes were common in the 1960s. These systems often had memory words of 12, 18, 24, 30, 36, 48, or 60 bits, corresponding to 2, 3, 4, 5, 6, 8, or 10 six-bit bytes. …
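A small illustration of the two bit-numbering conventions in an octet (the helper names are ours):

    #include <stdint.h>

    /* LSB-0 numbering: bit 0 is the least significant bit. */
    static int bit_lsb0(uint8_t octet, int i) { return (octet >> i) & 1; }

    /* MSB-0 numbering: bit 0 is the most significant bit, as in many
       big-endian protocol diagrams. */
    static int bit_msb0(uint8_t octet, int i) { return (octet >> (7 - i)) & 1; }

For the octet 0x01, for example, the set bit is at position 0 under LSB-0 numbering but at position 7 under MSB-0 numbering.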
Bloom Filter
In computing, a Bloom filter is a space-efficient probabilistic data structure, conceived by Burton Howard Bloom in 1970, that is used to test whether an element is a member of a set. False positive matches are possible, but false negatives are not; in other words, a query returns either "possibly in set" or "definitely not in set". Elements can be added to the set, but not removed (though this can be addressed with the counting Bloom filter variant); the more items added, the larger the probability of false positives.

Bloom proposed the technique for applications where the amount of source data would require an impractically large amount of memory if "conventional" error-free hashing techniques were applied. He gave the example of a hyphenation algorithm for a dictionary of 500,000 words, out of which 90% follow simple hyphenation rules, but the remaining 10% require expensive disk accesses to retrieve specific hyphenation patterns. With sufficient core memory, an error-free hash could be used to eliminate all unnecessary disk accesses …
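A minimal sketch, assuming a fixed-size bit array and the well-known double-hashing trick for deriving k indices from two base hashes (all names, sizes, and the choice of FNV-1a as the base hash are ours):

    #include <stdint.h>
    #include <stddef.h>

    #define FILTER_BITS (1u << 20)   /* 2^20 bits, i.e. 128 KiB */
    #define NUM_HASHES  7            /* k, tuned to the expected load */

    static uint8_t filter[FILTER_BITS / 8];

    /* FNV-1a with a caller-supplied offset basis, so two different
       bases give two (roughly) independent base hashes. */
    static uint32_t base_hash(const uint8_t* key, size_t len, uint32_t basis)
    {
        uint32_t h = basis;
        for (size_t i = 0; i < len; i++) {
            h ^= key[i];
            h *= 16777619u;
        }
        return h;
    }

    static void set_bit(uint32_t i) { filter[i >> 3] |= (uint8_t)(1u << (i & 7)); }
    static int  get_bit(uint32_t i) { return (filter[i >> 3] >> (i & 7)) & 1; }

    void bloom_add(const uint8_t* key, size_t len)
    {
        uint32_t h1 = base_hash(key, len, 2166136261u);
        uint32_t h2 = base_hash(key, len, 0x811C9DC4u) | 1u; /* force odd */
        for (uint32_t i = 0; i < NUM_HASHES; i++)
            set_bit((h1 + i * h2) % FILTER_BITS);  /* set k bits per element */
    }

    /* Returns 0 for "definitely not in set", 1 for "possibly in set". */
    int bloom_query(const uint8_t* key, size_t len)
    {
        uint32_t h1 = base_hash(key, len, 2166136261u);
        uint32_t h2 = base_hash(key, len, 0x811C9DC4u) | 1u;
        for (uint32_t i = 0; i < NUM_HASHES; i++)
            if (!get_bit((h1 + i * h2) % FILTER_BITS))
                return 0;   /* a required bit is clear: cannot be present */
        return 1;
    }

Because an element only ever sets bits and never clears them, a clear bit proves absence (no false negatives), while bits set by other elements can conspire to answer "possibly in set" for something never added (false positives), exactly the asymmetry described above.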