TestU01
TestU01 is a software library, implemented in the ANSI C language, that offers a collection of utilities for the empirical randomness testing of random number generators (RNGs). The library was first introduced in 2007 by Pierre L'Ecuyer and Richard Simard of the Université de Montréal (Pierre L'Ecuyer & Richard Simard (2007), "TestU01: A Software Library in ANSI C for Empirical Testing of Random Number Generators", ''ACM Transactions on Mathematical Software'').
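In practice, a generator is wrapped as a unif01_Gen object and passed to one of the library's predefined test batteries (SmallCrush, Crush, BigCrush). The following is a minimal sketch of that pattern, assuming TestU01 and its headers are installed; the xorshift generator is only an illustrative placeholder and is not part of the library.

    /* Minimal sketch: run TestU01's SmallCrush battery on a user-supplied
     * generator via the extern-generator interface.  The xorshift32 routine
     * below is only an illustrative placeholder. */
    #include "unif01.h"
    #include "bbattery.h"

    static unsigned int state = 2463534242u;

    /* Any function returning 32 "random" bits per call can be wrapped. */
    static unsigned int xorshift32 (void)
    {
        state ^= state << 13;
        state ^= state >> 17;
        state ^= state << 5;
        return state;
    }

    int main (void)
    {
        unif01_Gen *gen = unif01_CreateExternGenBits ("xorshift32", xorshift32);
        bbattery_SmallCrush (gen);       /* also: bbattery_Crush, bbattery_BigCrush */
        unif01_DeleteExternGenBits (gen);
        return 0;
    }

Depending on how the package was installed, such a program is typically linked against the testu01, probdist and mylib libraries that ship with TestU01.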

Randomness Tests
A randomness test (or test for randomness), in data evaluation, is a test used to analyze the distribution of a set of data to see whether it can be described as random (patternless). In stochastic modeling, as in some computer simulations, the hoped-for randomness of potential input data can be verified, by a formal test for randomness, to show that the data are valid for use in simulation runs. In some cases, data reveals an obvious non-random pattern, as with so-called "runs in the data" (such as expecting random 0–9 but finding "4 3 2 1 0 4 3 2 1..." and rarely going above 4). If a selected set of data fails the tests, then parameters can be changed or other randomized data can be used which does pass the tests for randomness. Background The issue of randomness is an important philosophical and theoretical question. Tests for randomness can be used to determine whether a data set has a recognisable pattern, which would indicate that the process that generated it is sig ...
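The "runs" idea mentioned above can be made concrete with a simple runs test: for a sequence of independent bits, the number of runs has a known expected value and variance, so the observed count can be converted into an approximately standard-normal statistic. The sketch below is a generic illustration of that calculation (in the style of the Wald–Wolfowitz test), not the procedure of any particular test suite, and the sample data are invented.

    /* Sketch of a runs test on a bit sequence: compare the observed number
     * of runs with the number expected for independent bits. */
    #include <math.h>
    #include <stdio.h>

    double runs_test_z (const int *bits, int n)
    {
        int i, n1 = 0, runs = 1;
        double n0, mu, var;

        for (i = 0; i < n; i++) {
            n1 += bits[i];
            if (i > 0 && bits[i] != bits[i - 1])
                runs++;                            /* a new run starts here */
        }
        n0  = n - n1;
        mu  = 2.0 * n1 * n0 / n + 1.0;             /* expected number of runs */
        var = (mu - 1.0) * (mu - 2.0) / (n - 1.0); /* variance of the run count */
        return (runs - mu) / sqrt (var);           /* approx. N(0,1) for large n */
    }

    int main (void)
    {
        int sample[] = { 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0 };
        printf ("z = %f\n", runs_test_z (sample, 12));
        return 0;
    }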


Diehard Tests
The diehard tests are a battery of statistical tests for measuring the quality of a random number generator (RNG). They were developed by George Marsaglia over several years and first published in 1995 on a CD-ROM of random numbers. In 2006, the original diehard tests were extended into the dieharder tests. Most of the tests in DIEHARD return a ''p''-value, which should be uniform on [0,1) if the input file contains truly independent random bits. Those ''p''-values are obtained by ''p'' = ''F''(''X''), where ''F'' is the assumed distribution of the sample random variable ''X'' – often normal. But that assumed ''F'' is just an asymptotic approximation, for which the fit will be worst in the tails. Thus you should not be surprised by occasional ''p''-values near 0 or 1, such as 0.0012 or 0.9983. When a bit stream really FAILS BIG, you will get ''p''s of 0 or 1 to six or more places. Since there are many tests, it is not unlikely that a ''p'' < 0.025 or ''p'' > 0.975 means that the RNG has "failed ...
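The ''p'' = ''F''(''X'') step can be illustrated directly: if a test statistic ''X'' is (approximately) standard normal under the null hypothesis, then ''F''(''X'') is (approximately) uniform on [0,1). The sketch below shows only that conversion; it is not Diehard's own code, and the statistic value is invented.

    /* Sketch of the p = F(X) step for a statistic X that is approximately
     * standard normal under the null hypothesis. */
    #include <math.h>
    #include <stdio.h>

    /* Standard normal CDF via the C99 error function. */
    static double normal_cdf (double x)
    {
        return 0.5 * (1.0 + erf (x / sqrt (2.0)));
    }

    int main (void)
    {
        double x = 2.17;                /* hypothetical test statistic */
        double p = normal_cdf (x);      /* p = F(X) */
        printf ("p = %.4f\n", p);       /* values very near 0 or 1 signal failure */
        return 0;
    }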

Software Library
In computing, a library is a collection of resources that can be leveraged during software development to implement a computer program. Commonly, a library consists of executable code such as compiled functions and classes, or a library can be a collection of source code. A resource library may contain data such as images and text. A library can be used by multiple, independent consumers (programs and other libraries). This differs from resources defined in a program, which can usually only be used by that program. When a consumer uses a library resource, it gains the value of the library without having to implement it itself. Libraries encourage software reuse in a modular fashion. Libraries can use other libraries, resulting in a hierarchy of libraries in a program. When writing code that uses a library, a programmer only needs to know how to use it, not its internal details. For example, a program could use a library that abstracts a complicated system call so that the ...
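The reuse idea is visible in even the smallest C program that links against a library: the program calls sqrt without implementing it, and the compiled code is supplied by the math library. A minimal sketch:

    /* Illustration of library reuse: the program calls sqrt() without
     * implementing it; the compiled code comes from the C math library. */
    #include <math.h>
    #include <stdio.h>

    int main (void)
    {
        printf ("%f\n", sqrt (2.0));
        return 0;
    }

On Unix-like systems such a program is typically built with something like "gcc example.c -lm", where the -lm flag names the math library to link against.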

Computer Libraries
A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (''computation''). Modern digital electronic computers can perform generic sets of operations known as ''programs'', which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation; or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of compute ...

Harvey Mudd College
Harvey Mudd College (HMC) is a private liberal arts college in Claremont, California, focused on science and engineering. It is part of the Claremont Colleges, which share adjoining campus grounds and resources. The college enrolls 902 undergraduate students and awards the Bachelor of Science degree. The college was funded by the friends and family of Harvey Seeley Mudd, one of the initial investors in the Cyprus Mines Corporation. Although involved in the planning of the new institution, Mudd died before it opened in 1955. The campus was designed by Edward Durell Stone in a modernist brutalist style. History Harvey Mudd College was founded in 1955. Classes began in 1957 with a class of 48 students, 7 faculty and one building, Mildred E. Mudd Hall, a dormitory. Classes and meals took place at Claremont Men's College (Claremont McKenna College), and labs in the Baxter Science Building, until additional buildings could be built: Jacobs Science Building (1959), Thomas-Garett ...

Red Hat Linux
Red Hat Linux was a widely used commercial open-source Linux distribution created by Red Hat until its discontinuation in 2004. Early releases of Red Hat Linux were called Red Hat Commercial Linux. Red Hat published the first non-beta release in May 1995. It included the Red Hat Package Manager as its packaging format, and over time RPM has served as the starting point for several other distributions, such as Mandriva Linux and Yellow Dog Linux. In 2003, Red Hat discontinued the Red Hat Linux line in favor of Red Hat Enterprise Linux (RHEL) for enterprise environments. Fedora Linux, developed by the community-supported Fedora Project and sponsored by Red Hat, is a free-of-cost alternative intended for home use. Red Hat Linux 9, the final release, reached its official end of life on April 30, 2004, although the Fedora Legacy project continued to publish updates for it through 2006, until that effort was discontinued in early 2007. Features Version 3.0.3 was one of the first Linux ...

Pentium 4
Pentium 4 is a series of single-core central processing units (CPUs) for desktops, laptops and entry-level servers manufactured by Intel. The processors were shipped from November 20, 2000 until August 8, 2008. All Pentium 4 CPUs are based on the NetBurst microarchitecture, the successor to P6. The Willamette (180 nm) core introduced SSE2, while Prescott (90 nm) introduced SSE3 and later 64-bit technology. Later versions introduced Hyper-Threading Technology (HTT). The first Pentium 4-branded processor to implement 64-bit (x86-64) was the Prescott (90 nm) (February 2004), but this feature was not enabled. Intel subsequently began selling 64-bit Pentium 4s using the ''"E0" revision'' of the Prescotts, sold on the OEM market as the Pentium 4, model F. The E0 revision also adds eXecute Disable (XD) (Intel's name for the NX bit) to Intel 64. Int ...
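Whether a given x86 processor provides the SSE2 and SSE3 extensions mentioned above can be checked at run time. The sketch below relies on a GCC/Clang builtin, which is an assumption about the toolchain rather than anything specific to the Pentium 4 itself.

    /* Run-time check for the SSE2/SSE3 instruction-set extensions, using
     * the GCC/Clang CPU-detection builtins on x86 targets. */
    #include <stdio.h>

    int main (void)
    {
        __builtin_cpu_init ();
        printf ("SSE2: %s\n", __builtin_cpu_supports ("sse2") ? "yes" : "no");
        printf ("SSE3: %s\n", __builtin_cpu_supports ("sse3") ? "yes" : "no");
        return 0;
    }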

P-value
In null-hypothesis significance testing, the ''p''-value is the probability of obtaining test results at least as extreme as the result actually observed, under the assumption that the null hypothesis is correct. A very small ''p''-value means that such an extreme observed outcome would be very unlikely ''under the null hypothesis''. Even though reporting ''p''-values of statistical tests is common practice in academic publications of many quantitative fields, misinterpretation and misuse of ''p''-values are widespread and have been a major topic in mathematics and metascience. In 2016, the American Statistical Association (ASA) made a formal statement that "''p''-values do not measure the probability that the studied hypothesis is true, or the probability that the data were produced by random chance alone" and that "a ''p''-value, or statistical significance, does not measure the size of an effect or the importance of a result" or provide "a good measure of evidence regarding a model or hypothesis". That ...
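As a worked example of the definition, consider testing whether a coin is fair after observing 60 heads in 100 tosses: the one-sided ''p''-value is the probability, under the fair-coin null hypothesis, of seeing 60 or more heads. The numbers in the sketch below are purely illustrative.

    /* Worked example of a p-value: probability, under the "fair coin" null
     * hypothesis, of seeing at least k heads in n tosses. */
    #include <math.h>
    #include <stdio.h>

    /* log of the binomial coefficient C(n, k), via lgamma to avoid overflow */
    static double log_choose (int n, int k)
    {
        return lgamma (n + 1.0) - lgamma (k + 1.0) - lgamma (n - k + 1.0);
    }

    int main (void)
    {
        int n = 100, k = 60, i;
        double p = 0.0;

        /* P(X >= k) for X ~ Binomial(n, 0.5): sum the upper tail */
        for (i = k; i <= n; i++)
            p += exp (log_choose (n, i) + n * log (0.5));
        printf ("one-sided p-value = %.4f\n", p);   /* about 0.028 */
        return 0;
    }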

George Marsaglia
George Marsaglia (March 12, 1924 – February 15, 2011) was an American mathematician and computer scientist. He is best known for creating the diehard tests, a suite of software for measuring statistical randomness. Research on random numbers George Marsaglia established the lattice structure of linear congruential generators in the paper "Random numbers fall mainly in the planes", later termed Marsaglia's theorem. This phenomenon means that ''n''-tuples with coordinates obtained from consecutive use of the generator will lie on a small number of equally spaced hyperplanes in ''n''-dimensional space. He also developed the diehard tests, a series of tests to determine whether or not a sequence of numbers has the statistical properties that could be expected from a random sequence. In 1995 he published a CD-ROM of random numbers, which included the diehard tests. His diehard paper came with the quotation "Nothing is random, only uncertain" attributed to ''Gail Gasram'', ...
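The lattice effect is easy to reproduce with the infamous RANDU generator (multiplier 65539, modulus 2^31), whose consecutive outputs satisfy x[k+2] = 6·x[k+1] − 9·x[k] (mod 2^31), so every triple of consecutive values falls on a small number of parallel planes in three dimensions. RANDU is used below only as a well-known worst case, not as a generator discussed in the excerpt above.

    /* Sketch illustrating the "planes" observation with the classic RANDU
     * generator: each triple of consecutive outputs satisfies
     * x[k+2] = 6*x[k+1] - 9*x[k]  (mod 2^31). */
    #include <stdio.h>
    #include <stdint.h>

    #define MOD ((uint64_t)1 << 31)

    static uint64_t seed = 1;

    static uint64_t randu (void)
    {
        seed = (65539 * seed) % MOD;
        return seed;
    }

    int main (void)
    {
        int k;
        uint64_t x0, x1, x2, predicted;

        x0 = randu ();
        x1 = randu ();
        for (k = 0; k < 5; k++) {
            x2 = randu ();
            /* -9*x0 is computed as +9*(MOD - x0) to keep the arithmetic non-negative */
            predicted = (6 * x1 + 9 * (MOD - x0)) % MOD;
            printf ("x[k+2] = %10llu  predicted = %10llu\n",
                    (unsigned long long) x2, (unsigned long long) predicted);
            x0 = x1;
            x1 = x2;
        }
        return 0;
    }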

ANSI C
ANSI C, ISO C, and Standard C are successive standards for the C programming language published by the American National Standards Institute (ANSI) and ISO/IEC JTC 1/SC 22/WG 14 of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). Historically, the names referred specifically to the original and best-supported version of the standard (known as C89 or C90). Software developers writing in C are encouraged to conform to the standards, as doing so helps portability between compilers. History and outlook The first standard for C was published by ANSI. Although this document was subsequently adopted by ISO/IEC and subsequent revisions published by ISO/IEC have been adopted by ANSI, "ANSI C" is still used to refer to the standard. While some software developers use the term ISO C, others are standards-body neutral and use Standard C. Informal specification: K&R C (''C78'') Informal specification in 1978 (Brian Kernig ...
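Conformance to the original standard can be checked mechanically: compilers such as GCC and Clang accept flags (for example -ansi or -std=c89, together with -pedantic) that restrict the accepted dialect to C89/C90. A minimal program written against that standard:

    /* A minimal program written against the original ANSI/ISO C standard
     * (C89/C90): declarations precede statements and no // comments are used.
     * It can be compiled with, e.g., "gcc -ansi -pedantic -Wall hello.c". */
    #include <stdio.h>

    int main (void)
    {
        const char *msg = "hello, world";

        printf ("%s\n", msg);
        return 0;
    }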