Baby-step Giant-step

In group theory, a branch of mathematics, the baby-step giant-step is a meet-in-the-middle algorithm for computing the discrete logarithm or order of an element in a finite abelian group, due to Daniel Shanks. The discrete log problem is of fundamental importance to the area of public-key cryptography.

Many of the most commonly used cryptographic systems are based on the assumption that the discrete log is extremely difficult to compute; the more difficult it is, the more security it provides for a data transfer. One way to increase the difficulty of the discrete log problem is to base the cryptosystem on a larger group.


Theory

The algorithm is based on a space–time tradeoff. It is a fairly simple modification of trial multiplication, the naive method of finding discrete logarithms.

Given a cyclic group G of order n, a generator \alpha of the group and a group element \beta, the problem is to find an integer x such that
: \alpha^x = \beta\,.
The baby-step giant-step algorithm is based on rewriting x:
:x = im + j
:m = \left\lceil \sqrt{n} \right\rceil
:0 \leq i < m
:0 \leq j < m
Therefore, we have:
: \alpha^x = \beta\,
: \alpha^{im+j} = \beta\,
: \alpha^j = \beta\left(\alpha^{-m}\right)^i\,
The algorithm precomputes \alpha^j for several values of j. Then it fixes an m and tries values of i in the right-hand side of the congruence above, in the manner of trial multiplication. It tests to see if the congruence is satisfied for any value of j, using the precomputed values of \alpha^j.
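As a small worked illustration (with numbers chosen here for concreteness rather than taken from the sources): in the multiplicative group modulo 23, which is cyclic of order n = 22 with generator \alpha = 5, take \beta = 3. Then m = \left\lceil \sqrt{22} \right\rceil = 5, the precomputed baby steps \alpha^0, \ldots, \alpha^4 are 1, 5, 2, 10, 4, and \alpha^{-5} \equiv 15 \pmod{23}. Multiplying \beta repeatedly by 15 gives the sequence 3, 22, 8, 5; the value 5 equals the precomputed \alpha^1 after i = 3 multiplications, so x = im + j = 3 \cdot 5 + 1 = 16, and indeed 5^{16} \equiv 3 \pmod{23}.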


The algorithm

Input: A cyclic group ''G'' of order ''n'', having a generator ''α'' and an element ''β''.
Output: A value ''x'' satisfying \alpha^x = \beta.
# ''m'' ← Ceiling(√''n'')
# For all ''j'' where 0 ≤ ''j'' < ''m'':
## Compute ''α''^''j'' and store the pair (''j'', ''α''^''j'') in a table. (See ''In practice'' below.)
# Compute ''α''^(−''m'').
# ''γ'' ← ''β''. (set ''γ'' = ''β'')
# For all ''i'' where 0 ≤ ''i'' < ''m'':
## Check to see if ''γ'' is the second component (''α''^''j'') of any pair in the table.
## If so, return ''im'' + ''j''.
## If not, ''γ'' ← ''γ'' • ''α''^(−''m'').
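The following is a minimal sketch of the procedure above in Python, assuming the group is the multiplicative group of integers modulo a prime ''p''; the function name bsgs and its parameters are illustrative, not taken from the sources.

 from math import isqrt

 def bsgs(alpha, beta, n, p):
     """Find x with alpha**x == beta (mod p), where n bounds the order of alpha."""
     m = isqrt(n - 1) + 1                      # m = ceil(sqrt(n))
     # Baby steps: table of alpha^j for 0 <= j < m, keyed by the group element.
     table = {pow(alpha, j, p): j for j in range(m)}
     factor = pow(alpha, -m, p)                # alpha^(-m) mod p (Python 3.8+)
     gamma = beta % p
     for i in range(m):                        # giant steps
         if gamma in table:
             return i * m + table[gamma]       # x = i*m + j
         gamma = (gamma * factor) % p
     return None                               # no solution with 0 <= x < m*m

For example, bsgs(5, 3, 22, 23) returns 16, matching the worked example in the previous section.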


In practice

The best way to speed up the baby-step giant-step algorithm is to use an efficient table lookup scheme. The best in this case is a hash table. The hashing is done on the second component, and to perform the check in the first step of the main loop, γ is hashed and the resulting memory address is checked. Since hash tables can retrieve and add elements in O(1) time (constant time), this does not slow down the overall baby-step giant-step algorithm.

The space complexity of the algorithm is O(\sqrt{n}), while the time complexity is O(\sqrt{n}). This running time is better than the O(n) running time of the naive brute-force calculation.

The baby-step giant-step algorithm could be used by an eavesdropper to derive the private key generated in the Diffie–Hellman key exchange, when the modulus is a prime number that is not too large. If the modulus is not prime, the Pohlig–Hellman algorithm has a smaller algorithmic complexity, and potentially solves the same problem.


Notes

* The baby-step giant-step algorithm is a generic algorithm. It works for every finite cyclic group.
* It is not necessary to know the exact order of the group ''G'' in advance. The algorithm still works if ''n'' is merely an upper bound on the group order.
* Usually the baby-step giant-step algorithm is used for groups whose order is prime. If the order of the group is composite then the Pohlig–Hellman algorithm is more efficient.
* The algorithm requires O(''m'') memory. It is possible to use less memory by choosing a smaller ''m'' in the first step of the algorithm. Doing so increases the running time, which then is O(''n''/''m''); a sketch of this trade-off appears after this list. Alternatively one can use Pollard's rho algorithm for logarithms, which has about the same running time as the baby-step giant-step algorithm, but only a small memory requirement.
* While this algorithm is credited to Daniel Shanks, who published the 1971 paper in which it first appears, a 1994 paper by Nechaev states that it was known to Gelfond in 1962.
* There exist optimized versions of the original algorithm, such as using collision-free truncated lookup tables, or negation maps combined with Montgomery's simultaneous modular inversion.
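The smaller-''m'' trade-off mentioned above can be sketched in Python under the same assumptions as the earlier sketch; the function name bsgs_bounded is illustrative. Choosing a table size ''m'' below \left\lceil \sqrt{n} \right\rceil reduces memory to O(''m''), while the number of giant steps grows to about ''n''/''m''.

 def bsgs_bounded(alpha, beta, n, p, m):
     """Like bsgs above, but with a caller-chosen table size m: memory O(m), time O(n/m)."""
     table = {pow(alpha, j, p): j for j in range(m)}   # m baby steps
     factor = pow(alpha, -m, p)
     gamma = beta % p
     for i in range(-(-n // m)):                       # ceil(n/m) giant steps
         if gamma in table:
             return i * m + table[gamma]
         gamma = (gamma * factor) % p
     return None

 print(bsgs_bounded(5, 3, 22, 23, 3))                  # prints 16 using only a 3-entry table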


Further reading

* H. Cohen, A course in computational algebraic number theory, Springer, 1996.
* D. Shanks, Class number, a theory of factorization and genera. In Proc. Symp. Pure Math. 20, pages 415–440. AMS, Providence, R.I., 1971.
* A. Stein and E. Teske, Optimized baby step-giant step methods, Journal of the Ramanujan Mathematical Society 20 (2005), no. 1, 1–32.
* A. V. Sutherland, Order computations in generic groups, PhD thesis, M.I.T., 2007.
* D. C. Terr, A modification of Shanks' baby-step giant-step algorithm, Mathematics of Computation 69 (2000), 767–773.



External links


* Baby step-Giant step – example C source code