In mathematics, division by zero is division where the divisor (denominator) is zero. Such a division can be formally expressed as a/0 where a is the dividend (numerator). In ordinary arithmetic, the expression has no meaning, as there is no number which, when multiplied by 0, gives a (assuming a ≠ 0), and so division by zero is undefined. Since any number multiplied by zero is zero, the expression 0/0 is also undefined; when it is the form of a limit, it is an indeterminate form. Historically, one of the earliest recorded references to the mathematical impossibility of assigning a value to a/0 is contained in George Berkeley's criticism of infinitesimal calculus in 1734 in The Analyst ("ghosts of departed quantities").[1]

There are mathematical structures in which a/0 is defined for some a such as in the Riemann sphere and the projectively extended real line; however, such structures do not satisfy every ordinary rule of arithmetic (the field axioms).

In computing, a program error may result from an attempt to divide by zero. Depending on the programming environment and the type of number (e.g. floating point, integer) being divided by zero, it may generate positive or negative infinity by the IEEE 754 floating-point standard, generate an exception, generate an error message, cause the program to terminate, result in a special not-a-number value,[2] or crash.
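The variation in behavior can be seen in Python, which raises a ZeroDivisionError exception for both integer and floating-point division by zero rather than producing IEEE 754 infinities. Below is a minimal sketch (the helper name `safe_div` is ours, not a standard API) that catches the exception and maps it onto the IEEE 754 outcomes described above:

```python
import math

def safe_div(a, b):
    """Divide two numbers, returning IEEE 754-style results instead of raising.

    Python raises ZeroDivisionError for any zero divisor; this helper
    maps that case onto the IEEE 754 outcomes: 0/0 -> NaN, and a/0 ->
    positive or negative infinity following the usual sign rules.
    """
    try:
        return a / b
    except ZeroDivisionError:
        if a == 0:
            return math.nan  # 0/0: not-a-number
        # Sign of the result follows the signs of a and b (b may be -0.0).
        return math.copysign(math.inf, a) * math.copysign(1.0, b)
```

For example, `safe_div(1.0, 0.0)` yields positive infinity, `safe_div(-1.0, 0.0)` yields negative infinity, and `safe_div(0.0, 0.0)` yields NaN, matching the IEEE 754 defaults.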

At first glance it seems possible to define a/0 by considering the limit of a/b as b approaches 0.

For any positive a, the limit from the right is

\lim_{b \to 0^{+}} \frac{a}{b} = +\infty;

however, the limit from the left is

\lim_{b \to 0^{-}} \frac{a}{b} = -\infty,

and so \lim_{b \to 0} \frac{a}{b} is undefined (the limit is also undefined for negative a).
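The two one-sided behaviors can be illustrated numerically; a small sketch with a = 1, using powers of two as divisors so the quotients are exact in binary floating point:

```python
# As b approaches 0 from the right, a/b grows without bound;
# as b approaches 0 from the left, a/b decreases without bound (here a = 1).
a = 1.0
right = [a / 2.0**-k for k in (1, 10, 100)]     # 2.0, 1024.0, ~1.27e30
left = [a / -(2.0**-k) for k in (1, 10, 100)]   # -2.0, -1024.0, ~-1.27e30
```

The quotients diverge to +infinity from one side and to -infinity from the other, which is why no single value can be assigned to a/0.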

Furthermore, there is no obvious definition of 0/0 that can be derived from considering the limit of a ratio. The limit

\lim_{(a,b) \to (0,0)} \frac{a}{b}

does not exist. Limits of the form