An independent equation is an equation in a system of simultaneous equations which cannot be derived algebraically from the other equations. The concept typically arises in the context of linear equations. If it is possible to duplicate one of the equations in a system by multiplying each of the other equations by some number (potentially a different number for each equation) and summing the resulting equations, then that equation is dependent on the others. But if this is not possible, then that equation is independent of the others.
If an equation is independent of the other equations in its system, then it provides information beyond that which is provided by the other equations. In contrast, if an equation is dependent on the others, then it provides no information not contained in the others collectively, and the equation can be dropped from the system without any information loss.
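For instance, in the following small system (an illustrative example constructed here, not taken from the article),

\begin{align}
  x + y   &= 2, \\
  2x + 2y &= 4, \\
  x - y   &= 0,
\end{align}

the second equation is twice the first, so it is dependent on the others and can be dropped without losing any information, while the third equation cannot be produced by any such combination of the first two and is therefore independent of them.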

The number of independent equations in a system equals the rank of the augmented matrix of the system, that is, the system's coefficient matrix with one additional column appended, that column being the column vector of constants.
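As a rough sketch of how this rank criterion can be checked numerically (an illustration using NumPy; the specific system and code are assumptions made here, not part of the article):

```python
import numpy as np

# Example system in two unknowns x, y:
#    x +  y = 2
#   2x + 2y = 4   (twice the first equation, hence dependent)
#    x -  y = 0
A = np.array([[1.0,  1.0],
              [2.0,  2.0],
              [1.0, -1.0]])          # coefficient matrix
b = np.array([[2.0], [4.0], [0.0]])  # column vector of constants

augmented = np.hstack([A, b])        # coefficient matrix with the constants appended

# The rank of the augmented matrix gives the number of independent equations.
print(np.linalg.matrix_rank(augmented))  # prints 2, not 3
```

Here the rank is 2 rather than 3, reflecting that only two of the three equations are independent.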
The number of independent equations in a system of
consistent equations (a system that has at least one solution) can never be greater than the number of unknowns. Equivalently, if a system has more independent equations than unknowns, it is inconsistent and has no solutions.
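As a small illustration of the converse statement (again an example constructed here), the one-unknown system

\begin{align}
  x &= 1, \\
  x &= 2
\end{align}

contains two independent equations, since neither can be obtained by multiplying the other by a single number, and it indeed has no solution: more independent equations than unknowns forces inconsistency.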