
In linear algebra, the column space (also called the range or image) of a matrix ''A'' is the span (set of all possible linear combinations) of its column vectors. The column space of a matrix is the image or range of the corresponding matrix transformation.
Let ''F'' be a field. The column space of an ''m'' × ''n'' matrix with components from ''F'' is a linear subspace of the ''m''-space ''F''<sup>''m''</sup>. The dimension of the column space is called the rank of the matrix and is at most min(''m'', ''n'').[Linear algebra, as discussed in this article, is a very well established mathematical discipline for which there are many sources. Almost all of the material in this article can be found in Lay 2005, Meyer 2001, and Strang 2005.] A definition for matrices over a ring is also possible.

The row space is defined similarly.

The row space and the column space of a matrix ''A'' are sometimes denoted as C(''A''<sup>T</sup>) and C(''A'') respectively.

This article considers matrices of real numbers. The row and column spaces are subspaces of the real spaces R<sup>''n''</sup> and R<sup>''m''</sup> respectively.
Overview
Let ''A'' be an ''m''-by-''n'' matrix. Then
* rank(''A'') = rank(''A''<sup>T</sup>),
* rank(''A'') = number of pivots in any echelon form of ''A'',
* rank(''A'') = the maximum number of linearly independent rows or columns of ''A''.
If the matrix represents a linear transformation, the column space of the matrix equals the image of this linear transformation.

The column space of a matrix ''A'' is the set of all linear combinations of the columns in ''A''. If ''A'' = [a_1, \ldots, a_n], then the column space of ''A'' is span\{a_1, \ldots, a_n\}.

Given a matrix ''A'', the action of the matrix ''A'' on a vector x returns a linear combination of the columns of ''A'' with the coordinates of x as coefficients; that is, the columns of the matrix generate the column space.
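The column-view of matrix–vector multiplication described above can be sketched in a few lines of plain Python (the small matrix here is an illustrative choice, not from the source):

```python
# Sketch: A·x equals the linear combination of A's columns with the
# entries of x as coefficients. Pure Python, no external libraries.

def mat_vec(A, x):
    """Row view: each output entry is a row of A dotted with x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def column_combination(A, x):
    """Column view: sum x_j times the j-th column of A."""
    m, n = len(A), len(A[0])
    result = [0] * m
    for j in range(n):
        for i in range(m):
            result[i] += x[j] * A[i][j]
    return result

A = [[1, 0],
     [0, 1],
     [2, 0]]
x = [3, 4]
# Both views agree: 3*(1,0,2) + 4*(0,1,0) = (3, 4, 6)
assert mat_vec(A, x) == column_combination(A, x) == [3, 4, 6]
```

The identity of the two views is exactly the statement that the products ''A''x, over all x, fill out the span of the columns.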
Example
Given a matrix ''J'':
:J = \begin{bmatrix} 2 & 4 & 1 & 3 & 2 \\ -1 & -2 & 1 & 0 & 5 \\ 1 & 6 & 2 & 2 & 2 \\ 3 & 6 & 2 & 5 & 1 \end{bmatrix}
the rows are
:\mathbf{r}_1 = (2, 4, 1, 3, 2),
:\mathbf{r}_2 = (-1, -2, 1, 0, 5),
:\mathbf{r}_3 = (1, 6, 2, 2, 2),
:\mathbf{r}_4 = (3, 6, 2, 5, 1).
Consequently, the row space of ''J'' is the subspace of R<sup>5</sup> spanned by \{\mathbf{r}_1, \mathbf{r}_2, \mathbf{r}_3, \mathbf{r}_4\}. Since these four row vectors are linearly independent, the row space is 4-dimensional. Moreover, in this case it can be seen that they are all orthogonal to the vector \mathbf{n} = (6, -1, 4, -4, 0) (\mathbf{n} is an element of the kernel of ''J''), so it can be deduced that the row space consists of all vectors in R<sup>5</sup> that are orthogonal to \mathbf{n}.
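The orthogonality claim is easy to verify mechanically. The entries of ''J'' and \mathbf{n} below are a reconstruction of the example's values, consistent with the surrounding text:

```python
# Check that every row of the example matrix J is orthogonal to n,
# so n lies in the null space (kernel) of J.
# (Entries reconstructed from the example; verify against your source.)

J = [[ 2,  4, 1, 3, 2],
     [-1, -2, 1, 0, 5],
     [ 1,  6, 2, 2, 2],
     [ 3,  6, 2, 5, 1]]
n = [6, -1, 4, -4, 0]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

for row in J:
    assert dot(row, n) == 0  # each row is perpendicular to n
```

Since the four rows are independent and all perpendicular to \mathbf{n}, the row space is the full 4-dimensional orthogonal complement of \mathbf{n} in R<sup>5</sup>.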
Column space
Definition
Let ''K'' be a field of scalars. Let ''A'' be an ''m'' × ''n'' matrix, with column vectors \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n. A linear combination of these vectors is any vector of the form
:c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n,
where c_1, c_2, \ldots, c_n are scalars. The set of all possible linear combinations of \mathbf{v}_1, \ldots, \mathbf{v}_n is called the column space of ''A''. That is, the column space of ''A'' is the span of the vectors \mathbf{v}_1, \ldots, \mathbf{v}_n.

Any linear combination of the column vectors of a matrix ''A'' can be written as the product of ''A'' with a column vector:
:A \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} = c_1 \mathbf{v}_1 + \cdots + c_n \mathbf{v}_n
Therefore, the column space of ''A'' consists of all possible products ''A''x, for x ∈ ''K''<sup>''n''</sup>. This is the same as the image (or range) of the corresponding matrix transformation.
Example
If
:A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 2 & 0 \end{bmatrix},
then the column vectors are \mathbf{v}_1 = (1, 0, 2)^\mathsf{T} and \mathbf{v}_2 = (0, 1, 0)^\mathsf{T}.
A linear combination of \mathbf{v}_1 and \mathbf{v}_2 is any vector of the form
:c_1 \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} + c_2 \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \\ 2c_1 \end{bmatrix}
The set of all such vectors is the column space of ''A''. In this case, the column space is precisely the set of vectors (''x'', ''y'', ''z'') ∈ R<sup>3</sup> satisfying the equation ''z'' = 2''x'' (using Cartesian coordinates, this set is a plane through the origin in three-dimensional space).
Basis
The columns of ''A'' span the column space, but they may not form a basis if the column vectors are not linearly independent. Fortunately, elementary row operations do not affect the dependence relations between the column vectors. This makes it possible to use row reduction to find a basis for the column space.

For example, consider the matrix
:A = \begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix}
The columns of this matrix span the column space, but they may not be linearly independent, in which case some subset of them will form a basis. To find this basis, we reduce ''A'' to reduced row echelon form:
:\begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix}
At this point, it is clear that the first, second, and fourth columns are linearly independent, while the third column is a linear combination of the first two. (Specifically, \mathbf{v}_3 = -2\mathbf{v}_1 + \mathbf{v}_2.) Therefore, the first, second, and fourth columns of the original matrix are a basis for the column space:
:\begin{bmatrix} 1 \\ 2 \\ 1 \\ 1 \end{bmatrix},\quad \begin{bmatrix} 3 \\ 7 \\ 5 \\ 2 \end{bmatrix},\quad \begin{bmatrix} 4 \\ 9 \\ 1 \\ 8 \end{bmatrix}
Note that the independent columns of the reduced row echelon form are precisely the columns with pivots. This makes it possible to determine which columns are linearly independent by reducing only to echelon form.

The above algorithm can be used in general to find the dependence relations between any set of vectors, and to pick out a basis from any spanning set. Also, finding a basis for the column space of ''A'' is equivalent to finding a basis for the row space of the transpose matrix ''A''<sup>T</sup>.

To find the basis in a practical setting (e.g., for large matrices), the singular-value decomposition is typically used.
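The pivot-column procedure above can be sketched as a short exact-arithmetic program (pure Python, using `fractions` to avoid floating-point error; the matrix entries repeat the example as reconstructed above):

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form with exact arithmetic.
    Returns (R, pivot_cols), where pivot_cols are 0-based indices."""
    R = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(R), len(R[0])
    pivots, r = [], 0
    for c in range(cols):
        # find a row with a nonzero entry in column c, at or below row r
        pr = next((i for i in range(r, rows) if R[i][c] != 0), None)
        if pr is None:
            continue
        R[r], R[pr] = R[pr], R[r]          # swap pivot row into place
        R[r] = [v / R[r][c] for v in R[r]]  # scale pivot to 1
        for i in range(rows):               # clear the rest of column c
            if i != r and R[i][c] != 0:
                f = R[i][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
    return R, pivots

A = [[1, 3, 1, 4],
     [2, 7, 3, 9],
     [1, 5, 3, 1],
     [1, 2, 0, 8]]
R, pivots = rref(A)
assert pivots == [0, 1, 3]  # columns 1, 2 and 4 carry pivots
# Basis of the column space: the pivot columns of the ORIGINAL matrix
basis = [[row[c] for row in A] for c in pivots]
```

The basis is read off the original matrix, not the reduced one, because row operations change the column space while preserving the columns' dependence relations.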
Dimension
The dimension of the column space is called the rank of the matrix. The rank is equal to the number of pivots in the reduced row echelon form, and is the maximum number of linearly independent columns that can be chosen from the matrix. For example, the 4 × 4 matrix in the example above has rank three.

Because the column space is the image of the corresponding matrix transformation, the rank of a matrix is the same as the dimension of the image. For example, the transformation R<sup>4</sup> → R<sup>4</sup> described by the matrix above maps all of R<sup>4</sup> to some three-dimensional subspace.

The nullity of a matrix is the dimension of the null space, and is equal to the number of columns in the reduced row echelon form that do not have pivots. The rank and nullity of a matrix ''A'' with ''n'' columns are related by the equation:
:\operatorname{rank}(A) + \operatorname{nullity}(A) = n.
This is known as the rank–nullity theorem.
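A minimal numeric check of the rank–nullity relation, using forward elimination only (exact arithmetic; the matrix repeats the 4 × 4 example as reconstructed above):

```python
from fractions import Fraction

def rank(M):
    """Rank = number of pivots found by forward Gaussian elimination."""
    R = [[Fraction(x) for x in row] for row in M]
    rows, cols, r = len(R), len(R[0]), 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if R[i][c] != 0), None)
        if pr is None:
            continue  # no pivot in this column: it is a "free" column
        R[r], R[pr] = R[pr], R[r]
        for i in range(r + 1, rows):
            f = R[i][c] / R[r][c]
            R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        r += 1
    return r

A = [[1, 3, 1, 4],
     [2, 7, 3, 9],
     [1, 5, 3, 1],
     [1, 2, 0, 8]]
n = len(A[0])
# rank + nullity = number of columns: 3 + 1 = 4
assert rank(A) == 3 and n - rank(A) == 1
```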
Relation to the left null space
The left null space of ''A'' is the set of all vectors x such that x<sup>T</sup>''A'' = 0<sup>T</sup>. It is the same as the null space of the transpose of ''A''. The product of the matrix ''A''<sup>T</sup> and the vector x can be written in terms of the dot product of vectors:
:A^\mathsf{T}\mathbf{x} = \begin{bmatrix} \mathbf{v}_1 \cdot \mathbf{x} \\ \mathbf{v}_2 \cdot \mathbf{x} \\ \vdots \\ \mathbf{v}_n \cdot \mathbf{x} \end{bmatrix}
because the row vectors of ''A''<sup>T</sup> are transposes of column vectors \mathbf{v}_k of ''A''. Thus ''A''<sup>T</sup>x = 0 if and only if x is orthogonal (perpendicular) to each of the column vectors of ''A''.

It follows that the left null space (the null space of ''A''<sup>T</sup>) is the orthogonal complement to the column space of ''A''.
For a matrix ''A'', the column space, row space, null space, and left null space are sometimes referred to as the ''four fundamental subspaces''.
For matrices over a ring
Similarly, the column space (sometimes disambiguated as ''right'' column space) can be defined for matrices over a ring ''K'' as
:\sum_{k=1}^n \mathbf{v}_k c_k
for any c_1, \ldots, c_n, with replacement of the vector ''m''-space with "right free module", which changes the order of scalar multiplication of the vector \mathbf{v}_k to the scalar c_k such that it is written in an unusual order ''vector''–''scalar''.[Important only if ''K'' is not commutative.] Actually, this form is merely the product ''A''c of the matrix ''A'' with the column vector c from ''K''<sup>''n''</sup>, in which the order of factors is ''preserved'', unlike the formula above.
Row space
Definition
Let ''K'' be a field of scalars. Let ''A'' be an ''m'' × ''n'' matrix, with row vectors \mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_m. A linear combination of these vectors is any vector of the form
:c_1 \mathbf{r}_1 + c_2 \mathbf{r}_2 + \cdots + c_m \mathbf{r}_m,
where c_1, c_2, \ldots, c_m are scalars. The set of all possible linear combinations of \mathbf{r}_1, \ldots, \mathbf{r}_m is called the row space of ''A''. That is, the row space of ''A'' is the span of the vectors \mathbf{r}_1, \ldots, \mathbf{r}_m.

For example, if
:A = \begin{bmatrix} 1 & 0 & 2 \\ 0 & 1 & 0 \end{bmatrix}
then the row vectors are \mathbf{r}_1 = (1, 0, 2) and \mathbf{r}_2 = (0, 1, 0). A linear combination of \mathbf{r}_1 and \mathbf{r}_2 is any vector of the form
:c_1 (1, 0, 2) + c_2 (0, 1, 0) = (c_1, c_2, 2c_1).
The set of all such vectors is the row space of ''A''. In this case, the row space is precisely the set of vectors (''x'', ''y'', ''z'') ∈ K<sup>3</sup> satisfying the equation ''z'' = 2''x'' (using Cartesian coordinates, this set is a plane through the origin in three-dimensional space).
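The plane ''z'' = 2''x'' can be confirmed by sampling a few combinations (matrix entries are a hedged reconstruction of the example above):

```python
# Row space of A = [[1,0,2],[0,1,0]]: every combination c1*r1 + c2*r2
# has the form (c1, c2, 2*c1), i.e. it satisfies z = 2x.

rows = [(1, 0, 2), (0, 1, 0)]

def combo(c1, c2):
    return tuple(c1 * a + c2 * b for a, b in zip(rows[0], rows[1]))

for c1, c2 in [(1, 0), (0, 1), (3, -2), (-5, 7)]:
    x, y, z = combo(c1, c2)
    assert z == 2 * x  # every combination lies on the plane z = 2x
```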
For a matrix that represents a homogeneous system of linear equations, the row space consists of all linear equations that follow from those in the system.

The column space of ''A'' is equal to the row space of ''A''<sup>T</sup>.
Basis
The row space is not affected by elementary row operations. This makes it possible to use row reduction to find a basis for the row space.

For example, consider the matrix
:A = \begin{bmatrix} 1 & 3 & 2 \\ 2 & 7 & 4 \\ 1 & 5 & 2 \end{bmatrix}
The rows of this matrix span the row space, but they may not be linearly independent, in which case the rows will not be a basis. To find a basis, we reduce ''A'' to row echelon form, where r_1, r_2, r_3 represent the rows:
:\begin{bmatrix} 1 & 3 & 2 \\ 2 & 7 & 4 \\ 1 & 5 & 2 \end{bmatrix} \xrightarrow{r_2 \to r_2 - 2r_1,\ r_3 \to r_3 - r_1} \begin{bmatrix} 1 & 3 & 2 \\ 0 & 1 & 0 \\ 0 & 2 & 0 \end{bmatrix} \xrightarrow{r_3 \to r_3 - 2r_2} \begin{bmatrix} 1 & 3 & 2 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix}
Once the matrix is in echelon form, the nonzero rows are a basis for the row space. In this case, the basis is \{(1, 3, 2), (0, 1, 0)\}. Another possible basis, \{(1, 0, 2), (0, 1, 0)\}, comes from a further reduction.
[The example is valid over the real numbers, the rational numbers, and other number fields. It is not necessarily correct over fields and rings with non-zero characteristic.]
This algorithm can be used in general to find a basis for the span of a set of vectors. If the matrix is further simplified to reduced row echelon form, then the resulting basis is uniquely determined by the row space.

It is sometimes convenient to find a basis for the row space from among the rows of the original matrix instead (for example, this result is useful in giving an elementary proof that the determinantal rank of a matrix is equal to its rank). Since row operations can affect linear dependence relations of the row vectors, such a basis is instead found indirectly using the fact that the column space of ''A''<sup>T</sup> is equal to the row space of ''A''. Using the example matrix ''A'' above, find ''A''<sup>T</sup> and reduce it to row echelon form:
:A^\mathsf{T} = \begin{bmatrix} 1 & 2 & 1 \\ 3 & 7 & 5 \\ 2 & 4 & 2 \end{bmatrix} \to \begin{bmatrix} 1 & 2 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{bmatrix}
The pivots indicate that the first two columns of ''A''<sup>T</sup> form a basis of the column space of ''A''<sup>T</sup>. Therefore, the first two rows of ''A'' (before any row reductions) also form a basis of the row space of ''A''.
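The transpose trick can be sketched as follows (exact arithmetic in pure Python; the 3 × 3 matrix repeats the example as reconstructed above):

```python
from fractions import Fraction

def pivot_columns(M):
    """Indices of columns that receive pivots under forward elimination."""
    R = [[Fraction(x) for x in row] for row in M]
    rows, cols, r, piv = len(R), len(R[0]), 0, []
    for c in range(cols):
        pr = next((i for i in range(r, rows) if R[i][c] != 0), None)
        if pr is None:
            continue
        R[r], R[pr] = R[pr], R[r]
        for i in range(r + 1, rows):
            f = R[i][c] / R[r][c]
            R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        piv.append(c)
        r += 1
    return piv

A = [[1, 3, 2],
     [2, 7, 4],
     [1, 5, 2]]
At = [list(col) for col in zip(*A)]   # transpose of A
piv = pivot_columns(At)
assert piv == [0, 1]                  # pivots in the first two columns of A^T
row_basis = [A[i] for i in piv]       # hence rows 1 and 2 of the ORIGINAL A
assert row_basis == [[1, 3, 2], [2, 7, 4]]
```

Because the pivot columns of ''A''<sup>T</sup> are its first two columns, the corresponding original rows of ''A'' form a basis of the row space without any row of ''A'' being altered.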
Dimension
The dimension of the row space is called the rank of the matrix. This is the same as the maximum number of linearly independent rows that can be chosen from the matrix, or equivalently the number of pivots. For example, the 3 × 3 matrix in the example above has rank two.

The rank of a matrix is also equal to the dimension of the column space. The dimension of the null space is called the nullity of the matrix, and is related to the rank by the following equation:
:\operatorname{rank}(A) + \operatorname{nullity}(A) = n,
where ''n'' is the number of columns of the matrix ''A''. The equation above is known as the rank–nullity theorem.
Relation to the null space
The null space of matrix ''A'' is the set of all vectors x for which ''A''x = 0. The product of the matrix ''A'' and the vector x can be written in terms of the dot product of vectors:
:A\mathbf{x} = \begin{bmatrix} \mathbf{r}_1 \cdot \mathbf{x} \\ \mathbf{r}_2 \cdot \mathbf{x} \\ \vdots \\ \mathbf{r}_m \cdot \mathbf{x} \end{bmatrix},
where \mathbf{r}_1, \ldots, \mathbf{r}_m are the row vectors of ''A''. Thus ''A''x = 0 if and only if x is orthogonal (perpendicular) to each of the row vectors of ''A''.

It follows that the null space of ''A'' is the orthogonal complement to the row space. For example, if the row space is a plane through the origin in three dimensions, then the null space will be the perpendicular line through the origin. This provides a proof of the rank–nullity theorem (see dimension above).
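The plane/line picture can be made concrete with the earlier 2 × 3 example (a hedged reconstruction): the row space of ''A'' = [[1, 0, 2], [0, 1, 0]] is the plane ''z'' = 2''x'', and its null space is the perpendicular line through the origin spanned by (−2, 0, 1).

```python
# For A = [[1,0,2],[0,1,0]], v = (-2, 0, 1) solves Ax = 0,
# so v spans the null space and is orthogonal to every row of A.

A = [(1, 0, 2), (0, 1, 0)]
v = (-2, 0, 1)

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

assert all(dot(row, v) == 0 for row in A)  # v is perpendicular to each row
# dim(row space) + dim(null space) = 2 + 1 = 3, matching rank-nullity
```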
The row space and null space are two of the
four fundamental subspaces associated with a matrix (the other two being the
column space and
left null space).
Relation to coimage
If ''V'' and ''W'' are vector spaces, then the kernel of a linear transformation ''T'': ''V'' → ''W'' is the set of vectors v ∈ ''V'' for which ''T''(v) = 0. The kernel of a linear transformation is analogous to the null space of a matrix.

If ''V'' is an inner product space, then the orthogonal complement to the kernel can be thought of as a generalization of the row space. This is sometimes called the coimage of ''T''. The transformation ''T'' is one-to-one on its coimage, and the coimage maps isomorphically onto the image of ''T''.

When ''V'' is not an inner product space, the coimage of ''T'' can be defined as the quotient space ''V''/ker(''T'').
See also
*Euclidean subspace

References & Notes

External links
*MIT Linear Algebra Lecture on the Four Fundamental Subspaces at Google Video, from MIT OpenCourseWare
*Khan Academy video tutorial
*Lecture on column space and nullspace by Gilbert Strang of MIT