In mathematics, particularly in matrix theory, a permutation matrix is a square binary matrix that has exactly one entry of 1 in each row and each column, with all other entries 0.
An $n \times n$ permutation matrix can represent a permutation of $n$ elements. Pre-multiplying an $n$-row matrix ''M'' by a permutation matrix ''P'', forming ''PM'', results in permuting the rows of ''M'', while post-multiplying an $n$-column matrix ''M'', forming ''MP'', permutes the columns of ''M''.
Every permutation matrix ''P'' is orthogonal, with its inverse equal to its transpose:
:$P^{-1} = P^\mathsf{T}.$
Indeed, permutation matrices can be characterized as the orthogonal matrices whose entries are all non-negative.
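As a concrete sanity check of these defining properties, here is a minimal NumPy sketch (an illustration added here, with an arbitrary sample permutation; arrays are 0-indexed where the text is 1-indexed):

<syntaxhighlight lang="python">
import numpy as np

# A sample permutation pi of {1, 2, 3, 4}, stored 0-indexed: pi maps j to perm[j].
perm = np.array([2, 1, 3, 0])

# Build the column-based permutation matrix: entry (i, j) is 1 exactly when i = pi(j).
P = np.zeros((4, 4), dtype=int)
P[perm, np.arange(4)] = 1

# Exactly one 1 in each row and each column, all other entries 0.
assert (P.sum(axis=0) == 1).all() and (P.sum(axis=1) == 1).all()

# Orthogonality: the inverse equals the transpose.
assert np.allclose(np.linalg.inv(P), P.T)
</syntaxhighlight>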
The two permutation/matrix correspondences
There are two natural one-to-one correspondences between permutations and permutation matrices, one of which works along the rows of the matrix, the other along its columns. Here is an example, starting with a permutation $\pi$ in two-line form at the upper left:
:$\pi = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 3 & 2 & 4 & 1 \end{pmatrix} \qquad R_\pi = \begin{pmatrix} 0&0&1&0 \\ 0&1&0&0 \\ 0&0&0&1 \\ 1&0&0&0 \end{pmatrix}$
:$C_\pi = \begin{pmatrix} 0&0&0&1 \\ 0&1&0&0 \\ 1&0&0&0 \\ 0&0&1&0 \end{pmatrix} \qquad \pi^{-1} = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 4 & 2 & 1 & 3 \end{pmatrix}$
The row-based correspondence takes the permutation $\pi$ to the matrix $R_\pi$ at the upper right. The first row of $R_\pi$ has its 1 in the third column because $\pi(1) = 3$. More generally, we have $R_\pi = (r_{ij})$ where $r_{ij} = 1$ when $j = \pi(i)$ and $r_{ij} = 0$ otherwise.
The column-based correspondence takes $\pi$ to the matrix $C_\pi$ at the lower left. The first column of $C_\pi$ has its 1 in the third row because $\pi(1) = 3$. More generally, we have $C_\pi = (c_{ij})$ where $c_{ij}$ is 1 when $i = \pi(j)$ and 0 otherwise. Since the two recipes differ only by swapping ''i'' with ''j'', the matrix $C_\pi$ is the transpose of $R_\pi$; and, since $R_\pi$ is a permutation matrix, we have $C_\pi = R_\pi^\mathsf{T} = R_\pi^{-1}$. Tracing the other two sides of the big square, we have $R_{\pi^{-1}} = C_\pi$ and $C_{\pi^{-1}} = R_\pi$.
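Both recipes, and the transpose relations between them, can be checked mechanically. A minimal sketch, assuming NumPy; `row_based` and `col_based` are our own helper names:

<syntaxhighlight lang="python">
import numpy as np

def row_based(pi):
    """R_pi: entry (i, j) is 1 when j = pi(i), 0-indexed."""
    n = len(pi)
    M = np.zeros((n, n), dtype=int)
    M[np.arange(n), pi] = 1
    return M

def col_based(pi):
    """C_pi: entry (i, j) is 1 when i = pi(j), 0-indexed."""
    n = len(pi)
    M = np.zeros((n, n), dtype=int)
    M[pi, np.arange(n)] = 1
    return M

pi = np.array([2, 1, 3, 0])               # the example permutation, 0-indexed
pi_inv = np.argsort(pi)                   # its inverse permutation

assert (col_based(pi) == row_based(pi).T).all()    # C_pi = R_pi^T
assert (row_based(pi_inv) == col_based(pi)).all()  # R_{pi^-1} = C_pi
assert (col_based(pi_inv) == row_based(pi)).all()  # C_{pi^-1} = R_pi
</syntaxhighlight>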
Permutation matrices permute rows or columns
Multiplying a matrix ''M'' by either $R_\pi$ or $C_\pi$ on either the left or the right will permute either the rows or columns of ''M'' by either $\pi$ or $\pi^{-1}$. The details are a bit tricky.
To begin with, when we permute the entries of a vector $(v_1, \ldots, v_n)$ by some permutation $\pi$, we move the $i$th entry $v_i$ of the input vector into the $\pi(i)$th slot of the output vector. Which entry then ends up in, say, the first slot of the output? Answer: the entry $v_i$ for which $\pi(i) = 1$, and hence $i = \pi^{-1}(1)$. Arguing similarly about each of the slots, we find that the output vector is
:$\big(v_{\pi^{-1}(1)}, v_{\pi^{-1}(2)}, \ldots, v_{\pi^{-1}(n)}\big),$
even though we are permuting by $\pi$, not by $\pi^{-1}$. Thus, in order to permute the entries by $\pi$, we must permute the indices by $\pi^{-1}$. (Permuting the entries by $\pi$ is sometimes called taking the ''alibi viewpoint'', while permuting the indices by $\pi^{-1}$ would take the ''alias viewpoint''.)
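The index bookkeeping is easy to get backwards, so here is a minimal sketch of the alibi/alias distinction, assuming NumPy:

<syntaxhighlight lang="python">
import numpy as np

v = np.array([10.0, 20.0, 30.0, 40.0])
pi = np.array([2, 1, 3, 0])     # alibi: entry i moves into slot pi(i), 0-indexed
pi_inv = np.argsort(pi)

out = np.empty_like(v)
out[pi] = v                     # move each entry i into slot pi(i)

# Slot j of the output holds the entry with index pi^-1(j)  (alias viewpoint):
assert (out == v[pi_inv]).all()
</syntaxhighlight>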
Now, suppose that we pre-multiply some $n$-row matrix $M = (m_{i,j})$ by the permutation matrix $C_\pi$. By the rule for matrix multiplication, the $(i,j)$ entry in the product $C_\pi M$ is
:$\sum_{k=1}^{n} c_{ik} m_{kj},$
where $c_{ik}$ is 0 except when $i = \pi(k)$, when it is 1. Thus, the only term in the sum that survives is the term in which $k = \pi^{-1}(i)$, and the sum reduces to $m_{\pi^{-1}(i),j}$. Since we have permuted the row index by $\pi^{-1}$, we have permuted the rows of ''M'' themselves by $\pi$.
A similar argument shows that post-multiplying an $n$-column matrix ''M'' by $R_\pi$ permutes its columns by $\pi$. The other two options are pre-multiplying by $R_\pi$ or post-multiplying by $C_\pi$, and they permute the rows or columns respectively by $\pi^{-1}$, instead of by $\pi$.
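All four multiplication options can be verified in a few lines; a minimal sketch, assuming NumPy:

<syntaxhighlight lang="python">
import numpy as np

pi = np.array([2, 1, 3, 0])
pi_inv = np.argsort(pi)
n = len(pi)

R = np.zeros((n, n), dtype=int)
R[np.arange(n), pi] = 1                # row-based R_pi
C = R.T                                # column-based C_pi

M = np.arange(n * n).reshape(n, n)

assert (C @ M == M[pi_inv, :]).all()   # C_pi M: rows permuted by pi
assert (M @ R == M[:, pi_inv]).all()   # M R_pi: columns permuted by pi
assert (R @ M == M[pi, :]).all()       # R_pi M: rows permuted by pi^-1
assert (M @ C == M[:, pi]).all()       # M C_pi: columns permuted by pi^-1
</syntaxhighlight>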
The transpose is also the inverse
A related argument proves that, as we claimed above, the transpose of any permutation matrix ''P'' also acts as its inverse, which implies that ''P'' is invertible. (Artin leaves that proof as an exercise, which we here solve.) If $P = (p_{ij})$, then the $(i,j)$ entry of its transpose $P^\mathsf{T}$ is $p_{ji}$. The $(i,j)$ entry of the product $P P^\mathsf{T}$ is then
:$\sum_{k=1}^{n} p_{ik} p_{jk}.$
Whenever $i \neq j$, the $k$th term in this sum is the product of two different entries in the $k$th column of ''P''; so all terms are 0, and the sum is 0. When $i = j$, we are summing the squares of the entries in the $i$th row of ''P'', so the sum is 1. The product $P P^\mathsf{T}$ is thus the identity matrix. A symmetric argument shows the same for $P^\mathsf{T} P$, implying that ''P'' is invertible with $P^{-1} = P^\mathsf{T}$.
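A quick numerical spot check of this argument, assuming NumPy:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
perm = rng.permutation(6)               # a random permutation of {0, ..., 5}
P = np.eye(6, dtype=int)[perm]          # its permutation matrix (shuffled rows of I)

I = np.eye(6, dtype=int)
assert (P @ P.T == I).all() and (P.T @ P == I).all()   # P^T is a two-sided inverse
</syntaxhighlight>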
Multiplying permutation matrices
Given two permutations of $n$ elements $\sigma$ and $\tau$, the product of the corresponding column-based permutation matrices $C_\sigma$ and $C_\tau$ is given, as you might expect, by
:$C_\sigma C_\tau = C_{\sigma\circ\tau},$
where the composed permutation $\sigma\circ\tau$ applies first $\tau$ and then $\sigma$, working from right to left:
:$(\sigma\circ\tau)(k) = \sigma(\tau(k)).$
This follows because pre-multiplying some matrix by $C_\tau$ and then pre-multiplying the resulting product by $C_\sigma$ gives the same result as pre-multiplying just once by the combined $C_{\sigma\circ\tau}$.
For the row-based matrices, there is a twist: The product of $R_\sigma$ and $R_\tau$ is given by
:$R_\sigma R_\tau = R_{\tau\circ\sigma},$
with $\tau$ applied before $\sigma$ in the composed permutation. This happens because we must post-multiply to avoid inversions under the row-based option, so we would post-multiply first by $R_\sigma$ and then by $R_\tau$.
Some people, when applying a function to an argument, write the function after the argument (postfix notation), rather than before it. When doing linear algebra, they work with linear spaces of row vectors, and they apply a linear map to an argument by using the map's matrix to post-multiply the argument's row vector. They often use a left-to-right composition operator, which we here denote using a semicolon; so the composition $\sigma;\tau$ is defined either by
:$(\sigma;\tau)(k) = \tau(\sigma(k))$
or, more elegantly, by
:$\sigma;\tau = \tau\circ\sigma,$
with $\sigma$ applied first. That notation gives us a simpler rule for multiplying row-based permutation matrices:
:$R_\sigma R_\tau = R_{\sigma;\tau}.$
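Both composition rules can be confirmed directly; a minimal sketch, assuming NumPy, with `compose` as our helper for the right-to-left composite:

<syntaxhighlight lang="python">
import numpy as np

def compose(s, t):
    """Right-to-left composite: (s o t)(k) = s(t(k)), with 0-indexed arrays."""
    return s[t]

def R(p):
    """Row-based matrix of p: entry (i, j) is 1 when j = p(i)."""
    n = len(p)
    M = np.zeros((n, n), dtype=int)
    M[np.arange(n), p] = 1
    return M

def C(p):
    """Column-based matrix of p, the transpose of the row-based one."""
    return R(p).T

sigma = np.array([1, 2, 0, 3])
tau = np.array([3, 1, 0, 2])

assert (C(sigma) @ C(tau) == C(compose(sigma, tau))).all()  # C_s C_t = C_{s o t}
assert (R(sigma) @ R(tau) == R(compose(tau, sigma))).all()  # R_s R_t = R_{t o s}
</syntaxhighlight>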
Matrix group
When $\pi$ is the identity permutation, which has $\pi(i) = i$ for all ''i'', both $C_\pi$ and $R_\pi$ are the identity matrix.
There are $n!$ permutation matrices, since there are $n!$ permutations and the map $C \colon \pi \mapsto C_\pi$ is a one-to-one correspondence between permutations and permutation matrices. (The map $R$ is another such correspondence.) By the formulas above, those $n \times n$ permutation matrices form a group of order $n!$ under matrix multiplication, with the identity matrix as its identity element, a group that we denote $\mathcal{P}_n$. The group $\mathcal{P}_n$ is a subgroup of the general linear group $GL_n(\mathbb{R})$ of invertible $n \times n$ matrices of real numbers. Indeed, for any field ''F'', the group $\mathcal{P}_n$ is also a subgroup of the group $GL_n(F)$, where the matrix entries belong to ''F''. (Every field contains 0 and 1 with $0 + 0 = 0$ and $1 \cdot 1 = 1$, and that's all we need to multiply permutation matrices. Different fields disagree about whether $1 + 1 = 0$, but that sum doesn't arise.)
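A brute-force illustration of the group structure for $n = 3$, assuming NumPy:

<syntaxhighlight lang="python">
import itertools
import numpy as np

n = 3
mats = [np.eye(n, dtype=int)[list(p)] for p in itertools.permutations(range(n))]
assert len(mats) == 6                   # n! = 3! permutation matrices

# Closure under matrix multiplication: each product is again a permutation matrix.
for A, B in itertools.product(mats, repeat=2):
    assert any((A @ B == M).all() for M in mats)
</syntaxhighlight>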
Let $S_n$ denote the symmetric group, or group of permutations, on $\{1, 2, \ldots, n\}$, where the group operation is the standard, right-to-left composition "$\circ$"; and let $S_n^{\text{op}}$ denote the opposite group, which uses the left-to-right composition "$;$". The map $C \colon S_n \to \mathcal{P}_n$ that takes $\pi$ to its column-based matrix $C_\pi$ is a faithful representation, and similarly for the map $R \colon S_n^{\text{op}} \to \mathcal{P}_n$ that takes $\pi$ to $R_\pi$.
Doubly stochastic matrices
Every permutation matrix is doubly stochastic. The set of all doubly stochastic matrices is called the Birkhoff polytope, and the permutation matrices play a special role in that polytope. The Birkhoff–von Neumann theorem says that every doubly stochastic real matrix is a convex combination of permutation matrices of the same order, with the permutation matrices being precisely the extreme points (the vertices) of the Birkhoff polytope. The Birkhoff polytope is thus the convex hull of the permutation matrices.
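The easy direction of the theorem, that any convex combination of permutation matrices is doubly stochastic, can be illustrated directly (a minimal sketch, assuming NumPy; the weights are arbitrary):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
perms = [rng.permutation(4) for _ in range(3)]
weights = np.array([0.5, 0.3, 0.2])     # non-negative and summing to 1

D = sum(w * np.eye(4)[p] for w, p in zip(weights, perms))

# Doubly stochastic: non-negative entries, every row and column summing to 1.
assert (D >= 0).all()
assert np.allclose(D.sum(axis=0), 1) and np.allclose(D.sum(axis=1), 1)
</syntaxhighlight>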
Linear-algebraic properties
Just as each permutation is associated with two permutation matrices, each permutation matrix is associated with two permutations, as we can see by relabeling the example in the big square above, starting with the matrix ''P'' at the upper right:
:$\rho = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 3 & 2 & 4 & 1 \end{pmatrix} \qquad P = R_\rho = C_\kappa = \begin{pmatrix} 0&0&1&0 \\ 0&1&0&0 \\ 0&0&0&1 \\ 1&0&0&0 \end{pmatrix} \qquad \kappa = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 4 & 2 & 1 & 3 \end{pmatrix}$
So we are here denoting the preimage of ''P'' under ''C'' as $\kappa = C^{-1}(P)$ and its preimage under ''R'' as $\rho = R^{-1}(P)$. We can then compute the linear-algebraic properties of ''P'' from some combinatorial properties that are shared by the two permutations $\kappa$ and $\rho = \kappa^{-1}$.
A point is fixed by $\kappa$ just when it is fixed by $\rho = \kappa^{-1}$, and the trace of ''P'' is the number of such shared fixed points. If the integer ''k'' is one of them, then the standard basis vector $e_k$ is an eigenvector of ''P''.
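A minimal sketch of the trace and fixed-point claims, assuming NumPy; the sample permutation fixes two points:

<syntaxhighlight lang="python">
import numpy as np

pi = np.array([2, 1, 3, 0, 4])            # fixes the points 1 and 4 (0-indexed)
P = np.eye(5, dtype=int)[pi]              # a permutation matrix for pi

fixed = np.flatnonzero(pi == np.arange(5))
assert np.trace(P) == len(fixed)          # the trace counts the fixed points

for k in fixed:                           # each fixed point yields a basis eigenvector
    e = np.eye(5, dtype=int)[:, k]
    assert (P @ e == e).all()             # P e_k = e_k, with eigenvalue 1
</syntaxhighlight>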
To calculate the complex eigenvalues of ''P'', write the permutation $\kappa$ as a composition of disjoint cycles, say $\kappa = c_1 c_2 \cdots c_t$. (Permutations of disjoint subsets commute, so it doesn't matter here whether we are composing right-to-left or left-to-right.) For $1 \le i \le t$, let the length of the cycle $c_i$ be $\ell_i$, and let $L_i$ be the set of complex solutions of $x^{\ell_i} = 1$, those solutions being the $\ell_i$th roots of unity. The multiset union of the $L_i$ is then the multiset of eigenvalues of ''P''. Since writing $\rho$ as a product of cycles would give the same number of cycles of the same lengths, analyzing $\rho$ would give the same result. The multiplicity of any eigenvalue ''v'' is the number of ''i'' for which $L_i$ contains ''v''. (Since any permutation matrix is normal and any normal matrix is diagonalizable over the complex numbers, the algebraic and geometric multiplicities of an eigenvalue ''v'' are the same.)
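For instance, a permutation with one 3-cycle and one 2-cycle contributes the third and the second roots of unity, so the characteristic polynomial of its matrix is $(x^3 - 1)(x^2 - 1)$. A minimal sketch, assuming NumPy:

<syntaxhighlight lang="python">
import numpy as np

pi = np.array([1, 2, 0, 4, 3])    # disjoint cycles (0 1 2) and (3 4), lengths 3 and 2
P = np.eye(5)[pi]

# Eigenvalues are the 3rd roots of unity together with the 2nd roots of unity, so the
# characteristic polynomial is (x^3 - 1)(x^2 - 1) = x^5 - x^3 - x^2 + 1.
assert np.allclose(np.poly(P), [1, 0, -1, -1, 0, 1])
</syntaxhighlight>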
From group theory we know that any permutation may be written as a composition of transpositions. Therefore, any permutation matrix factors as a product of row-switching elementary matrices, each of which has determinant −1. Thus, the determinant of the permutation matrix ''P'' is the sign of the permutation $\kappa$, which is also the sign of $\rho$.
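A minimal sketch checking the determinant against the sign, computed here from the inversion count, assuming NumPy:

<syntaxhighlight lang="python">
import itertools
import numpy as np

for p in itertools.permutations(range(4)):
    P = np.eye(4)[list(p)]
    # Sign of p: -1 raised to the number of inversions (pairs out of order).
    inversions = sum(p[i] > p[j] for i in range(4) for j in range(i + 1, 4))
    assert round(np.linalg.det(P)) == (-1) ** inversions   # det(P) = sign of p
</syntaxhighlight>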
Restricted forms
* Costas array, a permutation matrix in which the displacement vectors between the entries are all distinct
* ''n''-queens puzzle, a permutation matrix in which there is at most one entry in each diagonal and antidiagonal
See also
* Alternating sign matrix
* Exchange matrix
* Generalized permutation matrix
* Rook polynomial
* Permanent