Spectral radius
In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues. More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by \rho(\cdot).


Definition


Matrices

Let \lambda_1, \ldots, \lambda_n be the eigenvalues of a matrix A \in \mathbb{C}^{n \times n}. The spectral radius of A is defined as

:\rho(A) = \max \left\{ |\lambda_1|, \ldots, |\lambda_n| \right\}.

The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, \rho(A) \leqslant \|A\| for every natural matrix norm \|\cdot\|; and on the other hand, Gelfand's formula states that \rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k}. Both of these results are shown below. However, the spectral radius does not necessarily satisfy \|A\mathbf{v}\| \leqslant \rho(A)\|\mathbf{v}\| for arbitrary vectors \mathbf{v} \in \mathbb{C}^n. To see why, let r > 1 be arbitrary and consider the matrix

:C_r = \begin{pmatrix} 0 & r^{-1} \\ r & 0 \end{pmatrix}.

The characteristic polynomial of C_r is \lambda^2 - 1, so its eigenvalues are \{-1, 1\} and thus \rho(C_r) = 1. However, C_r \mathbf{e}_1 = r\mathbf{e}_2. As a result,

:\|C_r \mathbf{e}_1\| = r > 1 = \rho(C_r)\|\mathbf{e}_1\|.

As an illustration of Gelfand's formula, note that \|C_r^k\|^{1/k} \to 1 as k \to \infty, since C_r^k = I if k is even and C_r^k = C_r if k is odd.

A special case in which \|A\mathbf{v}\| \leqslant \rho(A)\|\mathbf{v}\| for all \mathbf{v} \in \mathbb{C}^n is when A is a Hermitian matrix and \|\cdot\| is the Euclidean norm. Any Hermitian matrix is diagonalizable by a unitary matrix as A = U^* D U with D diagonal, and unitary matrices preserve vector length. As a result,

:\|A\mathbf{v}\| = \|U^* D U\mathbf{v}\| = \|D U\mathbf{v}\| \leqslant \rho(A)\|U\mathbf{v}\| = \rho(A)\|\mathbf{v}\|.
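The C_r example can be checked numerically. A NumPy sketch (the value r = 3 is an arbitrary choice):

```python
import numpy as np

# C_r has spectral radius 1, yet it stretches the basis vector e1 by r > 1.
r = 3.0
C = np.array([[0.0, 1.0 / r],
              [r,   0.0]])

rho = max(abs(np.linalg.eigvals(C)))   # spectral radius: max |eigenvalue|
e1 = np.array([1.0, 0.0])

assert np.isclose(rho, 1.0)                    # rho(C_r) = 1
assert np.isclose(np.linalg.norm(C @ e1), r)   # ||C_r e1|| = r > rho(C_r) ||e1||

# Gelfand's formula in action: C_r^2 = I, so ||C_r^k||^(1/k) -> 1.
assert np.allclose(np.linalg.matrix_power(C, 2), np.eye(2))
```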


Bounded linear operators

In the context of a bounded linear operator A on a Banach space, the eigenvalues need to be replaced with the elements of the spectrum of the operator, i.e. the values \lambda for which A - \lambda I is not bijective. We denote the spectrum by

:\sigma(A) = \left\{ \lambda \in \mathbb{C} : A - \lambda I \text{ is not bijective} \right\}.

The spectral radius is then defined as the supremum of the magnitudes of the elements of the spectrum:

:\rho(A) = \sup_{\lambda \in \sigma(A)} |\lambda|.

Gelfand's formula, also known as the spectral radius formula, also holds for bounded linear operators: letting \|\cdot\| denote the operator norm, we have

:\rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k} = \inf_{k \geq 1} \|A^k\|^{1/k}.

A bounded operator (on a complex Hilbert space) is called a spectraloid operator if its spectral radius coincides with its numerical radius. An example of such an operator is a normal operator.


Graphs

The spectral radius of a finite graph is defined to be the spectral radius of its adjacency matrix.

This definition extends to the case of infinite graphs with bounded degrees of vertices (i.e. there exists some real number C such that the degree of every vertex of the graph is smaller than C). In this case, for the graph G define

:\ell^2(G) = \left\{ f : V(G) \to \mathbb{R} \ : \ \sum_{v \in V(G)} |f(v)|^2 < \infty \right\}.

Let \gamma be the adjacency operator of G:

:\begin{cases} \gamma : \ell^2(G) \to \ell^2(G) \\ (\gamma f)(v) = \sum_{(u,v) \in E(G)} f(u) \end{cases}

The spectral radius of G is defined to be the spectral radius of the bounded linear operator \gamma.
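For finite graphs the definition is concrete enough to verify directly; for instance, a d-regular graph has spectral radius exactly d. A NumPy sketch on a 5-cycle (every vertex has degree 2), assuming nothing beyond the adjacency-matrix definition:

```python
import numpy as np

# Adjacency matrix of the cycle C_5: vertex v is joined to v-1 and v+1 (mod 5).
n = 5
A = np.zeros((n, n))
for v in range(n):
    A[v, (v + 1) % n] = A[v, (v - 1) % n] = 1

# Spectral radius of the graph = spectral radius of its adjacency matrix.
rho = max(abs(np.linalg.eigvals(A)))
assert np.isclose(rho, 2.0)   # 2-regular graph => spectral radius 2
```

The eigenvalues of a cycle are 2cos(2πj/n), so the maximum 2 is attained at j = 0.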


Upper bounds


Upper bounds on the spectral radius of a matrix

The following proposition gives simple yet useful upper bounds on the spectral radius of a matrix.

Proposition. Let A \in \mathbb{C}^{n \times n} with spectral radius \rho(A) and a consistent matrix norm \|\cdot\|. Then for each integer k \geqslant 1:

:\rho(A) \leq \|A^k\|^{1/k}.

Proof. Let (\mathbf{v}, \lambda) be an eigenvector-eigenvalue pair for the matrix A. By the sub-multiplicativity of the matrix norm, we get:

:|\lambda|^k \|\mathbf{v}\| = \|\lambda^k \mathbf{v}\| = \|A^k \mathbf{v}\| \leq \|A^k\| \cdot \|\mathbf{v}\|.

Since \mathbf{v} \neq \mathbf{0}, we have

:|\lambda|^k \leq \|A^k\|

and therefore

:\rho(A) \leq \|A^k\|^{1/k},

concluding the proof.
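The proposition is easy to observe numerically: for a consistent norm such as the induced 2-norm, every \|A^k\|^{1/k} sits above \rho(A), and the bound tightens as k grows. A NumPy sketch with an arbitrarily chosen test matrix:

```python
import numpy as np

# Upper-triangular test matrix: eigenvalues are the diagonal entries 0.9, 0.3.
A = np.array([[0.9, 0.5],
              [0.0, 0.3]])
rho = max(abs(np.linalg.eigvals(A)))   # = 0.9

# ||A^k||^(1/k) in the induced 2-norm for growing k.
bounds = [np.linalg.norm(np.linalg.matrix_power(A, k), 2) ** (1.0 / k)
          for k in (1, 2, 4, 8, 16)]

assert all(rho <= b + 1e-9 for b in bounds)   # each bound dominates rho(A)
assert bounds[-1] < bounds[0]                 # and the bound improves with k
```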


Upper bounds for spectral radius of a graph

There are many upper bounds for the spectral radius of a graph in terms of its number n of vertices and its number m of edges. For instance, a classical bound due to Hong states that if G has no isolated vertices, then

:\rho(G) \leq \sqrt{2m - n + 1},

with equality for complete graphs.
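As one concrete instance, Hong's bound \rho(G) \leq \sqrt{2m - n + 1} (valid for graphs with no isolated vertices) can be checked numerically; it is tight for complete graphs. A NumPy sketch on K_5:

```python
import numpy as np
from itertools import combinations

# Complete graph K_5: n = 5 vertices, m = 10 edges, 4-regular.
n = 5
edges = list(combinations(range(n), 2))
m = len(edges)

A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1

rho = max(abs(np.linalg.eigvals(A)))          # K_5 is 4-regular => rho = 4
assert rho <= np.sqrt(2 * m - n + 1) + 1e-9   # sqrt(20 - 5 + 1) = 4: tight
```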


Power sequence

The spectral radius is closely related to the behavior of the convergence of the power sequence of a matrix, as shown by the following theorem.

Theorem. Let A \in \mathbb{C}^{n \times n} with spectral radius \rho(A). Then \rho(A) < 1 if and only if

:\lim_{k \to \infty} A^k = 0.

On the other hand, if \rho(A) > 1, then \lim_{k \to \infty} \|A^k\| = \infty. The statement holds for any choice of matrix norm on \mathbb{C}^{n \times n}.

Proof. Assume that A^k goes to zero as k goes to infinity. We will show that \rho(A) < 1. Let (\mathbf{v}, \lambda) be an eigenvector-eigenvalue pair for A. Since A^k \mathbf{v} = \lambda^k \mathbf{v}, we have

:0 = \left( \lim_{k \to \infty} A^k \right) \mathbf{v} = \lim_{k \to \infty} \left( A^k \mathbf{v} \right) = \lim_{k \to \infty} \lambda^k \mathbf{v} = \mathbf{v} \lim_{k \to \infty} \lambda^k.

Since \mathbf{v} \neq \mathbf{0} by hypothesis, we must have

:\lim_{k \to \infty} \lambda^k = 0,

which implies |\lambda| < 1. Since this must be true for any eigenvalue \lambda, we can conclude that \rho(A) < 1.

Now, assume the spectral radius of A is less than 1. From the Jordan normal form theorem, we know that for all A \in \mathbb{C}^{n \times n}, there exist V, J \in \mathbb{C}^{n \times n} with V non-singular and J block diagonal such that

:A = V J V^{-1}

with

:J = \begin{pmatrix} J_{m_1}(\lambda_1) & 0 & \cdots & 0 \\ 0 & J_{m_2}(\lambda_2) & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & \cdots & 0 & J_{m_s}(\lambda_s) \end{pmatrix}

where

:J_{m_i}(\lambda_i) = \begin{pmatrix} \lambda_i & 1 & 0 & \cdots & 0 \\ 0 & \lambda_i & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i & 1 \\ 0 & 0 & \cdots & 0 & \lambda_i \end{pmatrix} \in \mathbb{C}^{m_i \times m_i}, \qquad 1 \leq i \leq s.

It is easy to see that

:A^k = V J^k V^{-1}

and, since J is block diagonal,

:J^k = \begin{pmatrix} J_{m_1}^k(\lambda_1) & 0 & \cdots & 0 \\ 0 & J_{m_2}^k(\lambda_2) & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & \cdots & 0 & J_{m_s}^k(\lambda_s) \end{pmatrix}.

Now, a standard result on the k-power of an m_i \times m_i Jordan block states that, for k \geq m_i - 1:

:J_{m_i}^k(\lambda_i) = \begin{pmatrix} \lambda_i^k & \binom{k}{1}\lambda_i^{k-1} & \binom{k}{2}\lambda_i^{k-2} & \cdots & \binom{k}{m_i-1}\lambda_i^{k-m_i+1} \\ 0 & \lambda_i^k & \binom{k}{1}\lambda_i^{k-1} & \cdots & \binom{k}{m_i-2}\lambda_i^{k-m_i+2} \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i^k & \binom{k}{1}\lambda_i^{k-1} \\ 0 & 0 & \cdots & 0 & \lambda_i^k \end{pmatrix}.

Thus, if \rho(A) < 1 then |\lambda_i| < 1 for all i. Since each entry above is a polynomial in k times a power of \lambda_i, for all i we have:

:\lim_{k \to \infty} J_{m_i}^k = 0,

which implies

:\lim_{k \to \infty} J^k = 0.

Therefore,

:\lim_{k \to \infty} A^k = \lim_{k \to \infty} V J^k V^{-1} = V \left( \lim_{k \to \infty} J^k \right) V^{-1} = 0.

On the other hand, if \rho(A) > 1, there is at least one element in J^k (a diagonal entry \lambda_i^k with |\lambda_i| > 1) that does not remain bounded as k increases, thereby proving the second part of the statement.
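Both regimes of the theorem can be illustrated directly. A NumPy sketch with arbitrarily chosen matrices, one with spectral radius below 1 and one above:

```python
import numpy as np

# A Jordan-type block with rho = 0.5 < 1: powers decay despite the off-diagonal 1.
shrink = np.array([[0.5, 1.0],
                   [0.0, 0.5]])

# A diagonal matrix with rho = 1.1 > 1: powers blow up.
grow = np.array([[1.1, 0.0],
                 [0.0, 0.2]])

assert np.allclose(np.linalg.matrix_power(shrink, 200), 0)        # A^k -> 0
assert np.linalg.norm(np.linalg.matrix_power(grow, 200)) > 1e6    # ||A^k|| -> inf
```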


Gelfand's formula

Gelfand's formula, named after Israel Gelfand, gives the spectral radius as a limit of matrix norms.


Theorem

For any matrix norm \|\cdot\| we have

:\rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k}.

(The formula holds for any Banach algebra; see Lemma IX.1.8 in the references.) Moreover, in the case of a consistent matrix norm, \|A^k\|^{1/k} approaches \rho(A) from above (indeed, in that case \rho(A) \leq \|A^k\|^{1/k} for all k).


Proof

For any \varepsilon > 0, let us define the two following matrices:

:A_{\pm} = \frac{1}{\rho(A) \pm \varepsilon} A.

Thus,

:\rho(A_{\pm}) = \frac{\rho(A)}{\rho(A) \pm \varepsilon}, \qquad \rho(A_+) < 1 < \rho(A_-).

We start by applying the previous theorem on limits of power sequences to A_+:

:\lim_{k \to \infty} A_+^k = 0.

This shows the existence of N_+ \in \mathbb{N} such that, for all k \geq N_+,

:\|A_+^k\| < 1.

Therefore,

:\|A^k\|^{1/k} < \rho(A) + \varepsilon.

Similarly, the theorem on power sequences implies that \|A_-^k\| is not bounded and that there exists N_- \in \mathbb{N} such that, for all k \geq N_-,

:\|A_-^k\| > 1.

Therefore,

:\|A^k\|^{1/k} > \rho(A) - \varepsilon.

Let N = \max(N_+, N_-). Then

:\forall \varepsilon > 0 \quad \exists N \in \mathbb{N} \quad \forall k \geq N \quad \rho(A) - \varepsilon < \|A^k\|^{1/k} < \rho(A) + \varepsilon,

that is,

:\lim_{k \to \infty} \|A^k\|^{1/k} = \rho(A).

This concludes the proof.


Corollary

Gelfand's formula yields a bound on the spectral radius of a product of commuting matrices: if A_1, \ldots, A_n are matrices that all commute, then :\rho(A_1 \cdots A_n) \leq \rho(A_1) \cdots \rho(A_n).
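The corollary, and the failure of the bound without commutativity, can both be seen numerically. A NumPy sketch (the matrices are arbitrary choices; B = A + I commutes with A by construction):

```python
import numpy as np

rho = lambda M: max(abs(np.linalg.eigvals(M)))   # spectral radius

# Commuting pair: rho(AB) <= rho(A) rho(B).
A = np.array([[0.0, 2.0],
              [0.5, 0.0]])
B = A + np.eye(2)
assert np.allclose(A @ B, B @ A)                 # they commute
assert rho(A @ B) <= rho(A) * rho(B) + 1e-9      # 2 <= 1 * 2

# Without commutativity the bound can fail: two nilpotent matrices
# have spectral radius 0, yet their product does not.
N1 = np.array([[0.0, 1.0],
               [0.0, 0.0]])
N2 = N1.T
assert rho(N1) < 1e-9 and rho(N2) < 1e-9
assert np.isclose(rho(N1 @ N2), 1.0)             # rho(N1 N2) = 1 > 0 * 0
```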


Numerical example

Consider the matrix

:A = \begin{pmatrix} 9 & -1 & 2 \\ -2 & 8 & 4 \\ 1 & 1 & 8 \end{pmatrix}

whose eigenvalues are 5, 10, 10; by definition, \rho(A) = 10. The values of \|A^k\|^{1/k} for the four most used norms can be tabulated against several increasing values of k (note that, due to the particular form of this matrix, \|\cdot\|_1 = \|\cdot\|_\infty).
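The values \|A^k\|^{1/k} for this matrix can be recomputed with a short NumPy sketch (using the 1-, 2-, \infty-, and Frobenius norms); every column converges to \rho(A) = 10 from above:

```python
import numpy as np

A = np.array([[ 9.0, -1.0, 2.0],
              [-2.0,  8.0, 4.0],
              [ 1.0,  1.0, 8.0]])
rho = max(abs(np.linalg.eigvals(A)))   # = 10.0

for k in (1, 2, 5, 10, 50):
    Ak = np.linalg.matrix_power(A, k)
    # ||A^k||^(1/k) under the 1-, 2-, infinity-, and Frobenius norms.
    vals = [np.linalg.norm(Ak, p) ** (1.0 / k) for p in (1, 2, np.inf, 'fro')]
    print(k, [round(v, 4) for v in vals])
```

All four norms here are sub-multiplicative, so each printed value is an upper bound on \rho(A), consistent with the corollary of Gelfand's formula above.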


Notes and references




See also

* Spectral gap
* Joint spectral radius, a generalization of the spectral radius to sets of matrices
* Spectrum of a matrix
* Spectral abscissa