In mathematics, particularly matrix theory, a band matrix or banded matrix is a sparse matrix whose non-zero entries are confined to a diagonal ''band'', comprising the main diagonal and zero or more diagonals on either side.




Bandwidth

Formally, consider an ''n''×''n'' matrix ''A''=(''a''''i,j''). If all matrix elements are zero outside a diagonally bordered band whose range is determined by constants ''k''1 and ''k''2:

:a_{i,j}=0 \quad\mbox{if}\quad j<i-k_1 \quad\mbox{or}\quad j>i+k_2; \quad k_1, k_2 \ge 0,

then the quantities ''k''1 and ''k''2 are called the lower bandwidth and upper bandwidth, respectively. The bandwidth of the matrix is the maximum of ''k''1 and ''k''2; in other words, it is the number ''k'' such that a_{i,j}=0 if |i-j| > k.
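As a minimal sketch of this definition (assuming NumPy is available; the helper name `bandwidths` is ours, for illustration only), the lower and upper bandwidths can be read off the nonzero pattern directly:

```python
import numpy as np

def bandwidths(A):
    """Return (k1, k2), the lower and upper bandwidths of A:
    the smallest values such that A[i, j] == 0 whenever
    j < i - k1 or j > i + k2."""
    i, j = np.nonzero(A)
    if i.size == 0:
        return 0, 0
    k1 = max(int(np.max(i - j)), 0)   # deepest nonzero subdiagonal
    k2 = max(int(np.max(j - i)), 0)   # highest nonzero superdiagonal
    return k1, k2

A = np.array([[1, 2, 0],
              [3, 4, 5],
              [0, 6, 7]])
print(bandwidths(A))        # (1, 1): a tridiagonal matrix
print(max(bandwidths(A)))   # 1, the bandwidth
```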


Examples

*A band matrix with ''k''1 = ''k''2 = 0 is a diagonal matrix, with bandwidth 0.
*A band matrix with ''k''1 = ''k''2 = 1 is a tridiagonal matrix, with bandwidth 1.
*For ''k''1 = ''k''2 = 2 one has a pentadiagonal matrix, and so on.
*Triangular matrices:
**For ''k''1 = 0, ''k''2 = ''n''−1, one obtains the definition of an upper triangular matrix;
**similarly, for ''k''1 = ''n''−1, ''k''2 = 0 one obtains a lower triangular matrix.
*Upper and lower Hessenberg matrices
*Toeplitz matrices when bandwidth is limited
*Block diagonal matrices
*Shift matrices and shear matrices
*Matrices in Jordan normal form
*A skyline matrix, also called a "variable band matrix": a generalization of the band matrix
*The inverses of Lehmer matrices are constant tridiagonal matrices, and are thus band matrices.
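For a concrete sketch of the first few cases in this list (assuming NumPy; the variable names are ours), the matrices can be constructed and their bandwidths checked directly:

```python
import numpy as np

n = 5
diagonal         = np.diag(np.ones(n))                                       # k1 = k2 = 0
tridiagonal      = sum(np.diag(np.ones(n - abs(k)), k) for k in (-1, 0, 1))  # k1 = k2 = 1
pentadiagonal    = sum(np.diag(np.ones(n - abs(k)), k) for k in range(-2, 3))  # k1 = k2 = 2
upper_triangular = np.triu(np.ones((n, n)))                                  # k1 = 0, k2 = n - 1

for M in (diagonal, tridiagonal, pentadiagonal, upper_triangular):
    i, j = np.nonzero(M)
    print(max(int(np.max(i - j)), 0), max(int(np.max(j - i)), 0))
# prints: 0 0, 1 1, 2 2, 0 4
```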


Applications

In numerical analysis, matrices from finite element or finite difference problems are often banded. Such matrices can be viewed as descriptions of the coupling between the problem variables; the banded property corresponds to the fact that variables are not coupled over arbitrarily large distances. Such matrices can be further divided; for instance, banded matrices exist where every element in the band is nonzero.

Problems in higher dimensions also lead to banded matrices, in which case the band itself also tends to be sparse. For instance, a partial differential equation on a square domain (using central differences) will yield a matrix with a bandwidth equal to the square root of the matrix dimension, but inside the band only 5 diagonals are nonzero. Unfortunately, applying Gaussian elimination (or equivalently an LU decomposition) to such a matrix results in the band being filled in by many non-zero elements.
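The fill-in phenomenon can be seen in a short sketch (assuming NumPy and SciPy; the grid size m = 10 and the zero threshold 1e-12 are arbitrary illustrative choices): build the standard 5-point central-difference Laplacian on an m-by-m grid, factor it, and count nonzeros.

```python
import numpy as np
from scipy.linalg import lu

# 5-point Laplacian on an m-by-m grid: an (m^2)-by-(m^2) matrix
# with bandwidth m but only 5 nonzero diagonals.
m = 10
I = np.eye(m)
T = 4 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)
A = np.kron(I, T) - np.kron(np.eye(m, k=1) + np.eye(m, k=-1), I)

P, L, U = lu(A)   # LU factorization with partial pivoting
nnz = lambda M: np.count_nonzero(np.abs(M) > 1e-12)
print("nonzeros in A:      ", nnz(A))
print("nonzeros in L and U:", nnz(L) + nnz(U))  # much larger: the band fills in
```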


Band storage

Band matrices are usually stored by storing the diagonals in the band; the rest is implicitly zero. For example, a tridiagonal matrix has bandwidth 1. The 6-by-6 matrix

:\begin{bmatrix}
B_{11} & B_{12} & 0 & \cdots & \cdots & 0 \\
B_{21} & B_{22} & B_{23} & \ddots & \ddots & \vdots \\
0 & B_{32} & B_{33} & B_{34} & \ddots & \vdots \\
\vdots & \ddots & B_{43} & B_{44} & B_{45} & 0 \\
\vdots & \ddots & \ddots & B_{54} & B_{55} & B_{56} \\
0 & \cdots & \cdots & 0 & B_{65} & B_{66}
\end{bmatrix}

is stored as the 6-by-3 matrix

:\begin{bmatrix}
0 & B_{11} & B_{12} \\
B_{21} & B_{22} & B_{23} \\
B_{32} & B_{33} & B_{34} \\
B_{43} & B_{44} & B_{45} \\
B_{54} & B_{55} & B_{56} \\
B_{65} & B_{66} & 0
\end{bmatrix}.

A further saving is possible when the matrix is symmetric. For example, consider a symmetric 6-by-6 matrix with an upper bandwidth of 2:

:\begin{bmatrix}
A_{11} & A_{12} & A_{13} & 0 & \cdots & 0 \\
 & A_{22} & A_{23} & A_{24} & \ddots & \vdots \\
 & & A_{33} & A_{34} & A_{35} & 0 \\
 & & & A_{44} & A_{45} & A_{46} \\
 & \mathrm{sym} & & & A_{55} & A_{56} \\
 & & & & & A_{66}
\end{bmatrix}.

This matrix is stored as the 6-by-3 matrix:

:\begin{bmatrix}
A_{11} & A_{12} & A_{13} \\
A_{22} & A_{23} & A_{24} \\
A_{33} & A_{34} & A_{35} \\
A_{44} & A_{45} & A_{46} \\
A_{55} & A_{56} & 0 \\
A_{66} & 0 & 0
\end{bmatrix}.
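A sketch of this packing (assuming NumPy; `to_band_storage` is our illustrative name, and this is the row-oriented layout used in the example above, which differs from LAPACK's column-oriented band storage):

```python
import numpy as np

def to_band_storage(A, k1, k2):
    """Pack a banded matrix A into n rows and k1 + k2 + 1 columns:
    column d holds diagonal d - k1, so column 0 is the lowest
    subdiagonal and column k1 the main diagonal.  Entries that fall
    outside the matrix stay zero, as in the padded corners above."""
    n = A.shape[0]
    B = np.zeros((n, k1 + k2 + 1), dtype=A.dtype)
    for i in range(n):
        for d in range(k1 + k2 + 1):
            j = i + d - k1
            if 0 <= j < n:
                B[i, d] = A[i, j]
    return B

# The 6-by-6 tridiagonal example above (k1 = k2 = 1) packs to 6-by-3:
T = np.diag(np.arange(1.0, 7.0)) + np.diag(np.ones(5), 1) + np.diag(np.ones(5), -1)
print(to_band_storage(T, 1, 1))
```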


Band form of sparse matrices

From a computational point of view, working with band matrices is always preferable to working with similarly dimensioned square matrices. A band matrix can be likened in complexity to a rectangular matrix whose row dimension is equal to the bandwidth of the band matrix. Thus the work involved in performing operations such as multiplication falls significantly, often leading to huge savings in terms of calculation time and complexity.

As sparse matrices lend themselves to more efficient computation than dense matrices, and to more efficient utilization of computer storage, there has been much research focused on finding ways to minimise the bandwidth (or directly minimise the fill-in) by applying permutations to the matrix, or other such equivalence or similarity transformations.

The Cuthill–McKee algorithm can be used to reduce the bandwidth of a sparse symmetric matrix. There are, however, matrices for which the reverse Cuthill–McKee algorithm performs better. There are many other methods in use.

The problem of finding a representation of a matrix with minimal bandwidth by means of permutations of rows and columns is NP-hard.
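As an illustrative sketch (assuming SciPy, whose scipy.sparse.csgraph.reverse_cuthill_mckee implements the reverse algorithm; the scrambled-tridiagonal test matrix is our own construction), a random symmetric permutation of a tridiagonal matrix can be reordered back to a narrow band:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

def bandwidth(A):
    i, j = A.nonzero()
    return int(np.max(np.abs(i - j))) if len(i) else 0

rng = np.random.default_rng(0)
n = 12
# Start from a symmetric tridiagonal matrix (bandwidth 1) ...
T = (np.diag(2.0 * np.ones(n))
     + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1))
# ... scramble it with a random symmetric permutation ...
p = rng.permutation(n)
A = csr_matrix(T[np.ix_(p, p)])
# ... and reorder with reverse Cuthill-McKee.
perm = reverse_cuthill_mckee(A, symmetric_mode=True)
B = A[perm, :][:, perm]
print(bandwidth(A), bandwidth(B))   # large vs. small (often back to 1)
```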


See also

* Diagonal matrix
* Graph bandwidth



External links


Information pertaining to LAPACK and band matrices

