
Band matrix

From Wikipedia, the free encyclopedia
Matrix with non-zero elements only in a diagonal band

In mathematics, particularly matrix theory, a band matrix or banded matrix is a sparse matrix whose non-zero entries are confined to a diagonal band, comprising the main diagonal and zero or more diagonals on either side.

Bandwidth

Formally, consider an n×n matrix A = (ai,j). If all matrix elements are zero outside a diagonally bordered band whose range is determined by constants k1 and k2:

{\displaystyle a_{i,j}=0\quad {\mbox{if}}\quad j<i-k_{1}\quad {\mbox{ or }}\quad j>i+k_{2};\quad k_{1},k_{2}\geq 0,}

then the quantities k1 and k2 are called the lower bandwidth and upper bandwidth, respectively.[1] The bandwidth of the matrix is the maximum of k1 and k2; in other words, it is the number k such that {\displaystyle a_{i,j}=0} if {\displaystyle |i-j|>k}.[2]
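As an illustration, both bandwidths can be read off directly from the definition by scanning the nonzero entries. A minimal sketch, assuming NumPy (the function name `bandwidths` is illustrative, not a library routine):

```python
import numpy as np

def bandwidths(A):
    """Lower and upper bandwidth of A: the largest k1 with a nonzero entry
    on the k1-th subdiagonal, and the largest k2 on the k2-th superdiagonal."""
    i, j = np.nonzero(A)
    k1 = max(int((i - j).max()), 0)   # below the main diagonal
    k2 = max(int((j - i).max()), 0)   # above the main diagonal
    return k1, k2

A = np.array([[1, 2, 0, 0],
              [3, 4, 5, 0],
              [0, 6, 7, 8],
              [0, 0, 9, 1]])
print(bandwidths(A))           # (1, 1): a tridiagonal matrix
print(bandwidths(np.triu(A)))  # (0, 1): upper bidiagonal
```

The bandwidth of the matrix, in the sense of the single number k above, is then simply `max(bandwidths(A))`.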


Applications


In numerical analysis, matrices from finite element or finite difference problems are often banded. Such matrices can be viewed as descriptions of the coupling between the problem variables; the banded property corresponds to the fact that variables are not coupled over arbitrarily large distances. Such matrices can be further divided – for instance, banded matrices exist where every element in the band is nonzero.

Problems in higher dimensions also lead to banded matrices, in which case the band itself also tends to be sparse. For instance, a partial differential equation on a square domain (using central differences) will yield a matrix with a bandwidth equal to the square root of the matrix dimension, but inside the band only 5 diagonals are nonzero. Unfortunately, applying Gaussian elimination (or equivalently an LU decomposition) to such a matrix results in the band being filled in by many non-zero elements.
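The square-root relationship can be checked on a small example. The sketch below (NumPy assumed) builds the standard 5-point Laplacian on an N-by-N grid with Kronecker products; the resulting matrix has n = N² rows, bandwidth N = √n, and only five nonzero diagonals inside the band:

```python
import numpy as np

N = 4                                                  # grid side; n = N**2 unknowns
T = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)   # 1-D central second difference
A = np.kron(np.eye(N), T) + np.kron(T, np.eye(N))      # 2-D 5-point Laplacian, 16-by-16

i, j = np.nonzero(A)
print(int(np.max(np.abs(i - j))))   # bandwidth 4 = sqrt(16)
print(len(set((i - j).tolist())))   # 5 nonzero diagonals inside the band
```

The nonzero diagonals sit at offsets 0, ±1 (coupling along one grid direction) and ±N (coupling along the other), which is why the band itself is sparse.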

Band storage


Band matrices are usually stored by storing the diagonals in the band; the rest is implicitly zero.

For example, a tridiagonal matrix has bandwidth 1. The 6-by-6 matrix

{\displaystyle {\begin{bmatrix}
B_{11}&B_{12}&0&\cdots &\cdots &0\\
B_{21}&B_{22}&B_{23}&\ddots &\ddots &\vdots \\
0&B_{32}&B_{33}&B_{34}&\ddots &\vdots \\
\vdots &\ddots &B_{43}&B_{44}&B_{45}&0\\
\vdots &\ddots &\ddots &B_{54}&B_{55}&B_{56}\\
0&\cdots &\cdots &0&B_{65}&B_{66}
\end{bmatrix}}}

is stored as the 6-by-3 matrix

{\displaystyle {\begin{bmatrix}
0&B_{11}&B_{12}\\
B_{21}&B_{22}&B_{23}\\
B_{32}&B_{33}&B_{34}\\
B_{43}&B_{44}&B_{45}\\
B_{54}&B_{55}&B_{56}\\
B_{65}&B_{66}&0
\end{bmatrix}}.}
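A packing routine matching this layout can be sketched as follows (NumPy assumed; the function name and the row-oriented layout, with row i holding A[i, i−k1] through A[i, i+k2], follow the example above rather than any library's fixed format):

```python
import numpy as np

def to_band_storage(A, k1, k2):
    """Pack an n-by-n banded matrix into n-by-(k1+k2+1) form: row i holds
    A[i, i-k1] ... A[i, i+k2], zero-padded where the band leaves the matrix."""
    n = A.shape[0]
    B = np.zeros((n, k1 + k2 + 1), dtype=A.dtype)
    for i in range(n):
        for d in range(-k1, k2 + 1):
            if 0 <= i + d < n:
                B[i, d + k1] = A[i, i + d]
    return B

# A 6-by-6 tridiagonal matrix packs into 6-by-3, as in the example above
A = np.diag(np.arange(1.0, 7.0)) + np.diag(np.ones(5), 1) + np.diag(np.ones(5), -1)
print(to_band_storage(A, 1, 1).shape)   # (6, 3)
```

Note that library formats differ: LAPACK's banded storage, for instance, is column-oriented (each diagonal becomes a row of the packed array), so the packed array is transposed relative to this sketch.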

A further saving is possible when the matrix is symmetric. For example, consider a symmetric 6-by-6 matrix with an upper bandwidth of 2:

{\displaystyle {\begin{bmatrix}
A_{11}&A_{12}&A_{13}&0&\cdots &0\\
&A_{22}&A_{23}&A_{24}&\ddots &\vdots \\
&&A_{33}&A_{34}&A_{35}&0\\
&&&A_{44}&A_{45}&A_{46}\\
&sym&&&A_{55}&A_{56}\\
&&&&&A_{66}
\end{bmatrix}}.}

This matrix is stored as the 6-by-3 matrix:

{\displaystyle {\begin{bmatrix}
A_{11}&A_{12}&A_{13}\\
A_{22}&A_{23}&A_{24}\\
A_{33}&A_{34}&A_{35}\\
A_{44}&A_{45}&A_{46}\\
A_{55}&A_{56}&0\\
A_{66}&0&0
\end{bmatrix}}.}
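For the symmetric case, only the main diagonal and the k superdiagonals need to be kept, since the subdiagonals are recoverable by symmetry. A sketch in the same spirit (NumPy assumed, illustrative function name):

```python
import numpy as np

def to_symmetric_band_storage(A, k):
    """Keep the main diagonal and first k superdiagonals of a symmetric A;
    the subdiagonals follow from symmetry and are not stored."""
    n = A.shape[0]
    B = np.zeros((n, k + 1), dtype=A.dtype)
    for i in range(n):
        for d in range(min(k, n - 1 - i) + 1):
            B[i, d] = A[i, i + d]
    return B

# Symmetric 6-by-6 with upper bandwidth 2 packs into 6-by-3; trailing rows
# are zero-padded where the superdiagonals run off the matrix
A = (np.diag(np.arange(1.0, 7.0))
     + np.diag(np.ones(5), 1) + np.diag(np.ones(5), -1)
     + np.diag(np.full(4, 2.0), 2) + np.diag(np.full(4, 2.0), -2))
print(to_symmetric_band_storage(A, 2).shape)   # (6, 3)
```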

Band form of sparse matrices


From a computational point of view, working with band matrices is always preferable to working with similarly dimensioned square matrices. A band matrix can be likened in complexity to a rectangular matrix whose row dimension is equal to the bandwidth of the band matrix. Thus the work involved in performing operations such as multiplication falls significantly, often leading to huge savings in terms of calculation time and complexity.
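For example, solving a tridiagonal system through a banded solver takes O(n) work for fixed bandwidth, versus O(n³) for a dense solve. A sketch using SciPy's `solve_banded` (note that it expects its own diagonal-ordered storage, ab[u + i − j, j] = A[i, j], not the row layout shown above):

```python
import numpy as np
from scipy.linalg import solve_banded

n = 6
A = np.diag(np.full(n, 4.0)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)

# Diagonal-ordered form required by solve_banded: ab[u + i - j, j] = A[i, j]
ab = np.zeros((3, n))
ab[0, 1:]  = np.diag(A, 1)    # superdiagonal
ab[1, :]   = np.diag(A)       # main diagonal
ab[2, :-1] = np.diag(A, -1)   # subdiagonal

b = np.ones(n)
x = solve_banded((1, 1), ab, b)   # banded LU: O(n) work for fixed bandwidth
print(np.allclose(A @ x, b))      # True
```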

As sparse matrices lend themselves to more efficient computation than dense matrices, and to more efficient use of computer storage, there has been much research focused on finding ways to minimise the bandwidth (or directly minimise the fill-in) by applying permutations to the matrix, or other such equivalence or similarity transformations.[3]

The Cuthill–McKee algorithm can be used to reduce the bandwidth of a sparse symmetric matrix. There are, however, matrices for which the reverse Cuthill–McKee algorithm performs better. There are many other methods in use.
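A sketch of bandwidth reduction using SciPy's implementation of reverse Cuthill–McKee (the example sparsity pattern is illustrative):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

# A symmetric sparsity pattern whose natural ordering has a wide band
A = np.array([[1, 0, 0, 0, 1],
              [0, 1, 1, 0, 0],
              [0, 1, 1, 0, 1],
              [0, 0, 0, 1, 0],
              [1, 0, 1, 0, 1]])

perm = reverse_cuthill_mckee(csr_matrix(A), symmetric_mode=True)
P = A[np.ix_(perm, perm)]          # symmetric permutation of rows and columns

def bandwidth(M):
    i, j = np.nonzero(M)
    return int(np.max(np.abs(i - j)))

print(bandwidth(A), bandwidth(P))  # the permuted matrix has a narrower band
```

The permutation only reorders the variables; the permuted matrix P represents the same linear system, but its narrower band makes a subsequent banded factorization cheaper.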

The problem of finding a representation of a matrix with minimal bandwidth by means of permutations of rows and columns is NP-hard.[4]


Notes

  1. ^ Golub & Van Loan 1996, §1.2.1.
  2. ^ Atkinson 1989, p. 527.
  3. ^ Davis 2006, §7.7.
  4. ^ Feige 2000.

References

  • Atkinson, Kendall E. (1989), An Introduction to Numerical Analysis, John Wiley & Sons, ISBN 0-471-62489-6.
  • Davis, Timothy A. (2006), Direct Methods for Sparse Linear Systems, Society for Industrial and Applied Mathematics, ISBN 978-0-898716-13-9.
  • Feige, Uriel (2000), "Coping with the NP-Hardness of the Graph Bandwidth Problem", Algorithm Theory – SWAT 2000, Lecture Notes in Computer Science, vol. 1851, pp. 129–145, doi:10.1007/3-540-44985-X_2.
  • Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations (3rd ed.), Baltimore: Johns Hopkins, ISBN 978-0-8018-5414-9.
  • Press, WH; Teukolsky, SA; Vetterling, WT; Flannery, BP (2007), "Section 2.4", Numerical Recipes: The Art of Scientific Computing (3rd ed.), New York: Cambridge University Press, ISBN 978-0-521-88068-8, archived from the original on 2016-03-04, retrieved 2011-08-08.

Retrieved from "https://en.wikipedia.org/w/index.php?title=Band_matrix&oldid=1306471914"