10
Partitioned Matrices
Robert Reams
Virginia Commonwealth University
10.1 Submatrices and Block Matrices
10.2 Block Diagonal and Block Triangular Matrices
10.3 Schur Complements
10.4 Kronecker Products
References

10.1 Submatrices and Block Matrices
Definitions:
Let A ∈ F m×n . Then the row indices of A are {1, . . . , m}, and the column indices of A are {1, . . . , n}. Let
α, β be nonempty sets of indices with α ⊆ {1, . . . , m} and β ⊆ {1, . . . , n}.
A submatrix A[α, β] is a matrix whose rows have indices α among the row indices of A, and whose
columns have indices β among the column indices of A. A(α, β) = A[α^c, β^c], where α^c is the complement
of α.
A principal submatrix is a submatrix A[α, α], denoted more compactly as A[α].
Let the set {1, . . . , m} be partitioned into the subsets α1, . . . , αr in the usual sense of partitioning a set
(so that αi ∩ αj = ∅, for all i ≠ j, 1 ≤ i, j ≤ r, and α1 ∪ · · · ∪ αr = {1, . . . , m}), and let {1, . . . , n} be
partitioned into the subsets β1 , . . . , βs .
The matrix A ∈ F m×n is said to be partitioned into the submatrices A[αi , β j ], 1 ≤ i ≤ r , 1 ≤ j ≤ s .
A block matrix is a matrix that is partitioned into submatrices A[αi , β j ] with the row indices {1, . . . , m}
and column indices {1, . . . , n} partitioned into subsets sequentially, i.e., α1 = {1, . . . , i 1 }, α2 = {i 1 +
1, . . . , i 2 }, etc.
Each entry of a block matrix, which is a submatrix A[αi , β j ], is called a block, and we will sometimes
write A = [Aij ] to label the blocks, where Aij = A[αi , β j ].
If the block matrix A ∈ F m× p is partitioned with αi s and β j s, 1 ≤ i ≤ r , 1 ≤ j ≤ s , and the block
matrix B ∈ F p×n is partitioned with β j s and γk s, 1 ≤ j ≤ s , 1 ≤ k ≤ t, then the partitions of A and B
are said to be conformal (or sometimes conformable).
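The index-set notation above maps directly onto array indexing. A minimal computational sketch, assuming Python with NumPy, using the 3 × 5 matrix that appears in the examples below (NumPy indices are 0-based, while the index sets here are 1-based):

```python
import numpy as np

# The 3 x 5 matrix used in the examples of this section.
A = np.array([[ 1, -2, 5, 3, -1],
              [-3,  0, 1, 6,  1],
              [ 2,  7, 4, 5, -7]])

# alpha = {1, 3}, beta = {1, 2, 4} in the 1-based notation of the text.
alpha = np.array([1, 3]) - 1
beta  = np.array([1, 2, 4]) - 1

sub       = A[np.ix_(alpha, beta)]    # A[alpha, beta]
principal = A[np.ix_(alpha, alpha)]   # A[alpha] = A[alpha, alpha]

# A(alpha, beta) = A[complement of alpha, complement of beta]
alpha_c = np.setdiff1d(np.arange(A.shape[0]), alpha)
beta_c  = np.setdiff1d(np.arange(A.shape[1]), beta)
deleted = A[np.ix_(alpha_c, beta_c)]

print(sub)        # [[ 1 -2  3]
                  #  [ 2  7  5]]
print(principal)  # [[ 1  5]
                  #  [ 2  4]]
```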
Facts:
The following facts can be found in [HJ85]. This information is also available in many other standard
references such as [LT85] or [Mey00].
1. Two block matrices A = [Aij ] and B = [Bij ] in F m×n , which are both partitioned with the same
αi s and β j s, 1 ≤ i ≤ r , 1 ≤ j ≤ s , may be added block-wise, as with the usual matrix addition, so
that the (i, j ) block entry of A + B is (A + B)ij = Aij + Bij .
2. If the block matrix A ∈ F m×p and the block matrix B ∈ F p×n have conformal partitions, then we
can think of A and B as having entries, which are blocks, so that we can then multiply A and B
block-wise to form the m × n block matrix C = AB. Then C_{ik} = Σ_{j=1}^{s} A_{ij}B_{jk}, and the matrix C
will be partitioned with the αi s and γk s, 1 ≤ i ≤ r, 1 ≤ k ≤ t, where A is partitioned with αi s and
βj s, 1 ≤ i ≤ r, 1 ≤ j ≤ s, and B is partitioned with βj s and γk s, 1 ≤ j ≤ s, 1 ≤ k ≤ t (see the
computational sketch following these facts).
3. With addition and multiplication of block matrices described as in Facts 1 and 2 the usual properties
of associativity of addition and multiplication of block matrices hold, as does distributivity, and
commutativity of addition. The additive identity 0 and multiplicative identity I are the same
under block addition and multiplication, as with the usual matrix addition and multiplication.
The additive identity 0 has zero matrices as blocks; the multiplicative identity I has multiplicative
identity submatrices as diagonal blocks and zero matrices as off-diagonal blocks.
4. If the partitions of A and B are conformal, the partitions of B and A are not necessarily conformal,
even if B A is defined.
5. Let A ∈ F n×n be a block matrix of the form
$$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & 0 \end{bmatrix},$$
where A12 is k × k and A21 is (n − k) × (n − k). Then det(A) = (−1)^{(n+1)k} det(A12) det(A21).
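A short numerical check of Fact 2, assuming Python with NumPy; the partition sizes below are arbitrary choices for illustration, and np.block reassembles the blocks into ordinary matrices for comparison:

```python
import numpy as np

rng = np.random.default_rng(0)

# Row-block sizes of A, shared (conformal) block sizes, column-block sizes of B.
m_sizes, p_sizes, n_sizes = [2, 3], [1, 2, 2], [3, 1]

A_blocks = [[rng.standard_normal((m, p)) for p in p_sizes] for m in m_sizes]
B_blocks = [[rng.standard_normal((p, n)) for n in n_sizes] for p in p_sizes]

A = np.block(A_blocks)   # 5 x 5
B = np.block(B_blocks)   # 5 x 4

# Fact 2: C_{ik} = sum_j A_{ij} B_{jk}, computed entirely block-wise.
C_blocks = [[sum(A_blocks[i][j] @ B_blocks[j][k] for j in range(len(p_sizes)))
             for k in range(len(n_sizes))]
            for i in range(len(m_sizes))]

assert np.allclose(np.block(C_blocks), A @ B)
```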
Examples:
1. Let the block matrix A ∈ C n×n given by $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$ be Hermitian. Then A11 and A22 are Hermitian, and A21 = A12^*.
2. If A = [aij ], then A[{i }, { j }] is the 1 × 1 matrix whose entry is aij . The submatrix A({i }, { j }) is the
submatrix of A obtained by deleting row i and column j of A.
3. Let
$$A = \begin{bmatrix} 1 & -2 & 5 & 3 & -1 \\ -3 & 0 & 1 & 6 & 1 \\ 2 & 7 & 4 & 5 & -7 \end{bmatrix}.$$
Then A[{2}, {3}] = [a23] = [1] and
$$A(\{2\}, \{3\}) = \begin{bmatrix} 1 & -2 & 3 & -1 \\ 2 & 7 & 5 & -7 \end{bmatrix}.$$
4. Let α = {1, 3} and β = {1, 2, 4}. Then the submatrix $A[\alpha, \beta] = \begin{bmatrix} 1 & -2 & 3 \\ 2 & 7 & 5 \end{bmatrix}$, and the principal submatrix $A[\alpha] = \begin{bmatrix} 1 & 5 \\ 2 & 4 \end{bmatrix}$.
5. Let α1 = {1}, α2 = {2, 3} and let β1 = {1, 2}, β2 = {3}, β3 = {4, 5}. Then the block matrix, with
(i, j) block entry Aij = A[αi, βj], 1 ≤ i ≤ 2, 1 ≤ j ≤ 3, is
$$A = \begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \end{bmatrix} = \left[\begin{array}{cc|c|cc} 1 & -2 & 5 & 3 & -1 \\ \hline -3 & 0 & 1 & 6 & 1 \\ 2 & 7 & 4 & 5 & -7 \end{array}\right].$$
6. Let
$$B = \begin{bmatrix} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{bmatrix} = \left[\begin{array}{cc|c|cc} 2 & -1 & 0 & 6 & -2 \\ \hline -4 & 0 & 5 & 3 & 7 \\ 1 & 1 & -2 & 2 & 6 \end{array}\right].$$
Then the matrices A (of this example) and B are partitioned with the same αi s and βj s, so they can be added block-wise as
$$\begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \end{bmatrix} + \begin{bmatrix} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{bmatrix} = \begin{bmatrix} A_{11}+B_{11} & A_{12}+B_{12} & A_{13}+B_{13} \\ A_{21}+B_{21} & A_{22}+B_{22} & A_{23}+B_{23} \end{bmatrix} = \left[\begin{array}{cc|c|cc} 3 & -3 & 5 & 9 & -3 \\ \hline -7 & 0 & 6 & 9 & 8 \\ 3 & 8 & 2 & 7 & -1 \end{array}\right].$$
7. The block matrices $A = \begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \end{bmatrix}$ and $B = \begin{bmatrix} B_{11} \\ B_{21} \\ B_{31} \end{bmatrix}$ have conformal partitions if the βj
index sets, which form the submatrices Aij = A[αi, βj] of A, are the same as the βj index sets,
which form the submatrices Bjk = B[βj, γk] of B. For instance, the matrix
$$A = \begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \end{bmatrix} = \left[\begin{array}{cc|c|cc} 1 & -2 & 5 & 3 & -1 \\ \hline -3 & 0 & 1 & 6 & 1 \\ 2 & 7 & 4 & 5 & -7 \end{array}\right]$$
and the matrix
$$B = \begin{bmatrix} B_{11} \\ B_{21} \\ B_{31} \end{bmatrix} = \left[\begin{array}{cc} 4 & -1 \\ 3 & 9 \\ \hline 5 & 2 \\ \hline 2 & -8 \\ 7 & -1 \end{array}\right]$$
have conformal partitions, so A and B can be multiplied block-wise to form the 3 × 2 matrix
$$AB = \begin{bmatrix} A_{11}B_{11} + A_{12}B_{21} + A_{13}B_{31} \\ A_{21}B_{11} + A_{22}B_{21} + A_{23}B_{31} \end{bmatrix} = \begin{bmatrix} \begin{bmatrix} 1 & -2 \end{bmatrix}\begin{bmatrix} 4 & -1 \\ 3 & 9 \end{bmatrix} + \begin{bmatrix} 5 \end{bmatrix}\begin{bmatrix} 5 & 2 \end{bmatrix} + \begin{bmatrix} 3 & -1 \end{bmatrix}\begin{bmatrix} 2 & -8 \\ 7 & -1 \end{bmatrix} \\[6pt] \begin{bmatrix} -3 & 0 \\ 2 & 7 \end{bmatrix}\begin{bmatrix} 4 & -1 \\ 3 & 9 \end{bmatrix} + \begin{bmatrix} 1 \\ 4 \end{bmatrix}\begin{bmatrix} 5 & 2 \end{bmatrix} + \begin{bmatrix} 6 & 1 \\ 5 & -7 \end{bmatrix}\begin{bmatrix} 2 & -8 \\ 7 & -1 \end{bmatrix} \end{bmatrix}$$
$$= \begin{bmatrix} \begin{bmatrix} -2 & -19 \end{bmatrix} + \begin{bmatrix} 25 & 10 \end{bmatrix} + \begin{bmatrix} -1 & -23 \end{bmatrix} \\[6pt] \begin{bmatrix} -12 & 3 \\ 29 & 61 \end{bmatrix} + \begin{bmatrix} 5 & 2 \\ 20 & 8 \end{bmatrix} + \begin{bmatrix} 19 & -49 \\ -39 & -33 \end{bmatrix} \end{bmatrix} = \begin{bmatrix} 22 & -32 \\ 12 & -44 \\ 10 & 36 \end{bmatrix}.$$
8. Let
$$A = \left[\begin{array}{cc|c} 1 & 2 & 3 \\ 4 & 5 & 6 \end{array}\right] \quad\text{and}\quad B = \left[\begin{array}{c|c} 7 & 8 \\ 9 & 0 \\ \hline 1 & 2 \end{array}\right].$$
Then A and B have conformal partitions. BA is defined, but B and A do not have conformal partitions.
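The block-wise product in Example 7 can be confirmed directly; a minimal sketch, assuming Python with NumPy:

```python
import numpy as np

A = np.array([[ 1, -2, 5, 3, -1],
              [-3,  0, 1, 6,  1],
              [ 2,  7, 4, 5, -7]])
B = np.array([[4, -1],
              [3,  9],
              [5,  2],
              [2, -8],
              [7, -1]])

print(A @ B)   # [[ 22 -32]
               #  [ 12 -44]
               #  [ 10  36]]
```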
10.2 Block Diagonal and Block Triangular Matrices
Definitions:
A matrix A = [aij] ∈ F n×n is diagonal if aij = 0, for all i ≠ j, 1 ≤ i, j ≤ n.
A diagonal matrix A = [aij ] ∈ F n×n is said to be scalar if aii = a, for all i , 1 ≤ i ≤ n, and some scalar
a ∈ F , i.e., A = a In .
A matrix A ∈ F n×n is block diagonal if A as a block matrix is partitioned into submatrices Aij ∈ F ni×nj, so that A = [Aij], Σ_{i=1}^{k} ni = n, and Aij = 0, for all i ≠ j, 1 ≤ i, j ≤ k. Thus,
$$A = \begin{bmatrix} A_{11} & 0 & \cdots & 0 \\ 0 & A_{22} & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & A_{kk} \end{bmatrix}.$$
This block diagonal matrix A is denoted A = diag(A11, . . . , Akk), where Aii is ni × ni (or sometimes denoted A = A11 ⊕ · · · ⊕ Akk, and called the direct sum of A11, . . . , Akk).
A matrix A = [aij] ∈ F n×n is upper triangular if aij = 0, for all i > j, 1 ≤ i, j ≤ n.
An upper triangular matrix A = [aij ] ∈ F n×n is strictly upper triangular if aij = 0 for all i ≥ j ,
1 ≤ i, j ≤ n.
A matrix A ∈ F n×n is lower triangular if aij = 0 for all i < j , 1 ≤ i, j ≤ n, i.e., if AT is upper
triangular.
A matrix A ∈ F n×n is strictly lower triangular if AT is strictly upper triangular.
A matrix is triangular if it is upper or lower triangular.
A matrix A ∈ F n×n is block upper triangular, if A as a block matrix is partitioned into the submatrices Aij ∈ F ni×nj, so that A = [Aij], Σ_{i=1}^{k} ni = n, and Aij = 0, for all i > j, 1 ≤ i, j ≤ k, i.e., considering the Aij blocks as the entries of A, A is upper triangular. Thus,
$$A = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1k} \\ 0 & A_{22} & \cdots & A_{2k} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & A_{kk} \end{bmatrix},$$
where each Aij is ni × nj, and Σ_{i=1}^{k} ni = n. The matrix A is strictly block upper triangular if Aij = 0, for all i ≥ j, 1 ≤ i, j ≤ k.
A matrix A ∈ F n×n is block lower triangular if AT is block upper triangular.
A matrix A ∈ F n×n is strictly block lower triangular if AT is strictly block upper triangular.
A matrix A = [aij] ∈ F n×n is upper Hessenberg if aij = 0, for all i − 2 ≥ j, 1 ≤ i, j ≤ n, i.e., A has the form
$$A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1,n-1} & a_{1n} \\ a_{21} & a_{22} & a_{23} & \cdots & a_{2,n-1} & a_{2n} \\ 0 & a_{32} & a_{33} & \cdots & a_{3,n-1} & a_{3n} \\ \vdots & & \ddots & \ddots & & \vdots \\ 0 & 0 & \cdots & a_{n-1,n-2} & a_{n-1,n-1} & a_{n-1,n} \\ 0 & 0 & \cdots & 0 & a_{n,n-1} & a_{nn} \end{bmatrix}.$$
A matrix A = [aij] ∈ F n×n is lower Hessenberg if A^T is upper Hessenberg.
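A computational sketch of the direct sum, assuming Python with NumPy; the helper below assembles diag(A11, . . . , Akk) from square blocks, using the blocks that appear in Example 1 of this section:

```python
import numpy as np

def direct_sum(*blocks):
    """Assemble diag(A11, ..., Akk) from square blocks (zero off-diagonal blocks)."""
    k = len(blocks)
    return np.block([[blocks[i] if i == j else
                      np.zeros((blocks[i].shape[0], blocks[j].shape[1]))
                      for j in range(k)]
                     for i in range(k)])

A11 = np.array([[1., 3.], [2., 4.]])
A22 = np.array([[5.]])
A33 = np.array([[6., 8.], [7., 9.]])
A = direct_sum(A11, A22, A33)

# Trace, determinant, and rank combine block-wise (see Fact 3 below).
assert np.isclose(np.trace(A), sum(np.trace(X) for X in (A11, A22, A33)))
assert np.isclose(np.linalg.det(A),
                  np.prod([np.linalg.det(X) for X in (A11, A22, A33)]))
assert np.linalg.matrix_rank(A) == sum(np.linalg.matrix_rank(X)
                                       for X in (A11, A22, A33))
```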
Facts:
The following facts can be found in [HJ85]. This information is also available in many other standard
references such as [LT85] or [Mey00].
1. Let D, D′ ∈ F n×n be any diagonal matrices. Then D + D′ and DD′ are diagonal, and DD′ = D′D.
If D = diag(d1 , . . . , dn ) is nonsingular, then D −1 = diag(1/d1 , . . . , 1/dn ).
2. Let D ∈ F n×n be a matrix such that D A = AD for all A ∈ F n×n . Then D is a scalar matrix.
3. If A ∈ F n×n is a block diagonal matrix, so that A = diag(A11, . . . , Akk), then tr(A) = Σ_{i=1}^{k} tr(Aii),
det(A) = ∏_{i=1}^{k} det(Aii), rank(A) = Σ_{i=1}^{k} rank(Aii), and σ(A) = σ(A11) ∪ · · · ∪ σ(Akk).
4. Let A ∈ F n×n be a block diagonal matrix, so that A = diag(A11, A22, . . . , Akk). Then A is nonsingular if and only if Aii is nonsingular for each i, 1 ≤ i ≤ k. Moreover, A^{-1} = diag(A11^{-1}, A22^{-1}, . . . , Akk^{-1}).
5. See Chapter 4.3 for information on diagonalizability of matrices.
6. Let A ∈ F n×n be a block diagonal matrix, so that A = diag(A11 , . . . , Akk ). Then A is diagonalizable
if and only if Aii is diagonalizable for each i , 1 ≤ i ≤ k.
7. If A, B ∈ F n×n are upper (lower) triangular matrices, then A + B and AB are upper (lower)
triangular. If the upper (lower) triangular matrix A = [aij ] is nonsingular, then A−1 is upper
(lower) triangular with diagonal entries 1/a11 , . . . , 1/ann .
8. Let A ∈ F n×n be block upper triangular, so that
$$A = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1k} \\ 0 & A_{22} & \cdots & A_{2k} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & A_{kk} \end{bmatrix}.$$
Then tr(A) = Σ_{i=1}^{k} tr(Aii), det(A) = ∏_{i=1}^{k} det(Aii), rank(A) ≥ Σ_{i=1}^{k} rank(Aii), and σ(A) = σ(A11) ∪ · · · ∪ σ(Akk).
9. Let A = (Aij) ∈ F n×n be a block triangular matrix (either upper or lower triangular). Then A is
nonsingular if and only if Aii is nonsingular for each i, 1 ≤ i ≤ k. Moreover, the ni × ni diagonal
block entries of A^{-1} are Aii^{-1}, for each i, 1 ≤ i ≤ k.
10. Schur’s Triangularization Theorem: Let A ∈ Cn×n . Then there is a unitary matrix U ∈ Cn×n so that
U ∗ AU is upper triangular. The diagonal entries of U ∗ AU are the eigenvalues of A.
11. Let A ∈ Rn×n. Then there is an orthogonal matrix V ∈ Rn×n so that V^T AV is of upper Hessenberg form
$$\begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1k} \\ 0 & A_{22} & \cdots & A_{2k} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & A_{kk} \end{bmatrix},$$
where each Aii, 1 ≤ i ≤ k, is 1 × 1 or 2 × 2. Moreover, when Aii is 1 × 1, the entry of Aii is an eigenvalue of A, whereas when Aii is 2 × 2, then Aii has two eigenvalues which are nonreal complex conjugates of each other, and are eigenvalues of A.
12. (For more information on unitary triangularization, see Chapter 7.2.)
13. Let A = [Aij] ∈ F n×n with |σ(A)| = n, where λ1, . . . , λk ∈ σ(A) are the distinct eigenvalues of
A. Then there is a nonsingular matrix T ∈ F n×n so that T^{-1}AT = diag(A11, . . . , Akk), where each
Aii ∈ F ni×ni is upper triangular with all diagonal entries of Aii equal to λi, for 1 ≤ i ≤ k (and
Σ_{i=1}^{k} ni = n).
14. Let A ∈ F n×n be a block upper triangular matrix, of the form $A = \begin{bmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{bmatrix}$, where Aij is
ni × nj, 1 ≤ i, j ≤ 2, and Σ_{i=1}^{2} ni = n. (Note that any block upper triangular matrix can be
said to have this form.) Let x be an eigenvector of A11, with corresponding eigenvalue λ, so that
A11 x = λx, where x is a (column) vector with n1 components. Then the (column) vector with n
components $\begin{bmatrix} x \\ 0 \end{bmatrix}$ is an eigenvector of A with eigenvalue λ. Let y be a left eigenvector of A22, with
corresponding eigenvalue µ, so that yA22 = µy, where y is a row vector with n2 components. Then
the (row) vector with n components [0 y] is a left eigenvector of A with eigenvalue µ (see the sketch
following these facts).
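A minimal check of Fact 14, assuming Python with NumPy and random data: an eigenvector of the leading diagonal block, padded with zeros, is an eigenvector of the full block upper triangular matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2 = 3, 2

A11 = rng.standard_normal((n1, n1))
A12 = rng.standard_normal((n1, n2))
A22 = rng.standard_normal((n2, n2))
A = np.block([[A11, A12],
              [np.zeros((n2, n1)), A22]])

lam, X = np.linalg.eig(A11)             # eigenpairs of the leading block
x = X[:, 0]
v = np.concatenate([x, np.zeros(n2)])   # the vector [x; 0]

assert np.allclose(A @ v, lam[0] * v)   # eigenvector of A with eigenvalue lam[0]
```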
Examples:
1. The matrix
$$A = \begin{bmatrix} 1 & 3 & 0 & 0 & 0 \\ 2 & 4 & 0 & 0 & 0 \\ 0 & 0 & 5 & 0 & 0 \\ 0 & 0 & 0 & 6 & 8 \\ 0 & 0 & 0 & 7 & 9 \end{bmatrix}$$
is a block diagonal matrix, and the trace, determinant, and rank can be calculated block-wise, where $A_{11} = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$, A22 = 5, and $A_{33} = \begin{bmatrix} 6 & 8 \\ 7 & 9 \end{bmatrix}$, as tr(A) = 25 = Σ_{i=1}^{3} tr(Aii), det(A) = (−2)(5)(−2) = ∏_{i=1}^{3} det(Aii), and rank(A) = 5 = Σ_{i=1}^{3} rank(Aii).
2. Let $A = \begin{bmatrix} a & b & c \\ 0 & d & e \\ 0 & 0 & f \end{bmatrix} \in F^{3\times 3}$, an upper triangular matrix. If a, d, f are nonzero, then
$$A^{-1} = \begin{bmatrix} \dfrac{1}{a} & -\dfrac{b}{ad} & \dfrac{be - cd}{adf} \\[6pt] 0 & \dfrac{1}{d} & -\dfrac{e}{df} \\[6pt] 0 & 0 & \dfrac{1}{f} \end{bmatrix}.$$
3. The matrix
$$B = \begin{bmatrix} 1 & 3 & 0 & 0 & 0 \\ 2 & 4 & 5 & 0 & 0 \\ 0 & 0 & 0 & 6 & 0 \\ 0 & 0 & 0 & 0 & 7 \\ 0 & 0 & 0 & 0 & 8 \end{bmatrix}$$
is not block diagonal. However, B is block upper triangular, with $B_{11} = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$, B22 = 0, $B_{33} = \begin{bmatrix} 0 & 7 \\ 0 & 8 \end{bmatrix}$, $B_{12} = \begin{bmatrix} 0 \\ 5 \end{bmatrix}$, $B_{13} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$, and B23 = [6 0]. Notice that 4 = rank(B) ≥ Σ_{i=1}^{3} rank(Bii) = 2 + 0 + 1 = 3.
4. The 4 × 4 matrix
$$\begin{bmatrix} 1 & 2 & 0 & 0 \\ 3 & 4 & 5 & 0 \\ 6 & 7 & 8 & 9 \\ 10 & 11 & 12 & 13 \end{bmatrix}$$
is not lower triangular, but is lower Hessenberg.

10.3 Schur Complements
In this subsection, the square matrix $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$ is partitioned as a block matrix, where A11 is nonsingular.
Definitions:
The Schur complement of A11 in A is the matrix A22 − A21 A11^{-1} A12, sometimes denoted A/A11.
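A direct transcription of this definition, assuming Python with NumPy, applied to the 3 × 3 matrix used in the examples below, with A11 taken as the leading 1 × 1 block:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])
k = 1   # size of the leading nonsingular block A11

A11, A12 = A[:k, :k], A[:k, k:]
A21, A22 = A[k:, :k], A[k:, k:]

# Schur complement of A11 in A:  A/A11 = A22 - A21 A11^{-1} A12.
S = A22 - A21 @ np.linalg.solve(A11, A12)
print(S)   # [[ -3.  -6.]
           #  [ -6. -11.]]
```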
Facts:
1. [Zha99]
$$\begin{bmatrix} I & 0 \\ -A_{21}A_{11}^{-1} & I \end{bmatrix} \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} \begin{bmatrix} I & -A_{11}^{-1}A_{12} \\ 0 & I \end{bmatrix} = \begin{bmatrix} A_{11} & 0 \\ 0 & A/A_{11} \end{bmatrix}.$$
2. [Zha99] Let $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$, where A11 is nonsingular. Then det(A) = det(A11) det(A/A11).
Also, rank(A) = rank(A11) + rank(A/A11) (see the sketch following this list).
3. [Zha99] Let $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$. Then A is nonsingular if and only if both A11 and the Schur
complement of A11 in A are nonsingular.
4. [HJ85] Let $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$, where A11, A22, A/A11, A/A22, and A are nonsingular. Then
$$A^{-1} = \begin{bmatrix} (A/A_{22})^{-1} & -A_{11}^{-1}A_{12}(A/A_{11})^{-1} \\ -A_{22}^{-1}A_{21}(A/A_{22})^{-1} & (A/A_{11})^{-1} \end{bmatrix}.$$
5. [Zha99] Let $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$, where A11, A22, A/A11, and A/A22 are nonsingular. An equation relating the Schur complements of A11 in A and A22 in A is (A/A22)^{-1} = A11^{-1} + A11^{-1} A12 (A/A11)^{-1} A21 A11^{-1}.
6. [LT85] Let $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$, where the k × k matrix A11 is nonsingular. Then rank(A) = k if and
only if A/A11 = 0.
7. [Hay68] Let $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{12}^* & A_{22} \end{bmatrix}$ be Hermitian, where A11 is nonsingular. Then the inertia of A is
in(A) = in(A11) + in(A/A11).
8. [Hay68] Let $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{12}^* & A_{22} \end{bmatrix}$ be Hermitian, where A11 is nonsingular. Then A is positive
(semi)definite if and only if both A11 and A/A11 are positive (semi)definite.
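A numerical check of Facts 2 and 4 on the 3 × 3 matrix used in the examples below, assuming Python with NumPy; the block inverse assembled from the two Schur complements is compared against np.linalg.inv:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])
k = 1
A11, A12 = A[:k, :k], A[:k, k:]
A21, A22 = A[k:, :k], A[k:, k:]

S11 = A22 - A21 @ np.linalg.solve(A11, A12)   # A/A11
S22 = A11 - A12 @ np.linalg.solve(A22, A21)   # A/A22

# Fact 2: det(A) = det(A11) det(A/A11).
assert np.isclose(np.linalg.det(A), np.linalg.det(A11) * np.linalg.det(S11))

# Fact 4: block formula for A^{-1} in terms of the Schur complements.
A_inv = np.block([
    [np.linalg.inv(S22), -np.linalg.solve(A11, A12) @ np.linalg.inv(S11)],
    [-np.linalg.solve(A22, A21) @ np.linalg.inv(S22), np.linalg.inv(S11)],
])
assert np.allclose(A_inv, np.linalg.inv(A))
```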
Examples:
1. Let $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{bmatrix}$. Then with A11 = 1, we have
$$\begin{bmatrix} 1 & 0 \\ -\begin{bmatrix} 4 \\ 7 \end{bmatrix} & I_2 \end{bmatrix} \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{bmatrix} \begin{bmatrix} 1 & -\begin{bmatrix} 2 & 3 \end{bmatrix} \\ 0 & I_2 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & -3 & -6 \\ 0 & -6 & -11 \end{bmatrix}.$$
2. Let $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{bmatrix}$. With A11 = 1, and $A_{22} = \begin{bmatrix} 5 & 6 \\ 8 & 10 \end{bmatrix}$, then $A/A_{11} = \begin{bmatrix} -3 & -6 \\ -6 & -11 \end{bmatrix}$, $A/A_{22} = 1 - \begin{bmatrix} 2 & 3 \end{bmatrix} \begin{bmatrix} 5 & 6 \\ 8 & 10 \end{bmatrix}^{-1} \begin{bmatrix} 4 \\ 7 \end{bmatrix} = -\tfrac{3}{2}$, and
$$A^{-1} = \begin{bmatrix} (A/A_{22})^{-1} & -A_{11}^{-1}A_{12}(A/A_{11})^{-1} \\ -A_{22}^{-1}A_{21}(A/A_{22})^{-1} & (A/A_{11})^{-1} \end{bmatrix} = \begin{bmatrix} \left(-\tfrac{3}{2}\right)^{-1} & -\begin{bmatrix} 2 & 3 \end{bmatrix}\begin{bmatrix} -3 & -6 \\ -6 & -11 \end{bmatrix}^{-1} \\[6pt] -\begin{bmatrix} 5 & 6 \\ 8 & 10 \end{bmatrix}^{-1}\begin{bmatrix} 4 \\ 7 \end{bmatrix}\left(-\tfrac{3}{2}\right)^{-1} & \begin{bmatrix} -3 & -6 \\ -6 & -11 \end{bmatrix}^{-1} \end{bmatrix} = \begin{bmatrix} -\tfrac{2}{3} & -\tfrac{4}{3} & 1 \\[2pt] -\tfrac{2}{3} & \tfrac{11}{3} & -2 \\[2pt] 1 & -2 & 1 \end{bmatrix}.$$
3. Let $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{bmatrix}$. Then, from Fact 5,
$$(A/A_{22})^{-1} = \left(-\tfrac{3}{2}\right)^{-1} = 1^{-1} + 1^{-1}\begin{bmatrix} 2 & 3 \end{bmatrix}\begin{bmatrix} -3 & -6 \\ -6 & -11 \end{bmatrix}^{-1}\begin{bmatrix} 4 \\ 7 \end{bmatrix} 1^{-1} = A_{11}^{-1} + A_{11}^{-1}A_{12}(A/A_{11})^{-1}A_{21}A_{11}^{-1}.$$
10.4 Kronecker Products
Definitions:
Let A ∈ F m×n and B ∈ F p×q. Then the Kronecker product (sometimes called the tensor product) of A and B, denoted A ⊗ B, is the mp × nq partitioned matrix
$$A \otimes B = \begin{bmatrix} a_{11}B & a_{12}B & \cdots & a_{1n}B \\ a_{21}B & a_{22}B & \cdots & a_{2n}B \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1}B & a_{m2}B & \cdots & a_{mn}B \end{bmatrix}.$$
Let A ∈ F m×n and let the jth column of A, namely A[{1, . . . , m}, {j}], be denoted aj, for 1 ≤ j ≤ n.
The column vector with mn components, denoted vec(A), defined by
$$\mathrm{vec}(A) = \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} \in F^{mn},$$
is the vec-function of A, i.e., vec(A) is formed by stacking the columns of A on top of each other in their natural order.
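Both operations have direct NumPy counterparts: np.kron forms A ⊗ B, and column-major reshaping implements vec. A minimal sketch, assuming Python with NumPy, using the 2 × 2 matrix of Examples 1 and 2 below:

```python
import numpy as np

A = np.array([[1, -1],
              [0,  2]])
B = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

K = np.kron(A, B)     # the mp x nq matrix A (x) B
print(K.shape)        # (6, 6)

def vec(M):
    # Stack the columns of M on top of each other (column-major order).
    return M.reshape(-1, order='F')

print(vec(A))         # [ 1  0 -1  2]
```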
Facts:
All of the following facts except those for which a specific reference is given can be found in [LT85].
1. Let A ∈ F m×n and B ∈ F p×q. If a ∈ F, then a(A ⊗ B) = (aA) ⊗ B = A ⊗ (aB).
2. Let A, B ∈ F m×n and C ∈ F p×q. Then (A + B) ⊗ C = A ⊗ C + B ⊗ C.
3. Let A ∈ F m×n and B, C ∈ F p×q. Then A ⊗ (B + C) = A ⊗ B + A ⊗ C.
4. Let A ∈ F m×n, B ∈ F p×q, and C ∈ F r×s. Then A ⊗ (B ⊗ C) = (A ⊗ B) ⊗ C.
5. Let A ∈ F m×n and B ∈ F p×q. Then (A ⊗ B)^T = A^T ⊗ B^T.
6. [MM64] Let A ∈ Cm×n and B ∈ C p×q. Then $\overline{A \otimes B} = \overline{A} \otimes \overline{B}$.
7. [MM64] Let A ∈ Cm×n and B ∈ C p×q. Then (A ⊗ B)^* = A^* ⊗ B^*.
8. Let A ∈ F m×n, B ∈ F p×q, C ∈ F n×r, and D ∈ F q×s. Then (A ⊗ B)(C ⊗ D) = AC ⊗ BD.
9. Let A ∈ F m×n and B ∈ F p×q. Then A ⊗ B = (A ⊗ Ip)(In ⊗ B) = (Im ⊗ B)(A ⊗ Iq).
10. If A ∈ F m×m and B ∈ F n×n are nonsingular, then A ⊗ B is nonsingular and (A ⊗ B)^{-1} = A^{-1} ⊗ B^{-1}.
11. Let A1, A2, . . . , Ak ∈ F m×m, and B1, B2, . . . , Bk ∈ F n×n. Then (A1 ⊗ B1)(A2 ⊗ B2) · · · (Ak ⊗ Bk) = (A1 A2 · · · Ak) ⊗ (B1 B2 · · · Bk).
12. Let A ∈ F m×m and B ∈ F n×n. Then (A ⊗ B)^k = A^k ⊗ B^k.
13. If A ∈ F m×m and B ∈ F n×n , then there is an mn×mn permutation matrix P so that P T (A⊗ B)P =
B ⊗ A.
14. Let A, B ∈ F m×n . Then vec(a A + b B) = a vec(A) + b vec(B), for any a, b ∈ F .
15. If A ∈ F m×n , B ∈ F p×q , and X ∈ F n× p , then vec(AX B) = (B T ⊗ A)vec(X).
16. If A ∈ F m×m and B ∈ F n×n , then det(A⊗ B) = (det(A))n (det(B))m , tr(A⊗ B) = (tr(A))(tr(B)),
and rank(A ⊗ B) = (rank(A))(rank(B)).
17. Let A ∈ F m×m and B ∈ F n×n , with σ (A) = {λ1 , . . . , λm } and σ (B) = {µ1 , . . . , µn }. Then
A ⊗ B ∈ F mn×mn has eigenvalues {λs µt |1 ≤ s ≤ m, 1 ≤ t ≤ n}. Moreover, if the right eigenvectors
of A are denoted xi , and the right eigenvectors of B are denoted y j , so that Axi = λi xi and
By j = µ j y j , then (A ⊗ B)(xi ⊗ y j ) = λi µ j (xi ⊗ y j ).
18. Let A ∈ F m×m and B ∈ F n×n , with σ (A) = {λ1 , . . . , λm } and σ (B) = {µ1 , . . . , µn }. Then
(In ⊗ A) + (B ⊗ Im ) has eigenvalues {λs + µt |1 ≤ s ≤ m, 1 ≤ t ≤ n}.
19. Let p(x, y) ∈ F[x, y] so that p(x, y) = Σ_{i,j=1}^{k} aij x^i y^j, where aij ∈ F, 1 ≤ i ≤ k, 1 ≤ j ≤ k. Let
A ∈ F m×m and B ∈ F n×n. Define p(A; B) to be the mn × mn matrix p(A; B) = Σ_{i,j=1}^{k} aij (A^i ⊗ B^j).
If σ(A) = {λ1, . . . , λm} and σ(B) = {µ1, . . . , µn}, then σ(p(A; B)) = {p(λs, µt) | 1 ≤ s ≤ m, 1 ≤ t ≤ n}.
20. Let A1 , A2 ∈ F m×m , B1 , B2 ∈ F n×n . If A1 and A2 are similar, and B1 and B2 are similar, then
A1 ⊗ B1 is similar to A2 ⊗ B2 .
21. If A ∈ F m×n , B ∈ F p×q , and X ∈ F n× p , then vec(AX) = (I p ⊗ A)vec(X), vec(X B) =
(B T ⊗ In )vec(X), and vec(AX + X B) = [(I p ⊗ A) + (B T ⊗ In )]vec(X).
22. If A ∈ F m×n , B ∈ F p×q , C ∈ F m×q , and X ∈ F n× p , then the equation AX B = C can be written
in the form (B T ⊗ A)vec(X) = vec(C ).
23. Let A ∈ F m×m , B ∈ F n×n , C ∈ F m×n , and X ∈ F m×n . Then the equation AX + X B = C can be
written in the form [(In ⊗ A) + (B T ⊗ Im )]vec(X) = vec(C ).
24. Let A ∈ Cm×m and B ∈ Cn×n be Hermitian. Then A ⊗ B is Hermitian.
25. Let A ∈ Cm×m and B ∈ Cn×n be positive definite. Then A ⊗ B is positive definite.
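A numerical check of the vec identities in Facts 15 and 23, assuming Python with NumPy and random data:

```python
import numpy as np

rng = np.random.default_rng(2)

def vec(M):
    return M.reshape(-1, order='F')   # stack the columns of M

# Fact 15: vec(AXB) = (B^T (x) A) vec(X).
m, n, p, q = 3, 4, 2, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((p, q))
X = rng.standard_normal((n, p))
assert np.allclose(vec(A @ X @ B), np.kron(B.T, A) @ vec(X))

# Fact 23: AX + XB = C  becomes  [(I_n (x) A) + (B^T (x) I_m)] vec(X) = vec(C).
m, n = 3, 2
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
X = rng.standard_normal((m, n))
C = A @ X + X @ B
K = np.kron(np.eye(n), A) + np.kron(B.T, np.eye(m))
assert np.allclose(K @ vec(X), vec(C))
```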
Examples:
1. Let $A = \begin{bmatrix} 1 & -1 \\ 0 & 2 \end{bmatrix}$ and $B = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}$. Then
$$A \otimes B = \begin{bmatrix} 1 & 2 & 3 & -1 & -2 & -3 \\ 4 & 5 & 6 & -4 & -5 & -6 \\ 7 & 8 & 9 & -7 & -8 & -9 \\ 0 & 0 & 0 & 2 & 4 & 6 \\ 0 & 0 & 0 & 8 & 10 & 12 \\ 0 & 0 & 0 & 14 & 16 & 18 \end{bmatrix}.$$
2. Let $A = \begin{bmatrix} 1 & -1 \\ 0 & 2 \end{bmatrix}$. Then $\mathrm{vec}(A) = \begin{bmatrix} 1 \\ 0 \\ -1 \\ 2 \end{bmatrix}$.
3. Let A ∈ F m×m and B ∈ F n×n. If A is upper (lower) triangular, then A ⊗ B is block upper (lower)
triangular. If A is diagonal, then A ⊗ B is block diagonal. If both A and B are upper (lower)
triangular, then A ⊗ B is upper (lower) triangular. If both A and B are diagonal, then A ⊗ B is diagonal.
4. Let A ∈ F m×n and B ∈ F p×q . If A ⊗ B = 0, then A = 0 or B = 0.
5. Let A ∈ F m×n. Then
$$A \otimes I_n = \begin{bmatrix} a_{11}I_n & a_{12}I_n & \cdots & a_{1n}I_n \\ a_{21}I_n & a_{22}I_n & \cdots & a_{2n}I_n \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1}I_n & a_{m2}I_n & \cdots & a_{mn}I_n \end{bmatrix} \in F^{mn \times n^2}.$$
Let B ∈ F p×p. Then In ⊗ B = diag(B, B, . . . , B) ∈ F np×np, and Im ⊗ In = Imn.
References
[Hay68] E. Haynsworth, Determination of the Inertia of a Partitioned Matrix, Lin. Alg. Appl., 1:73–81
(1968).
[HJ85] R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, 1985.
[LT85] P. Lancaster and M. Tismenetsky, The Theory of Matrices, with Applications, 2nd ed., Academic
Press, San Diego, 1985.
[MM64] M. Marcus and H. Minc, A Survey of Matrix Theory and Matrix Inequalities, Prindle, Weber, &
Schmidt, Boston, 1964.
[Mey00] C. Meyer, Matrix Analysis and Applied Linear Algebra, SIAM, 2000.
[Zha99] F. Zhang, Matrix Theory, Springer-Verlag, New York, 1999.