In mathematics, especially in linear algebra and matrix theory, the commutation matrix is used for transforming the vectorized form of a matrix into the vectorized form of its transpose. Specifically, the commutation matrix K(m,n) is the nm × mn matrix which, for any m × n matrix A, transforms vec(A) into vec(A^T):

\[ \mathbf{K}^{(m,n)} \operatorname{vec}(\mathbf{A}) = \operatorname{vec}\left(\mathbf{A}^{\mathrm{T}}\right). \]

Here vec(A) is the mn × 1 column vector obtained by stacking the columns of A on top of one another:

\[ \operatorname{vec}(\mathbf{A}) = \left[ \mathbf{A}_{1,1}, \ldots, \mathbf{A}_{m,1}, \mathbf{A}_{1,2}, \ldots, \mathbf{A}_{m,2}, \ldots, \mathbf{A}_{1,n}, \ldots, \mathbf{A}_{m,n} \right]^{\mathrm{T}}, \]

where A = [A_{i,j}]. In other words, vec(A) is the vector obtained by vectorizing A in column-major order. Similarly, vec(A^T) is the vector obtained by vectorizing A in row-major order.
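As an illustrative sketch (not part of the definition itself), the two orderings correspond to NumPy's column-major ("F") and row-major ("C") ravel orders; the matrix A below is an arbitrary example:

import numpy as np

A = np.array([[1, 4],
              [2, 5],
              [3, 6]])

vec_A  = A.ravel(order="F")    # column-major vectorization of A: [1, 2, 3, 4, 5, 6]
vec_AT = A.T.ravel(order="F")  # row-major vectorization of A:    [1, 4, 2, 5, 3, 6]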

In the context of quantum information theory, the commutation matrix is sometimes referred to as the swap matrix or swap operator.[1]

Properties

  • The commutation matrix is a special type of permutation matrix, and is therefore orthogonal. In particular, K(m,n) is equal to P_π, where π is the permutation over {1, …, mn} for which
\[ \pi\bigl(i + m(j-1)\bigr) = j + n(i-1), \qquad i = 1, \ldots, m, \quad j = 1, \ldots, n. \]
  • The determinant of K(m,n) is $(-1)^{\frac{1}{4}m(m-1)n(n-1)}$.
  • Replacing A with A^T in the definition of the commutation matrix shows that K(m,n) = (K(n,m))^T. Therefore, in the special case of m = n the commutation matrix is an involution and symmetric.
  • The main use of the commutation matrix, and the source of its name, is to commute the Kronecker product: for every m × n matrix A and every r × q matrix B,
\[ \mathbf{K}^{(r,m)} (\mathbf{A} \otimes \mathbf{B}) \mathbf{K}^{(n,q)} = \mathbf{B} \otimes \mathbf{A}. \]
(This identity is checked numerically in the sketch after this list.)
This property is often used in developing the higher order statistics of Wishart covariance matrices.[2]
  • The case of n = q = 1 for the above equation states that for any column vectors v, w of sizes m, r respectively,
\[ \mathbf{K}^{(r,m)} (\mathbf{v} \otimes \mathbf{w}) = \mathbf{w} \otimes \mathbf{v}. \]
This property is the reason that this matrix is referred to as the "swap operator" in the context of quantum information theory.
  • Two explicit forms for the commutation matrix are as follows: if e_{r,j} denotes the j-th canonical vector of dimension r (i.e. the vector with 1 in the j-th coordinate and 0 elsewhere) then
\[ \mathbf{K}^{(r,m)} = \sum_{i=1}^{r} \sum_{j=1}^{m} \left( \mathbf{e}_{r,i} \mathbf{e}_{m,j}^{\mathrm{T}} \right) \otimes \left( \mathbf{e}_{m,j} \mathbf{e}_{r,i}^{\mathrm{T}} \right) = \sum_{i=1}^{r} \sum_{j=1}^{m} \left( \mathbf{e}_{r,i} \otimes \mathbf{e}_{m,j} \right) \left( \mathbf{e}_{m,j} \otimes \mathbf{e}_{r,i} \right)^{\mathrm{T}}. \]
  • The commutation matrix may be expressed as the following block matrix:
\[ \mathbf{K}^{(m,n)} = \begin{bmatrix} \mathbf{K}_{1,1} & \cdots & \mathbf{K}_{1,n} \\ \vdots & \ddots & \vdots \\ \mathbf{K}_{m,1} & \cdots & \mathbf{K}_{m,n} \end{bmatrix}, \]
where the (p, q) entry of the n × m block matrix K_{i,j} is given by
\[ \left( \mathbf{K}_{i,j} \right)_{p,q} = \begin{cases} 1 & \text{if } i = q \text{ and } j = p, \\ 0 & \text{otherwise.} \end{cases} \]
For example,
\[ \mathbf{K}^{(2,2)} = \begin{bmatrix} \mathbf{K}_{1,1} & \mathbf{K}_{1,2} \\ \mathbf{K}_{2,1} & \mathbf{K}_{2,2} \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}. \]
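The properties above can be verified numerically. The following NumPy sketch is illustrative rather than canonical: the helper comm_mat is an equivalent variant of the Python code given in the Code section below, and the dimensions and random test matrices are arbitrary choices.

import numpy as np

def comm_mat(m, n):
    # permutation sending column-major (vec) positions to row-major positions,
    # so that comm_mat(m, n) @ vec(A) == vec(A^T) for any m x n matrix A
    perm = np.arange(m * n).reshape((m, n), order="F").ravel(order="C")
    return np.eye(m * n)[perm, :]

m, n, r, q = 2, 3, 4, 5
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))
B = rng.standard_normal((r, q))
v = rng.standard_normal(m)
w = rng.standard_normal(r)

K = comm_mat(m, n)

# K is a permutation matrix, hence orthogonal, with the determinant stated above
assert np.allclose(K.T @ K, np.eye(m * n))
assert np.isclose(np.linalg.det(K), (-1) ** (m * (m - 1) * n * (n - 1) // 4))

# transpose relation: K(m,n) = K(n,m)^T
assert np.array_equal(K, comm_mat(n, m).T)

# commuting the Kronecker product: K(r,m) (A x B) K(n,q) = B x A
assert np.allclose(comm_mat(r, m) @ np.kron(A, B) @ comm_mat(n, q), np.kron(B, A))

# vector swap case (n = q = 1): K(r,m) (v x w) = w x v
assert np.allclose(comm_mat(r, m) @ np.kron(v, w), np.kron(w, v))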

Code


For both square and rectangular matrices with m rows and n columns, the commutation matrix can be generated by the code below.

Python

import numpy as np

def comm_mat(m, n):
    # determine permutation applied by K
    w = np.arange(m * n).reshape((m, n), order="F").T.ravel(order="F")

    # apply this permutation to the rows (i.e. to each column) of identity matrix and return result
    return np.eye(m * n)[w, :]
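A minimal usage check of comm_mat, reusing the numpy import above (the test matrix A is arbitrary):

A = np.arange(6).reshape(2, 3)   # arbitrary 2 x 3 test matrix
K = comm_mat(2, 3)

# K(2,3) vec(A) should equal vec(A^T); ravel(order="F") is the column-major vec(.)
assert np.array_equal(K @ A.ravel(order="F"), A.T.ravel(order="F"))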

Alternatively, a version without imports:

# Kronecker delta
def delta(i, j):
    return int(i == j)

def comm_mat(m, n):
    # determine permutation applied by K
    v = [m * j + i for i in range(m) for j in range(n)]

    # apply this permutation to the rows (i.e. to each column) of identity matrix
    I = [[delta(i, j) for j in range(m * n)] for i in range(m * n)]
    return [I[i] for i in v]

MATLAB

function P = com_mat(m, n)

% determine permutation applied by K
A = reshape(1:m*n, m, n);
v = reshape(A', 1, []);

% apply this permutation to the rows (i.e. to each column) of identity matrix
P = eye(m*n);
P = P(v,:);

R

# Sparse matrix version
comm_mat = function(m, n){
  i = 1:(m * n)
  j = NULL
  for (k in 1:m) {
    j = c(j, m * 0:(n-1) + k)
  }
  Matrix::sparseMatrix(
    i = i, j = j, x = 1
  )
}
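Because the commutation matrix is a permutation matrix, it has exactly one nonzero entry in each row and column; the sparse representation above therefore stores only mn entries instead of a dense (mn) × (mn) array, which is the main reason to prefer it for large m and n.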

Example


Let A denote the following 3 × 2 matrix:

\[ \mathbf{A} = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}. \]

A has the following column-major and row-major vectorizations (respectively):

\[ \mathbf{v}_{\text{col}} = \operatorname{vec}(\mathbf{A}) = \begin{bmatrix} 1 \\ 2 \\ 3 \\ 4 \\ 5 \\ 6 \end{bmatrix}, \qquad \mathbf{v}_{\text{row}} = \operatorname{vec}\left(\mathbf{A}^{\mathrm{T}}\right) = \begin{bmatrix} 1 \\ 4 \\ 2 \\ 5 \\ 3 \\ 6 \end{bmatrix}. \]

The associated commutation matrix is

\[ \mathbf{K} = \mathbf{K}^{(3,2)} = \begin{bmatrix} 1 & \cdot & \cdot & \cdot & \cdot & \cdot \\ \cdot & \cdot & \cdot & 1 & \cdot & \cdot \\ \cdot & 1 & \cdot & \cdot & \cdot & \cdot \\ \cdot & \cdot & \cdot & \cdot & 1 & \cdot \\ \cdot & \cdot & 1 & \cdot & \cdot & \cdot \\ \cdot & \cdot & \cdot & \cdot & \cdot & 1 \end{bmatrix} \]

(where each $\cdot$ denotes a zero). As expected, the following holds:

\[ \mathbf{K}^{\mathrm{T}} \mathbf{K} = \mathbf{K} \mathbf{K}^{\mathrm{T}} = \mathbf{I}_6, \]
\[ \mathbf{K} \, \mathbf{v}_{\text{col}} = \mathbf{v}_{\text{row}}. \]
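A quick numerical check of this example, reusing the comm_mat function from the Python code section above (an illustrative sketch; the variable names are not standard):

import numpy as np

A = np.array([[1, 4],
              [2, 5],
              [3, 6]])
K = comm_mat(3, 2)

v_col = A.ravel(order="F")    # [1, 2, 3, 4, 5, 6]
v_row = A.T.ravel(order="F")  # [1, 4, 2, 5, 3, 6]

assert np.array_equal(K @ v_col, v_row)    # K vec(A) = vec(A^T)
assert np.array_equal(K.T @ K, np.eye(6))  # K is orthogonal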

References

  1. Watrous, John (2018). The Theory of Quantum Information. Cambridge University Press. p. 94.
  2. von Rosen, Dietrich (1988). "Moments for the Inverted Wishart Distribution". Scand. J. Stat. 15: 97–109.
  • Jan R. Magnus and Heinz Neudecker (1988), Matrix Differential Calculus with Applications in Statistics and Econometrics, Wiley.