Khatri–Rao product


In mathematics, the Khatri–Rao product or block Kronecker product of two partitioned matrices $\mathbf{A}$ and $\mathbf{B}$ is defined as[1][2][3]

$$\mathbf{A} \ast \mathbf{B} = \left(\mathbf{A}_{ij} \otimes \mathbf{B}_{ij}\right)_{ij},$$

in which the ij-th block is the $m_i p_i \times n_j q_j$ sized Kronecker product of the corresponding blocks of $\mathbf{A}$ and $\mathbf{B}$, assuming the number of row and column partitions of both matrices is equal. The size of the product is then $\left(\textstyle\sum_i m_i p_i\right) \times \left(\textstyle\sum_j n_j q_j\right)$.

For example, if A and B are both 2 × 2 partitioned matrices, e.g.:

$$\mathbf{A} = \left[\begin{array}{c|c} \mathbf{A}_{11} & \mathbf{A}_{12} \\ \hline \mathbf{A}_{21} & \mathbf{A}_{22} \end{array}\right], \qquad \mathbf{B} = \left[\begin{array}{c|c} \mathbf{B}_{11} & \mathbf{B}_{12} \\ \hline \mathbf{B}_{21} & \mathbf{B}_{22} \end{array}\right],$$

we obtain:

$$\mathbf{A} \ast \mathbf{B} = \left[\begin{array}{c|c} \mathbf{A}_{11} \otimes \mathbf{B}_{11} & \mathbf{A}_{12} \otimes \mathbf{B}_{12} \\ \hline \mathbf{A}_{21} \otimes \mathbf{B}_{21} & \mathbf{A}_{22} \otimes \mathbf{B}_{22} \end{array}\right].$$
This is a submatrix of the Tracy–Singh product [4] of the two matrices (each partition in this example is a partition in a corner of the Tracy–Singh product).
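As an illustration of this block-wise definition, the following NumPy sketch computes the product of two conformally partitioned matrices; the 2 × 2 partitionings and the matrix entries are arbitrary illustrative choices, not taken from the cited sources.

```python
import numpy as np

def khatri_rao_block(A_blocks, B_blocks):
    """Khatri-Rao (block Kronecker) product of two conformally partitioned matrices.

    A_blocks and B_blocks are nested lists whose (i, j)-th entries are the
    (i, j)-th blocks of A and B; the (i, j)-th block of the result is
    kron(A_blocks[i][j], B_blocks[i][j]).
    """
    return np.block([[np.kron(Aij, Bij) for Aij, Bij in zip(A_row, B_row)]
                     for A_row, B_row in zip(A_blocks, B_blocks)])

# Arbitrary 2 x 2 partitionings of a 3 x 3 matrix A and a 3 x 3 matrix B.
A_blocks = [[np.array([[1, 2], [4, 5]]), np.array([[3], [6]])],
            [np.array([[7, 8]]),         np.array([[9]])]]
B_blocks = [[np.array([[1]]),            np.array([[4, 7]])],
            [np.array([[2], [3]]),       np.array([[5, 8], [6, 9]])]]

print(khatri_rao_block(A_blocks, B_blocks))   # a 4 x 4 matrix
```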

Column-wise Kronecker product


The column-wise Kronecker product of two matrices is a special case of the Khatri–Rao product as defined above, and may also be called the Khatri–Rao product. This product assumes that the partitions of the matrices are their columns. In this case $m_1 = m$, $p_1 = p$, $n = q$ and, for each $j$, $n_j = q_j = 1$. The resulting product is an $mp \times n$ matrix of which each column is the Kronecker product of the corresponding columns of $\mathbf{A}$ and $\mathbf{B}$. With $\mathbf{A}$ and $\mathbf{B}$ partitioned into their columns,

$$\mathbf{A} = [\mathbf{a}_1 \mid \mathbf{a}_2 \mid \cdots \mid \mathbf{a}_n], \qquad \mathbf{B} = [\mathbf{b}_1 \mid \mathbf{b}_2 \mid \cdots \mid \mathbf{b}_n],$$

so that:

$$\mathbf{A} \ast \mathbf{B} = [\mathbf{a}_1 \otimes \mathbf{b}_1 \mid \mathbf{a}_2 \otimes \mathbf{b}_2 \mid \cdots \mid \mathbf{a}_n \otimes \mathbf{b}_n].$$

This column-wise version of the Khatri–Rao product is useful in linear algebra approaches to data analytical processing[5] and in optimizing the solution of inverse problems dealing with a diagonal matrix.[6][7]

In 1996 the column-wise Khatri–Rao product was proposed to estimate the angles of arrival (AOAs) and delays of multipath signals[8] and four coordinates of signal sources[9] at a digital antenna array.
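The following is a minimal NumPy sketch of the column-wise product; the matrices are arbitrary illustrations, and recent SciPy releases also provide the same operation as scipy.linalg.khatri_rao.

```python
import numpy as np

def khatri_rao_columnwise(A, B):
    """Column-wise Khatri-Rao product: column j of the result is kron(A[:, j], B[:, j])."""
    (m, n), (p, q) = A.shape, B.shape
    assert n == q, "A and B must have the same number of columns"
    # Build every column-wise outer product, then flatten each to a single column.
    return np.einsum('in,jn->ijn', A, B).reshape(m * p, n)

A = np.arange(1, 10).reshape(3, 3)       # arbitrary 3 x 3 example matrices
B = np.arange(1, 10).reshape(3, 3).T

C = khatri_rao_columnwise(A, B)
print(C.shape)                            # (9, 3): an mp x n matrix
assert np.array_equal(C[:, 0], np.kron(A[:, 0], B[:, 0]))
```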

Face-splitting product

 
Figure: Face-splitting product of matrices

An alternative concept of the matrix product, which uses row-wise splitting of matrices with a given quantity of rows, was proposed by V. Slyusar[10] in 1996.[9][11][12][13][14]

This matrix operation was named the "face-splitting product" of matrices[11][13] or the "transposed Khatri–Rao product". This type of operation is based on row-by-row Kronecker products of two matrices. With $\mathbf{A}$ and $\mathbf{B}$ partitioned into their rows,

$$\mathbf{A} = \begin{bmatrix} \mathbf{a}_1 \\ \mathbf{a}_2 \\ \vdots \\ \mathbf{a}_m \end{bmatrix}, \qquad \mathbf{B} = \begin{bmatrix} \mathbf{b}_1 \\ \mathbf{b}_2 \\ \vdots \\ \mathbf{b}_m \end{bmatrix},$$

the result can be obtained as:[9][11][13]

$$\mathbf{A} \bullet \mathbf{B} = \begin{bmatrix} \mathbf{a}_1 \otimes \mathbf{b}_1 \\ \mathbf{a}_2 \otimes \mathbf{b}_2 \\ \vdots \\ \mathbf{a}_m \otimes \mathbf{b}_m \end{bmatrix},$$

where $\mathbf{a}_i$ and $\mathbf{b}_i$ denote the $i$-th rows of $\mathbf{A}$ and $\mathbf{B}$, respectively.

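A minimal NumPy sketch of the row-wise (face-splitting) product, again with arbitrary illustrative matrices:

```python
import numpy as np

def face_splitting(A, B):
    """Face-splitting (transposed Khatri-Rao) product: row i of the result is kron(A[i, :], B[i, :])."""
    (m, n), (p, q) = A.shape, B.shape
    assert m == p, "A and B must have the same number of rows"
    return np.einsum('ij,ik->ijk', A, B).reshape(m, n * q)

A = np.arange(1, 10).reshape(3, 3)       # arbitrary 3 x 3 example matrices
B = np.arange(1, 10).reshape(3, 3).T

C = face_splitting(A, B)
print(C.shape)                            # (3, 9)
assert np.array_equal(C[1], np.kron(A[1], B[1]))
```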
Main properties

  1. Transpose (V. Slyusar, 1996[9][11][12]):
     $\left(\mathbf{A} \bullet \mathbf{B}\right)^{\textsf{T}} = \mathbf{A}^{\textsf{T}} \ast \mathbf{B}^{\textsf{T}}$,
  2. Bilinearity and associativity:[9][11][12]
     

    where A, B and C are matrices, and k is a scalar,

     ,[12]
    where   is a vector,
  3. The mixed-product property (V. Slyusar, 1997[12]):
     $\left(\mathbf{A} \bullet \mathbf{B}\right)\left(\mathbf{C} \ast \mathbf{D}\right) = \left(\mathbf{A}\mathbf{C}\right) \circ \left(\mathbf{B}\mathbf{D}\right)$,
     ,[13]
     [15]
     ,[16]
     where $\circ$ denotes the Hadamard product (the first identity above and the transpose rule of property 1 are checked numerically in the sketch after this list),
  4.  ,[12]
  5.  ,[9] where   is a row vector,
  6.  ,[16]
  7.  , where   is a permutation matrix.[7]
  8.  
     ,[13][15]
    Similarly:
     ,
  9.  
     ,[12]
     ,
    where   and   are vectors,
  10.  ,[17]  ,
  11.  
     ,[18]
    where   and   are vectors (this is a combination of properties 3 and 8). Similarly:
     
  12.  
     ,
    where   denotes vector convolution;   are "count sketch" matrices; and   is the Fourier transform matrix (this result follows from the properties of the count sketch;[19] see the sketch after this list).

    This can be generalized for appropriate matrices  :
     
    because property 11 above gives us
     
    And the convolution theorem gives us
     
  13.  
     ,[20]
    where   is   matrix,   is   matrix,   is a vector of 1's of length  , and   is a vector of 1's of length   or
     ,[21]
    where   is   matrix,   means element-by-element multiplication and   is a vector of 1's of length  .
     ,
    where   denotes the penetrating face product of matrices.[13] Similarly:
     , where   is   matrix,   is   matrix.
  14.  
     ,[12]
     [13]=  =  ,
     ,[21]
    where   is the vector consisting of the diagonal elements of  , and   denotes stacking the columns of a matrix   on top of each other to give a vector.
  15.  
     .[13][15]
    Similarly:
     ,
    where   and   are vectors.
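The transpose rule (property 1) and the Hadamard-product form of the mixed-product rule (property 3) are easy to check numerically. The sketch below does so with random matrices of compatible sizes; it is an illustration of the identities, not a proof.

```python
import numpy as np

rng = np.random.default_rng(0)

def face_splitting(A, B):      # row i of the result is kron(A[i, :], B[i, :])
    return np.einsum('ij,ik->ijk', A, B).reshape(A.shape[0], -1)

def khatri_rao(A, B):          # column j of the result is kron(A[:, j], B[:, j])
    return np.einsum('in,jn->ijn', A, B).reshape(-1, A.shape[1])

A = rng.standard_normal((4, 3))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((3, 6))
D = rng.standard_normal((5, 6))

# Property 1: the transpose of a face-splitting product is the column-wise
# Khatri-Rao product of the transposes.
assert np.allclose(face_splitting(A, B).T, khatri_rao(A.T, B.T))

# Property 3 (mixed product): (A row-wise-Kronecker B)(C column-wise-Kronecker D)
# equals the Hadamard (element-wise) product of AC and BD.
assert np.allclose(face_splitting(A, B) @ khatri_rao(C, D), (A @ C) * (B @ D))
```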

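The count-sketch identity behind property 12 can also be illustrated numerically. In the sketch below the hash functions, signs and dimensions are arbitrary illustrative choices; the combined sketch for the Kronecker product x ⊗ y hashes the pair (i, j) to the bucket (h1(i) + h2(j)) mod m with the sign s1(i)·s2(j), as in the fast tensor-sketch construction of Pham and Pagh.[19]

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 8                               # input dimension and sketch length (illustrative)

def count_sketch_matrix(h, s, m):
    """Count sketch as a matrix: column j has the single nonzero entry s[j] in row h[j]."""
    C = np.zeros((m, len(h)))
    C[h, np.arange(len(h))] = s
    return C

h1, h2 = rng.integers(0, m, n), rng.integers(0, m, n)
s1, s2 = rng.choice([-1.0, 1.0], n), rng.choice([-1.0, 1.0], n)
C1, C2 = count_sketch_matrix(h1, s1, m), count_sketch_matrix(h2, s2, m)

# Combined count sketch for x (x) y: bucket (h1(i) + h2(j)) mod m, sign s1(i) * s2(j).
h12 = (h1[:, None] + h2[None, :]).reshape(-1) % m
s12 = (s1[:, None] * s2[None, :]).reshape(-1)
C12 = count_sketch_matrix(h12, s12, m)

x, y = rng.standard_normal(n), rng.standard_normal(n)

# The combined sketch of the Kronecker product equals the circular convolution of the
# two individual sketches; the convolution theorem lets the FFT compute it quickly.
conv = np.fft.ifft(np.fft.fft(C1 @ x) * np.fft.fft(C2 @ y)).real
assert np.allclose(C12 @ np.kron(x, y), conv)
```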
Examples


Source:[18]

 

Theorem


Source:[18]

If  , where   are independent components of a random matrix   with independent and identically distributed rows  , such that

  and  ,

then for any vector  

 

with probability   if the quantity of rows

 

In particular, if the entries of   are  , one can get

 

which matches the Johnson–Lindenstrauss lemma of   when   is small.
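A small numerical illustration of this statement, assuming Rademacher (±1) entries for the factors and the normalization 1/√m of the sketch (both choices are assumptions made only for this illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
d, m = 30, 2000                 # column dimension of each factor and number of rows (illustrative)

# Two independent random matrices with i.i.d. +/-1 rows; their face-splitting
# product acts on the Kronecker-structured space of dimension d * d.
C1 = rng.choice([-1.0, 1.0], size=(m, d))
C2 = rng.choice([-1.0, 1.0], size=(m, d))
M = np.einsum('ij,ik->ijk', C1, C2).reshape(m, d * d)   # face-splitting product of C1 and C2

x = rng.standard_normal(d * d)
x /= np.linalg.norm(x)

# With enough rows, ||M x|| / sqrt(m) concentrates around ||x|| = 1.
print(np.linalg.norm(M @ x) / np.sqrt(m))               # typically close to 1
```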

Block face-splitting product

 
Figure: Transposed block face-splitting product in the context of a multi-face radar model[15]

According to the definition of V. Slyusar[9][13] the block face-splitting product of two partitioned matrices with a given quantity of rows in blocks

 

can be written as:

 

The transposed block face-splitting product (or block column-wise version of the Khatri–Rao product) of two partitioned matrices with a given quantity of columns in blocks takes the form:[9][13]

 

Main properties

  1. Transpose:
     [15]

Applications


The face-splitting product and the block face-splitting product are used in the tensor-matrix theory of digital antenna arrays. These operations are also used in:

See also


Notes

  1. ^ Khatri C. G., C. R. Rao (1968). "Solutions to some functional equations and their applications to characterization of probability distributions". Sankhya. 30: 167–180. Archived from the original (PDF) on 2010-10-23. Retrieved 2008-08-21.
  2. ^ Liu, Shuangzhe (1999). "Matrix Results on the Khatri–Rao and Tracy–Singh Products". Linear Algebra and Its Applications. 289 (1–3): 267–277. doi:10.1016/S0024-3795(98)10209-4.
  3. ^ Zhang X; Yang Z; Cao C. (2002), "Inequalities involving Khatri–Rao products of positive semi-definite matrices", Applied Mathematics E-notes, 2: 117–124
  4. ^ Liu, Shuangzhe; Trenkler, Götz (2008). "Hadamard, Khatri-Rao, Kronecker and other matrix products". International Journal of Information and Systems Sciences. 4 (1): 160–177.
  5. ^ See e.g. H. D. Macedo and J.N. Oliveira. A linear algebra approach to OLAP. Formal Aspects of Computing, 27(2):283–307, 2015.
  6. ^ Lev-Ari, Hanoch (2005-01-01). "Efficient Solution of Linear Matrix Equations with Application to Multistatic Antenna Array Processing" (PDF). Communications in Information & Systems. 05 (1): 123–130. doi:10.4310/CIS.2005.v5.n1.a5. ISSN 1526-7555.
  7. ^ a b Masiero, B.; Nascimento, V. H. (2017-05-01). "Revisiting the Kronecker Array Transform". IEEE Signal Processing Letters. 24 (5): 525–529. Bibcode:2017ISPL...24..525M. doi:10.1109/LSP.2017.2674969. ISSN 1070-9908. S2CID 14166014.
  8. ^ Vanderveen, M. C., Ng, B. C., Papadias, C. B., & Paulraj, A. (n.d.). Joint angle and delay estimation (JADE) for signals in multipath environments. Conference Record of The Thirtieth Asilomar Conference on Signals, Systems and Computers. doi:10.1109/acssc.1996.599145
  9. ^ a b c d e f g h Slyusar, V. I. (December 27, 1996). "End matrix products in radar applications" (PDF). Izvestiya VUZ: Radioelektronika. 41 (3): 71–75.
  10. ^ Anna Esteve, Eva Boj & Josep Fortiana (2009): "Interaction Terms in Distance-Based Regression," Communications in Statistics – Theory and Methods, 38:19, p. 3501 [1]
  11. ^ a b c d e Slyusar, V. I. (1997-05-20). "Analytical model of the digital antenna array on a basis of face-splitting matrix products" (PDF). Proc. ICATT-97, Kyiv: 108–109.
  12. ^ a b c d e f g h Slyusar, V. I. (1997-09-15). "New operations of matrices product for applications of radars" (PDF). Proc. Direct and Inverse Problems of Electromagnetic and Acoustic Wave Theory (DIPED-97), Lviv.: 73–74.
  13. ^ a b c d e f g h i j Slyusar, V. I. (March 13, 1998). "A Family of Face Products of Matrices and its Properties" (PDF). Cybernetics and Systems Analysis C/C of Kibernetika I Sistemnyi Analiz. 1999. 35 (3): 379–384. doi:10.1007/BF02733426. S2CID 119661450.
  14. ^ Slyusar, V. I. (2003). "Generalized face-products of matrices in models of digital antenna arrays with nonidentical channels" (PDF). Radioelectronics and Communications Systems. 46 (10): 9–17.
  15. ^ a b c d e Vadym Slyusar. New Matrix Operations for DSP (Lecture). April 1999. doi:10.13140/RG.2.2.31620.76164/1
  16. ^ a b C. Radhakrishna Rao. Estimation of Heteroscedastic Variances in Linear Models.//Journal of the American Statistical Association, Vol. 65, No. 329 (Mar., 1970), pp. 161–172
  17. ^ Kasiviswanathan, Shiva Prasad, et al. "The price of privately releasing contingency tables and the spectra of random matrices with correlated rows." Proceedings of the forty-second ACM symposium on Theory of computing. 2010.
  18. ^ a b c d Thomas D. Ahle, Jakob Bæk Tejs Knudsen. Almost Optimal Tensor Sketch. Published 2019. Mathematics, Computer Science, ArXiv
  19. ^ Ninh, Pham; Pagh, Rasmus (2013). Fast and scalable polynomial kernels via explicit feature maps. SIGKDD international conference on Knowledge discovery and data mining. Association for Computing Machinery. doi:10.1145/2487575.2487591.
  20. ^ a b Eilers, Paul H.C.; Marx, Brian D. (2003). "Multivariate calibration with temperature interaction using two-dimensional penalized signal regression". Chemometrics and Intelligent Laboratory Systems. 66 (2): 159–174. doi:10.1016/S0169-7439(03)00029-7.
  21. ^ a b c Currie, I. D.; Durban, M.; Eilers, P. H. C. (2006). "Generalized linear array models with applications to multidimensional smoothing". Journal of the Royal Statistical Society. 68 (2): 259–280. doi:10.1111/j.1467-9868.2006.00543.x. S2CID 10261944.
  22. ^ Bryan Bischof. Higher order co-occurrence tensors for hypergraphs via face-splitting. Published 15 February 2020, Mathematics, Computer Science, ArXiv
  23. ^ Johannes W. R. Martini, Jose Crossa, Fernando H. Toledo, Jaime Cuevas. On Hadamard and Kronecker products in covariance structures for genotype x environment interaction.//Plant Genome. 2020;13:e20033. Page 5. [2]

References
