Cross-correlation matrix

The cross-correlation matrix of two random vectors is a matrix containing as elements the cross-correlations of all pairs of elements of the random vectors. The cross-correlation matrix is used in various digital signal processing algorithms.


Definition

For two random vectors $\mathbf{X} = (X_1, \ldots, X_m)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)^{\mathrm{T}}$, each containing random elements whose expected value and variance exist, the cross-correlation matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: p. 337

$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq \operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}]$$

and has dimensions $m \times n$. Written component-wise:

$$\operatorname{R}_{\mathbf{X}\mathbf{Y}} = \begin{bmatrix} \operatorname{E}[X_1 Y_1] & \operatorname{E}[X_1 Y_2] & \cdots & \operatorname{E}[X_1 Y_n] \\ \operatorname{E}[X_2 Y_1] & \operatorname{E}[X_2 Y_2] & \cdots & \operatorname{E}[X_2 Y_n] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[X_m Y_1] & \operatorname{E}[X_m Y_2] & \cdots & \operatorname{E}[X_m Y_n] \end{bmatrix}$$

The random vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
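
As an illustration not found in the source, the sketch below estimates $\operatorname{R}_{\mathbf{X}\mathbf{Y}}$ from joint samples by averaging outer products. It assumes NumPy and a column-per-sample convention; the function name is hypothetical.

```python
import numpy as np

def cross_correlation_matrix(X, Y):
    """Estimate R_XY = E[X Y^T] from paired samples.

    X : ndarray of shape (m, N), column k is one realization of the random vector X
    Y : ndarray of shape (n, N), column k is the matching realization of Y
    Returns an (m, n) array whose (i, j) entry estimates E[X_i Y_j].
    """
    N = X.shape[1]
    # Average of the outer products x_k y_k^T over the N joint samples.
    return X @ Y.T / N
```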

Example

For example, if $\mathbf{X} = (X_1, X_2, X_3)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, Y_2)^{\mathrm{T}}$ are random vectors, then $\operatorname{R}_{\mathbf{X}\mathbf{Y}}$ is a $3 \times 2$ matrix whose $(i, j)$-th entry is $\operatorname{E}[X_i Y_j]$.
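
A minimal usage sketch for this $3 \times 2$ case, reusing the hypothetical estimator above on synthetic data:

```python
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 10_000))  # three-dimensional X, 10,000 samples
Y = rng.standard_normal((2, 10_000))  # two-dimensional Y, paired samples
R = cross_correlation_matrix(X, Y)
print(R.shape)  # (3, 2); entry (i, j) approximates E[X_i Y_j]
```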

Complex random vectors

If $\mathbf{Z} = (Z_1, \ldots, Z_m)^{\mathrm{T}}$ and $\mathbf{W} = (W_1, \ldots, W_n)^{\mathrm{T}}$ are complex random vectors, each containing random variables whose expected value and variance exist, the cross-correlation matrix of $\mathbf{Z}$ and $\mathbf{W}$ is defined by

$$\operatorname{R}_{\mathbf{Z}\mathbf{W}} \triangleq \operatorname{E}[\mathbf{Z}\mathbf{W}^{\mathrm{H}}]$$

where ${}^{\mathrm{H}}$ denotes Hermitian transposition.
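
For complex data, the only change to the earlier sample estimate is the Hermitian (conjugate) transpose. A minimal sketch under the same column-per-sample assumption; the function name is hypothetical.

```python
import numpy as np

def cross_correlation_matrix_complex(Z, W):
    """Estimate R_ZW = E[Z W^H] from paired complex samples (one column per sample)."""
    N = Z.shape[1]
    # W.conj().T is the Hermitian transpose, so each term is z_k w_k^H.
    return Z @ W.conj().T / N
```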

Uncorrelatedness

Two random vectors $\mathbf{X} = (X_1, \ldots, X_m)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)^{\mathrm{T}}$ are called uncorrelated if

$$\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}.$$

They are uncorrelated if and only if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is zero.

Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if

$$\operatorname{E}[\mathbf{Z}\mathbf{W}^{\mathrm{H}}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\mathrm{H}}$$

and

$$\operatorname{E}[\mathbf{Z}\mathbf{W}^{\mathrm{T}}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\mathrm{T}}.$$
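
A heuristic sample-based check of the real-valued condition (not part of the source): the difference $\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}$ is the cross-covariance matrix, so uncorrelatedness shows up as that matrix being numerically close to zero. The tolerance and function name are arbitrary choices; the complex case would additionally require checking the second condition with the ordinary transpose.

```python
import numpy as np

def appear_uncorrelated(X, Y, tol=1e-2):
    """Sample test of E[X Y^T] == E[X] E[Y]^T for real vectors (columns are samples)."""
    N = X.shape[1]
    # Sample cross-covariance: sample cross-correlation minus outer product of the means.
    K = X @ Y.T / N - np.outer(X.mean(axis=1), Y.mean(axis=1))
    return np.max(np.abs(K)) < tol  # (near-)zero cross-covariance <=> uncorrelated
```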

Properties

Relation to the cross-covariance matrix

The cross-correlation matrix is related to the cross-covariance matrix as follows:

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^{\mathrm{T}}] = \operatorname{R}_{\mathbf{X}\mathbf{Y}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}$$

Respectively, for complex random vectors:

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{H}}] = \operatorname{R}_{\mathbf{Z}\mathbf{W}} - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\mathrm{H}}$$
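
The identity can be verified numerically on synthetic data: centering first and then averaging gives the same matrix, up to floating point, as subtracting the outer product of the means from the sample cross-correlation. A self-contained sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 5_000))
Y = rng.standard_normal((2, 5_000))
N = X.shape[1]

R_XY = X @ Y.T / N                                     # sample estimate of E[X Y^T]
mean_outer = np.outer(X.mean(axis=1), Y.mean(axis=1))  # E[X] E[Y]^T
K_from_R = R_XY - mean_outer                           # right-hand side of the identity

Xc = X - X.mean(axis=1, keepdims=True)                 # centered X
Yc = Y - Y.mean(axis=1, keepdims=True)                 # centered Y
K_direct = Xc @ Yc.T / N                               # E[(X - E[X])(Y - E[Y])^T]

print(np.allclose(K_from_R, K_direct))                 # True: the two forms agree
```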

References

  1. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.