In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.
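The use of cross-correlation as a sliding dot product for locating a known feature can be illustrated with a short sketch. The following is a minimal example, assuming NumPy; the signal length, template shape, noise level, and offset are illustrative values, not taken from the article.

```python
import numpy as np

# A minimal sketch of using cross-correlation as a sliding dot product to
# locate a short, known feature inside a longer signal. All parameters here
# (template, noise level, offset) are illustrative assumptions.
rng = np.random.default_rng(0)

template = np.sin(np.linspace(0, 2 * np.pi, 50))              # the known feature
signal = rng.normal(scale=0.3, size=500)                      # background noise
true_offset = 200
signal[true_offset:true_offset + len(template)] += template   # embed the feature

# np.correlate with mode='valid' computes the dot product of the template with
# every length-50 window of the signal, i.e. the sliding inner product.
scores = np.correlate(signal, template, mode="valid")
estimated_offset = int(np.argmax(scores))

print(estimated_offset)  # expected to be close to 200
```

In practice each window's score is often divided by the window's norm (normalized cross-correlation) so that windows with large energy do not dominate the comparison.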
In probability and statistics, the term cross-correlations refers to the correlations between the entries of two random vectors $\mathbf{X}$ and $\mathbf{Y}$, while the correlations of a random vector $\mathbf{X}$ are the correlations between the entries of $\mathbf{X}$ itself, which form the correlation matrix of $\mathbf{X}$. If each of $\mathbf{X}$ and $\mathbf{Y}$ is a scalar random variable which is realized repeatedly in a time series, then the correlations of the various temporal instances of $\mathbf{X}$ are known as autocorrelations of $\mathbf{X}$, and the cross-correlations of $\mathbf{X}$ with $\mathbf{Y}$ across time are temporal cross-correlations. In probability and statistics, the definition of correlation always includes a standardising factor in such a way that correlations have values between −1 and +1.
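The standardising factor can be made concrete with a small sketch, again assuming NumPy. The helper below (a hypothetical function written for this example, not part of any library) computes the Pearson correlation between two scalar time series at a given lag, so every value lies between −1 and +1.

```python
import numpy as np

# A minimal sketch of the statistical convention described above: the temporal
# cross-correlation of scalar series x and y at lag k is standardised (Pearson),
# so each value lies in [-1, +1]. The data below are illustrative.
def temporal_cross_correlation(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag] over the overlapping samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if lag >= 0:
        xs, ys = x[: len(x) - lag], y[lag:]
    else:
        xs, ys = x[-lag:], y[: len(y) + lag]
    xs = xs - xs.mean()
    ys = ys - ys.mean()
    # The denominator is the standardising factor; it bounds the result to [-1, 1].
    return float(np.dot(xs, ys) / (np.linalg.norm(xs) * np.linalg.norm(ys)))

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
y = np.roll(x, 3) + 0.5 * rng.normal(size=1000)   # y lags x by 3 samples, plus noise

print(temporal_cross_correlation(x, y, lag=3))    # strongly positive
print(temporal_cross_correlation(x, y, lag=0))    # near zero
```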
If $X$ and $Y$ are two independent random variables with probability density functions $f$ and $g$, respectively, then the probability density of the difference $Y-X$ is formally given by the cross-correlation (in the signal-processing sense) $f\star g$; however, this terminology is not used in probability and statistics. In contrast, the convolution $f*g$ (equivalent to the cross-correlation of $f(t)$ and $g(-t)$) gives the probability density function of the sum $X+Y$.
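This relationship is easiest to check with discrete distributions, where the supports line up exactly. The following is a minimal sketch, assuming NumPy; the two probability mass functions are illustrative assumptions, not from the article.

```python
import numpy as np

# A minimal sketch of the relationship above: convolution gives the
# distribution of the sum X + Y, cross-correlation gives the distribution of
# the difference Y - X. The pmfs pX and pY are illustrative assumptions.
pX = np.array([0.1, 0.2, 0.3, 0.4])        # pmf of X on {0, 1, 2, 3}
pY = np.array([0.3, 0.1, 0.2, 0.2, 0.2])   # pmf of Y on {0, 1, 2, 3, 4}

# Convolution: pmf of the sum X + Y, supported on {0, ..., 7}.
p_sum = np.convolve(pX, pY)

# Cross-correlation (convolution with the reversed pX): pmf of the
# difference Y - X, supported on {-3, ..., 4}.
p_diff = np.convolve(pY, pX[::-1])

# Monte Carlo check of both identities.
rng = np.random.default_rng(2)
x = rng.choice(len(pX), size=200_000, p=pX)
y = rng.choice(len(pY), size=200_000, p=pY)

emp_sum = np.bincount(x + y, minlength=8) / len(x)
emp_diff = np.bincount(y - x + 3, minlength=8) / len(x)   # shift support to start at 0

print(np.max(np.abs(p_sum - emp_sum)))    # small (sampling error only)
print(np.max(np.abs(p_diff - emp_diff)))  # small (sampling error only)
```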