In probability theory and statistics, a complex random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$ is typically a tuple of complex-valued random variables, and generally is a random variable taking values in a vector space over the field of complex numbers. If $Z_1, \ldots, Z_n$ are complex-valued random variables, then the n-tuple $(Z_1, \ldots, Z_n)$ is a complex random vector. Complex random vectors can always be considered as pairs of real random vectors: their real and imaginary parts. Some concepts of real random vectors have a straightforward generalization to complex random vectors; an example is the definition of the mean of a complex random vector. Other concepts are unique to complex random vectors. Applications of complex random vectors are found in digital signal processing.
Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $\operatorname{P}(Z \leq 1+3i)$ make no sense. However, expressions of the form $\operatorname{P}(\Re(Z) \leq 1, \Im(Z) \leq 3)$ make sense. Therefore, the cumulative distribution function $F_{\mathbf{Z}} : \mathbb{C}^n \to [0,1]$ of a random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$ is defined as

$F_{\mathbf{Z}}(\mathbf{z}) = \operatorname{P}(\Re(Z_1) \leq \Re(z_1), \Im(Z_1) \leq \Im(z_1), \ldots, \Re(Z_n) \leq \Re(z_n), \Im(Z_n) \leq \Im(z_n))$

where $\mathbf{z} = (z_1, \ldots, z_n)^T$.
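As a concrete illustration of this definition, the following minimal sketch estimates the cumulative distribution function of a scalar complex random variable from samples. It assumes NumPy; the chosen distribution and the helper name empirical_cdf are illustrative assumptions, not part of the definition above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar complex random variable Z = X + iY with independent
# standard normal real and imaginary parts (an assumption made for this sketch).
samples = rng.standard_normal(100_000) + 1j * rng.standard_normal(100_000)

def empirical_cdf(samples, z):
    """Estimate F_Z(z) = P(Re(Z) <= Re(z), Im(Z) <= Im(z)) from samples."""
    return np.mean((samples.real <= z.real) & (samples.imag <= z.imag))

# Evaluate the estimated CDF at z = 1 + 3i, the point used in the text above.
print(empirical_cdf(samples, 1 + 3j))
```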
Expectation
As in the real case, the expectation of a complex random vector is taken component-wise:

$\operatorname{E}[\mathbf{Z}] = (\operatorname{E}[Z_1], \ldots, \operatorname{E}[Z_n])^T$
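A minimal sketch of this component-wise expectation, assuming NumPy; the particular distribution and mean vector are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# 100000 samples of a 2-component complex random vector with mean (1+2i, -3i);
# the distribution itself is an arbitrary illustration.
mean = np.array([1 + 2j, -3j])
Z = rng.standard_normal((100_000, 2)) + 1j * rng.standard_normal((100_000, 2)) + mean

# The expectation is taken component-wise, so a column-wise sample mean estimates it.
print(Z.mean(axis=0))   # approximately [1.+2.j, 0.-3.j]
```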
Covariance matrix and pseudo-covariance matrix

The covariance matrix contains the covariances between all pairs of components. The covariance matrix of an $n \times 1$ random vector is an $n \times n$ matrix whose $(i,j)$th element is the covariance between the $i$th and the $j$th random variables. Unlike in the case of real random variables, the covariance between two random variables involves the complex conjugate of one of the two. Thus the covariance matrix is a Hermitian matrix:

$K_{\mathbf{Z}\mathbf{Z}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])^H]$

The pseudo-covariance matrix is defined as follows. In contrast to the covariance matrix defined above, Hermitian transposition gets replaced by transposition in the definition:

$J_{\mathbf{Z}\mathbf{Z}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])^T]$
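The following sketch estimates both matrices from samples and checks the structural properties just stated. It assumes NumPy; the sample distribution is an arbitrary illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative samples of a 3-component complex random vector (rows = realizations).
N = 100_000
Z = rng.standard_normal((N, 3)) + 1j * rng.standard_normal((N, 3))

Zc = Z - Z.mean(axis=0)              # centered samples
K = (Zc.T @ Zc.conj()) / N           # covariance matrix:        E[(Z-EZ)(Z-EZ)^H]
J = (Zc.T @ Zc) / N                  # pseudo-covariance matrix: E[(Z-EZ)(Z-EZ)^T]

print(np.allclose(K, K.conj().T))    # True: the covariance matrix is Hermitian
print(np.allclose(J, J.T))           # True: the pseudo-covariance matrix is symmetric
```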
By decomposing the random vector $\mathbf{Z}$ into its real part $\mathbf{X} = \Re(\mathbf{Z})$ and imaginary part $\mathbf{Y} = \Im(\mathbf{Z})$, i.e. $\mathbf{Z} = \mathbf{X} + i\mathbf{Y}$, the matrices $K_{\mathbf{Z}\mathbf{Z}}$ and $J_{\mathbf{Z}\mathbf{Z}}$ can be related to the covariance matrices of $\mathbf{X}$ and $\mathbf{Y}$ via the following expressions:

$K_{\mathbf{Z}\mathbf{Z}} = K_{\mathbf{X}\mathbf{X}} + K_{\mathbf{Y}\mathbf{Y}} + i(K_{\mathbf{Y}\mathbf{X}} - K_{\mathbf{X}\mathbf{Y}})$
$J_{\mathbf{Z}\mathbf{Z}} = K_{\mathbf{X}\mathbf{X}} - K_{\mathbf{Y}\mathbf{Y}} + i(K_{\mathbf{Y}\mathbf{X}} + K_{\mathbf{X}\mathbf{Y}})$

and conversely

$K_{\mathbf{X}\mathbf{X}} = \tfrac{1}{2}\Re(K_{\mathbf{Z}\mathbf{Z}} + J_{\mathbf{Z}\mathbf{Z}})$
$K_{\mathbf{Y}\mathbf{Y}} = \tfrac{1}{2}\Re(K_{\mathbf{Z}\mathbf{Z}} - J_{\mathbf{Z}\mathbf{Z}})$
$K_{\mathbf{X}\mathbf{Y}} = \tfrac{1}{2}\Im(J_{\mathbf{Z}\mathbf{Z}} - K_{\mathbf{Z}\mathbf{Z}})$
$K_{\mathbf{Y}\mathbf{X}} = \tfrac{1}{2}\Im(J_{\mathbf{Z}\mathbf{Z}} + K_{\mathbf{Z}\mathbf{Z}})$

Here $K_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^T]$ denotes the cross-covariance matrix of the real random vectors $\mathbf{X}$ and $\mathbf{Y}$.
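These relations can be checked numerically. The sketch below, assuming NumPy, builds correlated real and imaginary parts through an arbitrary mixing matrix (an illustrative construction), estimates all matrices from the same samples, and verifies the identities above.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 200_000, 2

# Build correlated real and imaginary parts X, Y through an arbitrary mixing
# matrix A (purely an illustrative construction).
A = rng.standard_normal((2 * n, 2 * n))
XY = rng.standard_normal((N, 2 * n)) @ A.T
X, Y = XY[:, :n], XY[:, n:]
Z = X + 1j * Y

def cov(U, V):
    """Sample cross-covariance matrix K_UV of real sample matrices (rows = samples)."""
    return (U - U.mean(axis=0)).T @ (V - V.mean(axis=0)) / len(U)

Zc = Z - Z.mean(axis=0)
K = (Zc.T @ Zc.conj()) / N
J = (Zc.T @ Zc) / N

# The two identities relating K_ZZ and J_ZZ to the real covariance matrices:
print(np.allclose(K, cov(X, X) + cov(Y, Y) + 1j * (cov(Y, X) - cov(X, Y))))
print(np.allclose(J, cov(X, X) - cov(Y, Y) + 1j * (cov(Y, X) + cov(X, Y))))
# ... and one of the converse relations:
print(np.allclose(cov(X, X), 0.5 * (K + J).real))
```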
Cross-covariance matrix and pseudo-cross-covariance matrix

The cross-covariance matrix between two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ is defined as:

$K_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^H]$

And the pseudo-cross-covariance matrix is defined as:

$J_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^T]$
Uncorrelatedness
Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if

$K_{\mathbf{Z}\mathbf{W}} = J_{\mathbf{Z}\mathbf{W}} = 0$.
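The sketch below, assuming NumPy, estimates both the cross-covariance and the pseudo-cross-covariance matrix of two independently generated vectors; the particular distributions are illustrative assumptions. Since independent vectors are uncorrelated, both estimates should be close to zero.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Two independent (hence uncorrelated) complex random vectors; the particular
# distributions are illustrative assumptions.
Z = rng.standard_normal((N, 2)) + 1j * rng.standard_normal((N, 2))
W = rng.standard_normal((N, 3)) + 1j * rng.standard_normal((N, 3))

Zc, Wc = Z - Z.mean(axis=0), W - W.mean(axis=0)
K_ZW = (Zc.T @ Wc.conj()) / N   # cross-covariance:        E[(Z-EZ)(W-EW)^H]
J_ZW = (Zc.T @ Wc) / N          # pseudo-cross-covariance: E[(Z-EZ)(W-EW)^T]

# Uncorrelatedness requires both matrices to vanish; the sample estimates are
# only close to zero (on the order of 1/sqrt(N)).
print(np.abs(K_ZW).max(), np.abs(J_ZW).max())
```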
Independence
Two complex random vectors $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$ and $\mathbf{W} = (W_1, \ldots, W_m)^T$ are called independent if

$F_{\mathbf{Z},\mathbf{W}}(\mathbf{z}, \mathbf{w}) = F_{\mathbf{Z}}(\mathbf{z}) \cdot F_{\mathbf{W}}(\mathbf{w})$ for all $\mathbf{z}, \mathbf{w}$,

where $F_{\mathbf{Z}}$ and $F_{\mathbf{W}}$ denote the cumulative distribution functions of $\mathbf{Z}$ and $\mathbf{W}$ as defined above and $F_{\mathbf{Z},\mathbf{W}}$ denotes their joint cumulative distribution function. Independence of $\mathbf{Z}$ and $\mathbf{W}$ is often denoted by $\mathbf{Z} \perp\!\!\!\perp \mathbf{W}$. Written component-wise, $\mathbf{Z}$ and $\mathbf{W}$ are called independent if

$F_{Z_1, \ldots, Z_n, W_1, \ldots, W_m}(z_1, \ldots, z_n, w_1, \ldots, w_m) = F_{Z_1, \ldots, Z_n}(z_1, \ldots, z_n) \cdot F_{W_1, \ldots, W_m}(w_1, \ldots, w_m)$ for all $z_1, \ldots, z_n, w_1, \ldots, w_m$.
Circular symmetry
Definition
A complex random vector $\mathbf{Z}$ is called circularly symmetric if for every deterministic $\varphi \in [-\pi, \pi)$ the distribution of $e^{i\varphi}\mathbf{Z}$ equals the distribution of $\mathbf{Z}$.
Properties
The expectation of a circularly symmetric complex random vector is either zero or it is not defined.
The pseudo-covariance matrix of a circularly symmetric complex random vector is zero.
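Both properties can be observed numerically. The sketch below, assuming NumPy, uses a circularly symmetric complex Gaussian vector (independent real and imaginary parts of equal variance, a standard special case chosen here as an illustration) and also checks that a fixed phase rotation leaves the second-order statistics unchanged.

```python
import numpy as np

rng = np.random.default_rng(3)
N, n = 200_000, 2

# A circularly symmetric complex Gaussian vector: independent real and imaginary
# parts of equal variance (an illustrative special case).
Z = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) / np.sqrt(2)

Zc = Z - Z.mean(axis=0)
J = (Zc.T @ Zc) / N                 # pseudo-covariance matrix

print(np.abs(Z.mean(axis=0)).max()) # near 0: the mean is zero
print(np.abs(J).max())              # near 0: the pseudo-covariance matrix vanishes

# Rotating by a fixed phase leaves the second-order statistics unchanged.
phi = 0.7
Zr = np.exp(1j * phi) * Z
print(np.allclose((Zr.T @ Zr.conj()) / N, (Z.T @ Z.conj()) / N))
```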
Proper complex random vectors
Definition
A complex random vector $\mathbf{Z}$ is called proper if the following three conditions are all satisfied:
- $\operatorname{E}[\mathbf{Z}] = 0$ (zero mean)
- $\operatorname{var}[Z_i] < \infty$ for $i = 1, \ldots, n$ (all components have finite variance)
- $\operatorname{E}[\mathbf{Z}\mathbf{Z}^T] = 0$ (vanishing pseudo-covariance matrix)
Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called jointly proper if the composite random vector $(\mathbf{Z}^T, \mathbf{W}^T)^T$ is proper.
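A minimal sample-based check of the three conditions above, assuming NumPy; the function name is_proper, the tolerance, and the test distributions are illustrative choices, not standard terminology or API.

```python
import numpy as np

def is_proper(samples, tol=1e-2):
    """Heuristic sample-based check of the three conditions above.

    samples: (N, n) array whose rows are realizations of Z. The function name
    and the tolerance are illustrative choices, not standard API.
    """
    mean = samples.mean(axis=0)
    variances = samples.var(axis=0)                 # component-wise variances
    J = samples.T @ samples / len(samples)          # estimate of E[Z Z^T]
    return bool(np.abs(mean).max() < tol
                and np.isfinite(variances).all()
                and np.abs(J).max() < tol)

rng = np.random.default_rng(4)
N = 200_000
proper = (rng.standard_normal((N, 2)) + 1j * rng.standard_normal((N, 2))) / np.sqrt(2)
improper = rng.standard_normal((N, 2)) + 0j        # real-valued: E[Z Z^T] is not zero

print(is_proper(proper), is_proper(improper))      # True False
```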
Properties

Linear transformations of proper complex random vectors are proper, i.e. if $\mathbf{Z}$ is a proper random vector with $n$ components and $A$ is a deterministic $m \times n$ matrix, then the complex random vector $A\mathbf{Z}$ is also proper (see the sketch at the end of this section).
Every circularly symmetric complex random vector with finite variance of all its components is proper.
There are proper complex random vectors that are not circularly symmetric.
A real random vector is proper if and only if it is constant.
Two jointly proper complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are uncorrelated if and only if their cross-covariance matrix is zero, i.e. if $K_{\mathbf{Z}\mathbf{W}} = 0$.
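The sketch below, assuming NumPy, illustrates the linear-transformation property referenced above: a proper input (a circularly symmetric Gaussian, an illustrative choice) multiplied by an arbitrary deterministic matrix again has zero mean and vanishing pseudo-covariance.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200_000

# A proper input vector Z (circularly symmetric Gaussian, as an illustration)
# and an arbitrary deterministic complex 3x2 matrix A.
Z = (rng.standard_normal((N, 2)) + 1j * rng.standard_normal((N, 2))) / np.sqrt(2)
A = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))

W = Z @ A.T                       # rows are realizations of A Z

# E[(AZ)(AZ)^T] = A E[Z Z^T] A^T, so the pseudo-covariance of A Z also vanishes.
J_W = W.T @ W / N
print(np.abs(W.mean(axis=0)).max(), np.abs(J_W).max())   # both near 0: A Z is proper
```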