Joint characteristic function

The joint characteristic function (joint cf) of a random vector is a multivariate generalization of the characteristic function of a random variable.

Table of contents

  1. Definition

  2. Deriving cross-moments

  3. Characterizing joint distributions

  4. More details

    1. Joint cf of a linear transformation

    2. Joint cf of a random vector with independent entries

    3. Joint cf of a sum of mutually independent random vectors

  5. Solved exercises

    1. Exercise 1

    2. Exercise 2

    3. Exercise 3

  6. References

Here is a definition.

Definition Let X be a $K\times 1$ random vector. The joint characteristic function of X is a function $\varphi_X: \mathbb{R}^K \rightarrow \mathbb{C}$ defined by $$\varphi_X(t) = \mathrm{E}\left[ \exp\left( i t^\top X \right) \right]$$ where $i = \sqrt{-1}$ is the imaginary unit.

Observe that $\varphi_X(t)$ exists for any $t \in \mathbb{R}^K$ because $$\mathrm{E}\left[ \exp\left( i t^\top X \right) \right] = \mathrm{E}\left[ \cos\left( t^\top X \right) \right] + i\, \mathrm{E}\left[ \sin\left( t^\top X \right) \right]$$ and the two expected values on the right-hand side are well-defined, because the sine and the cosine are bounded (they take values in the interval $[-1,1]$).
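Because the joint cf is the expected value of a bounded function, it is easy to approximate from simulated draws by averaging $\exp(i t^\top X)$ over a sample. A minimal sketch (the bivariate standard normal benchmark, whose joint cf is $\exp(-t^\top t / 2)$, is an illustrative assumption, not part of the text):

```python
import numpy as np

def joint_cf(sample, t):
    """Empirical joint cf: the sample average of exp(i t'X),
    where `sample` is an n-by-K array of draws of X."""
    return np.mean(np.exp(1j * sample @ t))

# Illustrative check: for a 2x1 standard normal vector the joint cf
# is exp(-t't / 2), so the empirical average should be close to it.
rng = np.random.default_rng(0)
sample = rng.standard_normal((200_000, 2))
t = np.array([0.5, -1.0])

print(joint_cf(sample, t))   # close to exp(-0.625) ≈ 0.535
print(np.exp(-t @ t / 2))
```

The empirical average is always well-defined, mirroring the boundedness argument above: each term of the mean lies on the unit circle of the complex plane.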

Like the joint moment generating function of a random vector, the joint cf can be used to derive the cross-moments of X , as stated below.

Proposition Let X be a random vector and $\varphi_X(t)$ its joint characteristic function. Let $n \in \mathbb{N}$. Define a cross-moment of order $n$ as follows: $$\mu_{n_1,\ldots,n_K} = \mathrm{E}\left[ X_1^{n_1} X_2^{n_2} \cdots X_K^{n_K} \right]$$ where $n_1, n_2, \ldots, n_K \in \mathbb{Z}_+$ and $n_1 + n_2 + \cdots + n_K = n$. If all cross-moments of order $n$ exist and are finite, then all the $n$-th order partial derivatives of $\varphi_X(t)$ exist and $$\mu_{n_1,\ldots,n_K} = \frac{1}{i^n} \frac{\partial^n \varphi_X(t)}{\partial t_1^{n_1} \partial t_2^{n_2} \cdots \partial t_K^{n_K}}$$ where the partial derivative on the right-hand side of the equation is evaluated at the point $t_1 = 0$, $t_2 = 0$, ..., $t_K = 0$.

Proof

See Ushakov (1999).

When we need to derive a cross-moment of a random vector, the practical usefulness of this proposition is somewhat limited, because it is seldom known, a priori, whether cross-moments of a given order exist or not.
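When the joint cf is available in closed form, the derivative formula above can be checked numerically with finite differences. The vector $X = (Z, Z+W)$, with $Z$ and $W$ independent standard exponentials, is a hypothetical example chosen here only because its joint cf is simple; it does not come from the text:

```python
import numpy as np

# Hypothetical example: X = (Z, Z + W) with Z, W independent standard
# exponentials, whose joint cf has the closed form
#   phi(t1, t2) = 1 / ((1 - i(t1 + t2)) (1 - i t2)).
def phi(t1, t2):
    return 1.0 / ((1 - 1j * (t1 + t2)) * (1 - 1j * t2))

# Cross-moment of order n = 2 with n1 = n2 = 1:
#   E[X1 X2] = (1 / i^2) * d^2 phi / (dt1 dt2) evaluated at t = 0,
# approximated here with a central finite difference.
h = 1e-3
mixed = (phi(h, h) - phi(h, -h) - phi(-h, h) + phi(-h, -h)) / (4 * h**2)
moment = (mixed / 1j**2).real

# Direct calculation: E[Z(Z + W)] = E[Z^2] + E[Z]E[W] = 2 + 1 = 3.
print(moment)  # close to 3
```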

The following proposition, instead, does not require such a priori knowledge.

Proposition Let X be a random vector and $\varphi_X(t)$ its joint characteristic function. If all the $n$-th order partial derivatives of $\varphi_X(t)$ exist in a neighborhood of $t = 0$ and $n$ is even, then all cross-moments of order $n$ exist and are finite, and they can be derived from the partial derivatives as in the previous proposition.

Proof

Again, see Ushakov (1999).

The joint cf can also be used to check whether two random vectors have the same distribution.

Proposition Let X and Y be two $K\times 1$ random vectors. Denote by $F_X(x)$ and $F_Y(y)$ their joint distribution functions and by $\varphi_X(t)$ and $\varphi_Y(t)$ their joint cfs. Then, $$F_X(x) = F_Y(x) \text{ for all } x \in \mathbb{R}^K \quad \Longleftrightarrow \quad \varphi_X(t) = \varphi_Y(t) \text{ for all } t \in \mathbb{R}^K$$

Proof

The proof can be found in Ushakov (1999).

Stated differently, two random vectors have the same distribution if and only if they have the same joint cf.

This result is frequently used in applications because demonstrating equality of two joint cfs is often much easier than demonstrating equality of two joint distribution functions.
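The empirical counterpart of this check is comparing the empirical cfs of two samples on a grid of points. A small sketch (the two constructions of a $N(0,2)$ variable, with $K=1$ for brevity, are an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Two constructions of the same distribution: a N(0, 2) draw, and
# the sum of two independent N(0, 1) draws.
x = rng.normal(0.0, np.sqrt(2.0), n)
y = rng.standard_normal(n) + rng.standard_normal(n)

def ecf(sample, ts):
    """Empirical cf evaluated at each point of the grid `ts`."""
    return np.array([np.mean(np.exp(1j * t * sample)) for t in ts])

ts = np.linspace(-2.0, 2.0, 9)
# The two empirical cfs agree up to Monte Carlo error.
print(np.max(np.abs(ecf(x, ts) - ecf(y, ts))))
```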

The following sections contain more detail about the joint characteristic function.

Joint cf of a linear transformation

Let X be a $K\times 1$ random vector with joint characteristic function $\varphi_X(t)$.

Define $$Y = A + BX$$ where A is an $L\times 1$ constant vector and $B$ is an $L\times K$ constant matrix.

Then, the joint cf of Y is $$\varphi_Y(t) = \exp\left( i t^\top A \right) \varphi_X\left( B^\top t \right)$$

Proof

This is proved as follows: $$\varphi_Y(t) = \mathrm{E}\left[ \exp\left( i t^\top Y \right) \right] = \mathrm{E}\left[ \exp\left( i t^\top \left( A + BX \right) \right) \right] = \exp\left( i t^\top A \right) \mathrm{E}\left[ \exp\left( i \left( B^\top t \right)^\top X \right) \right] = \exp\left( i t^\top A \right) \varphi_X\left( B^\top t \right)$$
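Because the identity $t^\top(A + BX) = t^\top A + (B^\top t)^\top X$ holds draw by draw, it can be verified on an empirical cf with no Monte Carlo error at all. A sketch with arbitrary illustrative choices of the distribution of X and of the constants A and B:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# X: 2x1 vector of independent standard normals (an arbitrary choice),
# Y = A + BX with arbitrary constants A (3x1) and B (3x2).
X = rng.standard_normal((n, 2))
A = np.array([1.0, -2.0, 0.5])
B = np.array([[1.0, 0.0],
              [2.0, -1.0],
              [0.5, 0.5]])
Y = A + X @ B.T

def ecf(sample, t):
    return np.mean(np.exp(1j * sample @ t))

t = np.array([0.3, -0.2, 0.1])
lhs = ecf(Y, t)                             # empirical cf of Y
rhs = np.exp(1j * t @ A) * ecf(X, B.T @ t)  # exp(i t'A) phi_X(B't)

# The identity holds draw by draw, so the two sides coincide up to
# floating-point rounding even for a finite sample.
print(abs(lhs - rhs))
```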

Joint cf of a random vector with independent entries

Let X be a $K\times 1$ random vector.

Let its entries $X_1$, ..., $X_K$ be $K$ mutually independent random variables.

Denote the cf of the $j$-th entry of X by $\varphi_{X_j}(t_j)$.

Then, the joint cf of X is $$\varphi_X(t) = \prod_{j=1}^K \varphi_{X_j}(t_j)$$

Proof

This is demonstrated as follows: $$\varphi_X(t) = \mathrm{E}\left[ \exp\left( i \sum_{j=1}^K t_j X_j \right) \right] = \mathrm{E}\left[ \prod_{j=1}^K \exp\left( i t_j X_j \right) \right] = \prod_{j=1}^K \mathrm{E}\left[ \exp\left( i t_j X_j \right) \right] = \prod_{j=1}^K \varphi_{X_j}(t_j)$$ where the third equality follows from the mutual independence of $X_1$, ..., $X_K$.
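The factorization can be checked by simulation. In the sketch below, the choice of marginals ($X_1$ standard normal with cf $\exp(-t_1^2/2)$, $X_2$ standard exponential with cf $1/(1 - i t_2)$) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400_000

# X with independent entries: X_1 ~ N(0, 1) and X_2 ~ Exp(1), whose
# marginal cfs are exp(-t^2 / 2) and 1 / (1 - it) respectively
# (the particular distributions are an illustrative assumption).
X = np.column_stack([rng.standard_normal(n), rng.exponential(1.0, n)])

t = np.array([0.7, -0.4])
empirical = np.mean(np.exp(1j * X @ t))
product = np.exp(-t[0]**2 / 2) * (1.0 / (1 - 1j * t[1]))

print(abs(empirical - product))  # small (Monte Carlo error only)
```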

Joint cf of a sum of mutually independent random vectors

Let $X_1$, ..., $X_n$ be $n$ mutually independent $K\times 1$ random vectors.

Let Z be their sum: $$Z = \sum_{j=1}^n X_j$$

Then, the joint cf of Z is the product of the joint cfs of $X_1$, ..., $X_n$: $$\varphi_Z(t) = \prod_{j=1}^n \varphi_{X_j}(t)$$

Proof

Similar to the previous proof: $$\varphi_Z(t) = \mathrm{E}\left[ \exp\left( i t^\top \sum_{j=1}^n X_j \right) \right] = \mathrm{E}\left[ \prod_{j=1}^n \exp\left( i t^\top X_j \right) \right] = \prod_{j=1}^n \mathrm{E}\left[ \exp\left( i t^\top X_j \right) \right] = \prod_{j=1}^n \varphi_{X_j}(t)$$ where the third equality follows from the mutual independence of the vectors.
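A simulation check of the sum formula, with two arbitrary (and purely illustrative) choices for the distributions of the two vectors:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300_000

# Two mutually independent 2x1 random vectors (arbitrary distributions)
# and their sum Z = X_1 + X_2.
X1 = rng.standard_normal((n, 2))
X2 = rng.exponential(1.0, (n, 2))
Z = X1 + X2

def ecf(sample, t):
    return np.mean(np.exp(1j * sample @ t))

t = np.array([0.2, 0.5])
# cf of the sum vs the product of the individual cfs: equal up to
# Monte Carlo error, by independence of X_1 and X_2.
print(abs(ecf(Z, t) - ecf(X1, t) * ecf(X2, t)))
```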

Some solved exercises on joint characteristic functions can be found below.

Exercise 1

Let $Z_{1}$ and $Z_{2}$ be two independent standard normal random variables.

Let X be a $2\times 1$ random vector whose components are defined as follows: [eq32]

Derive the joint characteristic function of X .

Hint: use the fact that $Z_1^2$ and $Z_2^2$ are two independent Chi-square random variables with one degree of freedom, having characteristic function $$\varphi(t) = \left( 1 - 2it \right)^{-1/2}$$

Solution

By using the definition of characteristic function, we get [eq34]
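The Chi-square cf quoted in the hint can itself be sanity-checked by simulation. This is an illustrative aside, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(5)
z = rng.standard_normal(500_000)

t = 0.3
# Empirical cf of Z^2 vs the Chi-square(1) cf (1 - 2it)^(-1/2).
empirical = np.mean(np.exp(1j * t * z**2))
exact = (1 - 2j * t) ** (-0.5)

print(abs(empirical - exact))  # small (Monte Carlo error only)
```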

Exercise 2

Use the joint characteristic function found in the previous exercise to derive the expected value and the covariance matrix of X .

Solution

We need to compute the partial derivatives of the joint characteristic function: [eq35] All partial derivatives up to the second order exist and are well defined. As a consequence, all cross-moments up to the second order exist and are finite and they can be computed from the above partial derivatives: [eq36] The covariances are derived as follows: [eq37] So, summing up, we get [eq38]

Exercise 3

Read and try to understand how the joint characteristic function of the multinomial distribution is derived in the lecture entitled Multinomial distribution.

References

Ushakov, N. G. (1999) Selected topics in characteristic functions, VSP.

Please cite as:

Taboga, Marco (2021). "Joint characteristic function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/joint-characteristic-function.
