
Joint characteristic function

by Marco Taboga, PhD

The joint characteristic function (joint cf) of a random vector is a multivariate generalization of the characteristic function of a random variable.

Table of contents

  1. Definition

  2. Deriving cross-moments

  3. Characterizing joint distributions

  4. More details

    1. Joint cf of a linear transformation

    2. Joint cf of a random vector with independent entries

    3. Joint cf of a sum of mutually independent random vectors

  5. Solved exercises

    1. Exercise 1

    2. Exercise 2

    3. Exercise 3

  6. References

Definition

Here is a definition.

Definition Let $X$ be a $K\times 1$ random vector. The joint characteristic function of $X$ is a function $\varphi_X:\mathbb{R}^K\rightarrow\mathbb{C}$ defined by $\varphi_X(t)=\mathrm{E}\left[\exp\left(i\,t^\top X\right)\right]$ where $i=\sqrt{-1}$ is the imaginary unit.

Observe that $\varphi_X(t)$ exists for any $t\in\mathbb{R}^{K}$ because $\mathrm{E}\left[\exp\left(i\,t^\top X\right)\right]=\mathrm{E}\left[\cos\left(t^\top X\right)\right]+i\,\mathrm{E}\left[\sin\left(t^\top X\right)\right]$ and the expected values on the right-hand side are well-defined, because both the sine and the cosine are bounded (they take values in the interval $[-1,1]$).
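In practice, the expectation in the definition can be approximated by a sample mean. The sketch below (assuming NumPy; the sample size and the point `t` are arbitrary illustrative choices) estimates the joint cf of a bivariate standard normal vector and compares it with the known closed form $\exp(-\Vert t\Vert^2/2)$:

```python
import numpy as np

def empirical_joint_cf(X, t):
    """Monte Carlo estimate of E[exp(i t'X)] from a sample X of
    shape (n, K), evaluated at a point t of shape (K,)."""
    return np.mean(np.exp(1j * (X @ t)))

rng = np.random.default_rng(0)
X = rng.standard_normal((200_000, 2))  # bivariate standard normal sample

t = np.array([0.5, -1.0])
estimate = empirical_joint_cf(X, t)
# Known joint cf of a standard normal vector: exp(-||t||^2 / 2)
theory = np.exp(-0.5 * np.dot(t, t))
```

The estimate is complex-valued in general; for a symmetric distribution such as this one, its imaginary part is close to zero.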

Deriving cross-moments

Like the joint moment generating function of a random vector, the joint cf can be used to derive the cross-moments of $X$, as stated below.

Proposition Let $X$ be a random vector and $\varphi_X(t)$ its joint characteristic function. Let $n\in\mathbb{N}$. Define a cross-moment of order $n$ as follows: $\mathrm{E}\left[X_1^{n_1}X_2^{n_2}\cdots X_K^{n_K}\right]$ where $n_1,n_2,\ldots,n_K\in\mathbb{Z}_+$ and $n_1+n_2+\cdots+n_K=n$. If all cross-moments of order $n$ exist and are finite, then all the $n$-th order partial derivatives of $\varphi_X(t)$ exist and $\mathrm{E}\left[X_1^{n_1}X_2^{n_2}\cdots X_K^{n_K}\right]=\dfrac{1}{i^{n}}\dfrac{\partial^{n}\varphi_X(t)}{\partial t_1^{n_1}\partial t_2^{n_2}\cdots\partial t_K^{n_K}}$ where the partial derivative on the right-hand side of the equation is evaluated at the point $t_1=0$, $t_2=0$, ..., $t_K=0$.

Proof

See Ushakov (1999).

When we need to derive a cross-moment of a random vector, the practical usefulness of this proposition is somewhat limited, because it is seldom known, a priori, whether cross-moments of a given order exist or not.

The following proposition, instead, does not require such a priori knowledge.

Proposition Let $X$ be a random vector and $\varphi_X(t)$ its joint cf. If all the $n$-th order partial derivatives of $\varphi_X(t)$ exist, then

  1. if $n$ is even, for any $m\leq n$ all the $m$-th order cross-moments of $X$ exist and are finite;

  2. if $n$ is odd, for any $m\leq n-1$ all the $m$-th order cross-moments of $X$ exist and are finite.

In both cases, we have that $\mathrm{E}\left[X_1^{m_1}X_2^{m_2}\cdots X_K^{m_K}\right]=\dfrac{1}{i^{m}}\dfrac{\partial^{m}\varphi_X(t)}{\partial t_1^{m_1}\partial t_2^{m_2}\cdots\partial t_K^{m_K}}$ with $m_1+m_2+\cdots+m_K=m$, where the partial derivatives on the right-hand side are evaluated at the point $t_1=0$, $t_2=0$, ..., $t_K=0$.

Proof

Again, see Ushakov (1999).
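The derivative formula can be verified symbolically for a case where the joint cf is known in closed form. The sketch below (assuming SymPy) recovers the cross-moment $\mathrm{E}[X_1^2 X_2^2]=1$ of two independent standard normal variables from the fourth-order partial derivative of their joint cf:

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2', real=True)

# Joint cf of two independent standard normal variables.
phi = sp.exp(-(t1**2 + t2**2) / 2)

# Cross-moment E[X1^2 X2^2] (order n = 4, with n1 = n2 = 2):
# (1 / i^n) * d^n phi / (dt1^n1 dt2^n2), evaluated at t = 0.
moment = sp.diff(phi, t1, 2, t2, 2).subs({t1: 0, t2: 0}) / sp.I**4
```

For independent standard normals the exact value is $\mathrm{E}[X_1^2]\,\mathrm{E}[X_2^2]=1$, which the symbolic computation reproduces.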

Characterizing joint distributions

The joint cf can also be used to check whether two random vectors have the same distribution.

Proposition Let $X$ and $Y$ be two $K\times 1$ random vectors. Denote by $F_X(x)$ and $F_Y(y)$ their joint distribution functions and by $\varphi_X(t)$ and $\varphi_Y(t)$ their joint cfs. Then, $F_X(x)=F_Y(x)$ for every $x\in\mathbb{R}^{K}$ if and only if $\varphi_X(t)=\varphi_Y(t)$ for every $t\in\mathbb{R}^{K}$.

Proof

The proof can be found in Ushakov (1999).

Stated differently, two random vectors have the same distribution if and only if they have the same joint cf.

This result is frequently used in applications because demonstrating equality of two joint cfs is often much easier than demonstrating equality of two joint distribution functions.
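The equivalence can be illustrated numerically: two vectors that are equal in distribution must have (approximately, in a Monte Carlo sense) equal empirical joint cfs. The sketch below (assuming NumPy; the rotation angle and the point `t` are arbitrary choices) uses the rotational invariance of the bivariate standard normal distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((400_000, 2))

# Rotating a bivariate standard normal vector leaves its
# distribution unchanged, so the joint cfs must coincide.
a = np.pi / 6
Q = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])
Y = X @ Q.T  # Y = Q X, applied row by row

t = np.array([0.8, -0.3])
cf_X = np.mean(np.exp(1j * (X @ t)))
cf_Y = np.mean(np.exp(1j * (Y @ t)))
```

The two estimates agree up to Monte Carlo error, consistent with $X$ and $Y$ having the same distribution.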

More details

The following sections contain more detail about the joint characteristic function.

Joint cf of a linear transformation

Let $X$ be a $K\times 1$ random vector with joint characteristic function $\varphi_X(t)$.

Define $Y=A+BX$ where $A$ is an $L\times 1$ constant vector and $B$ is an $L\times K$ constant matrix.

Then, the joint cf of $Y$ is $\varphi_Y(t)=\exp\left(i\,t^\top A\right)\varphi_X\left(B^\top t\right)$.

Proof

This is proved as follows: $\varphi_Y(t)=\mathrm{E}\left[\exp\left(i\,t^\top Y\right)\right]=\mathrm{E}\left[\exp\left(i\,t^\top A+i\,t^\top BX\right)\right]=\exp\left(i\,t^\top A\right)\mathrm{E}\left[\exp\left(i\,(B^\top t)^\top X\right)\right]=\exp\left(i\,t^\top A\right)\varphi_X\left(B^\top t\right)$.
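The identity can be checked numerically. Below is a sketch (assuming NumPy; the values of $A$, $B$, and $t$ are arbitrary illustrative choices) comparing the empirical joint cf of $Y=A+BX$ with the right-hand side, for a standard normal $X$ whose joint cf is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400_000
X = rng.standard_normal((n, 2))          # K = 2, standard normal

A = np.array([1.0, -2.0])                # L x 1 constant vector (L = 2)
B = np.array([[2.0, 0.5],
              [0.0, 1.0]])               # L x K constant matrix
Y = A + X @ B.T                          # Y = A + B X, row by row

t = np.array([0.3, -0.7])
cf_Y = np.mean(np.exp(1j * (Y @ t)))     # empirical joint cf of Y

# Right-hand side of the identity: exp(i t'A) * phi_X(B't), with
# phi_X the standard normal joint cf exp(-||u||^2 / 2).
phi_X = lambda u: np.exp(-0.5 * np.dot(u, u))
rhs = np.exp(1j * np.dot(t, A)) * phi_X(B.T @ t)
```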

Joint cf of a random vector with independent entries

Let $X$ be a $K\times 1$ random vector.

Let its entries $X_1$, ..., $X_K$ be $K$ mutually independent random variables.

Denote the cf of the $j$-th entry of $X$ by $\varphi_{X_j}(t_j)$.

Then, the joint cf of $X$ is $\varphi_X(t)=\prod_{j=1}^{K}\varphi_{X_j}(t_j)$.

Proof

This is demonstrated as follows: $\varphi_X(t)=\mathrm{E}\left[\exp\left(i\,t^\top X\right)\right]=\mathrm{E}\left[\prod_{j=1}^{K}\exp\left(i\,t_j X_j\right)\right]=\prod_{j=1}^{K}\mathrm{E}\left[\exp\left(i\,t_j X_j\right)\right]=\prod_{j=1}^{K}\varphi_{X_j}(t_j)$ where the expected value of the product factorizes into the product of expected values because the entries of $X$ are mutually independent.
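The product formula can be checked numerically for entries with different marginal distributions. A sketch (assuming NumPy; the marginals and the point `t` are arbitrary choices with well-known univariate cfs):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400_000

# Two independent entries with known univariate cfs:
# X1 ~ N(0, 1):         phi_1(t) = exp(-t^2 / 2)
# X2 ~ Exponential(1):  phi_2(t) = 1 / (1 - i t)
X = np.column_stack([rng.standard_normal(n),
                     rng.exponential(1.0, n)])

t = np.array([0.4, -0.9])
cf_joint = np.mean(np.exp(1j * (X @ t)))         # empirical joint cf
cf_product = np.exp(-t[0]**2 / 2) / (1 - 1j * t[1])  # product of marginal cfs
```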

Joint cf of a sum of mutually independent random vectors

Let $X_1$, ..., $X_n$ be $n$ mutually independent random vectors.

Let $Z$ be their sum: $Z=\sum_{j=1}^{n}X_j$.

Then, the joint cf of $Z$ is the product of the joint cfs of $X_1$, ..., $X_n$: $\varphi_Z(t)=\prod_{j=1}^{n}\varphi_{X_j}(t)$.

Proof

Similar to the previous proof: $\varphi_Z(t)=\mathrm{E}\left[\exp\left(i\,t^\top Z\right)\right]=\mathrm{E}\left[\prod_{j=1}^{n}\exp\left(i\,t^\top X_j\right)\right]=\prod_{j=1}^{n}\mathrm{E}\left[\exp\left(i\,t^\top X_j\right)\right]=\prod_{j=1}^{n}\varphi_{X_j}(t)$ where the factorization is justified by the mutual independence of $X_1$, ..., $X_n$.
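A numerical check of the sum property (assuming NumPy; two bivariate standard normal vectors, whose common joint cf $\exp(-\Vert t\Vert^2/2)$ gives the product in closed form):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400_000

# Two mutually independent bivariate standard normal vectors.
X1 = rng.standard_normal((n, 2))
X2 = rng.standard_normal((n, 2))
Z = X1 + X2

t = np.array([0.6, 0.2])
cf_Z = np.mean(np.exp(1j * (Z @ t)))   # empirical joint cf of the sum
# Product of the two joint cfs: exp(-||t||^2/2) * exp(-||t||^2/2).
cf_product = np.exp(-np.dot(t, t))
```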

Solved exercises

Some solved exercises on joint characteristic functions can be found below.

Exercise 1

Let $Z_1$ and $Z_2$ be two independent standard normal random variables.

Let $X$ be a $2\times 1$ random vector whose components are defined as follows: [eq32]

Derive the joint characteristic function of $X$.

Hint: use the fact that $Z_1^2$ and $Z_2^2$ are two independent Chi-square random variables (with one degree of freedom) having characteristic function $\varphi(t)=\left(1-2it\right)^{-1/2}$.
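As a sanity check of the cf quoted in the hint, the sketch below (assuming NumPy; the evaluation point `t` is an arbitrary choice) compares the empirical cf of the square of a standard normal variable with the Chi-square(1) closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400_000

# The square of a standard normal variable is Chi-square
# with one degree of freedom.
W = rng.standard_normal(n) ** 2

t = 0.5
cf_empirical = np.mean(np.exp(1j * t * W))
cf_theory = (1 - 2j * t) ** (-0.5)  # Chi-square(1) characteristic function
```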

Solution

By using the definition of characteristic function, we get [eq34]

Exercise 2

Use the joint characteristic function found in the previous exercise to derive the expected value and the covariance matrix of $X$.

Solution

We need to compute the partial derivatives of the joint characteristic function: [eq35] All partial derivatives up to the second order exist and are well-defined. As a consequence, all cross-moments up to the second order exist and are finite, and they can be computed from the above partial derivatives: [eq36] The covariances are derived as follows: [eq37] Summing up, we get [eq38]

Exercise 3

Read and try to understand how the joint characteristic function of the multinomial distribution is derived in the lecture entitled Multinomial distribution.

References

Ushakov, N. G. (1999)Selected topics in characteristic functions, VSP.

How to cite

Please cite as:

Taboga, Marco (2021). "Joint characteristic function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/joint-characteristic-function.
