
Independent random variables

by Marco Taboga, PhD

Two random variables are independent if they convey no information about each other and, as a consequence, receiving information about one of the two does not change our assessment of the probability distribution of the other.

This lecture provides a formal definition of independence and discusses how to verify whether two or more random variables are independent.

Table of contents

  1. Definition

  2. Independence criterion

  3. Independence between discrete random variables

  4. Independence between continuous random variables

  5. More details

    1. Mutually independent random variables

    2. Mutual independence via expectations

    3. Independence and zero covariance

    4. Independent random vectors

    5. Mutually independent random vectors

  6. Solved exercises

    1. Exercise 1

    2. Exercise 2

    3. Exercise 3

Definition

Recall (see the lecture entitled Independent events) that two events $A$ and $B$ are independent if and only if $P(A \cap B) = P(A) \, P(B)$.

This definition is extended to random variables as follows.

Definition Two random variables $X$ and $Y$ are said to be independent if and only if $P(X \in A, Y \in B) = P(X \in A) \, P(Y \in B)$ for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}$ and $B \subseteq \mathbb{R}$.

In other words, two random variables are independent if and only if the events related to those random variables are independent events.

The independence between two random variables is also called statistical independence.

Independence criterion

Checking the independence of all possible couples of events related to two random variables can be very difficult. This is the reason why the above definition is seldom used to verify whether two random variables are independent. The following criterion is more often used instead.

Proposition Two random variables $X$ and $Y$ are independent if and only if $F_{X,Y}(x,y) = F_X(x) \, F_Y(y)$ for all $x, y \in \mathbb{R}$, where $F_{X,Y}(x,y)$ is their joint distribution function and $F_X(x)$ and $F_Y(y)$ are their marginal distribution functions.

Proof

By using some facts from measure theory (not proved here), it is possible to demonstrate that, when checking the condition $P(X \in A, Y \in B) = P(X \in A) \, P(Y \in B)$, it is sufficient to confine attention to sets $A$ and $B$ taking the form $A = (-\infty, x]$ and $B = (-\infty, y]$. Thus, two random variables are independent if and only if $P(X \leq x, Y \leq y) = P(X \leq x) \, P(Y \leq y)$ for all $x, y \in \mathbb{R}$. Using the definitions of joint and marginal distribution function, this condition can be written as $F_{X,Y}(x,y) = F_X(x) \, F_Y(y)$.

Example Let $X$ and $Y$ be two random variables with marginal distribution functions [eq13] and joint distribution function [eq14]. $X$ and $Y$ are independent if and only if [eq15], which is straightforward to verify. When $x < 0$ or $y < 0$, then [eq16]. When $x \geq 0$ and $y \geq 0$, then [eq17].
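The factorization criterion can be checked numerically. The sketch below assumes hypothetical marginal CDFs (standard exponentials, not the unstated CDFs of the example above) and a joint CDF built as their product, then verifies the criterion on a grid of points.

```python
import math

# Hypothetical marginal CDFs, assumed for illustration: standard
# exponential distributions (these are NOT the CDFs of the lecture's
# example, whose exact formulas are not shown).
def F_X(x):
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

def F_Y(y):
    return 1.0 - math.exp(-y) if y >= 0 else 0.0

# A joint CDF built as the product of the marginals, so X and Y
# are independent by construction.
def F_XY(x, y):
    return F_X(x) * F_Y(y)

# Check the criterion F_XY(x, y) = F_X(x) * F_Y(y) on a grid.
points = [-1.0, 0.0, 0.5, 1.0, 2.0]
independent = all(
    abs(F_XY(x, y) - F_X(x) * F_Y(y)) < 1e-12
    for x in points for y in points
)
print(independent)  # True
```

In practice one is given the joint CDF first and must verify, as in the example, that it splits into the product of the marginals at every point.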

Independence between discrete random variables

When the two variables, taken together, form a discrete random vector, independence can also be verified using the following proposition.

Proposition Two random variables $X$ and $Y$, forming a discrete random vector, are independent if and only if $p_{X,Y}(x,y) = p_X(x) \, p_Y(y)$ for all $x, y$, where $p_{X,Y}(x,y)$ is their joint probability mass function and $p_X(x)$ and $p_Y(y)$ are their marginal probability mass functions.

The following example illustrates how this criterion can be used.

Example Let [eq22] be a discrete random vector with support [eq23]. Let its joint probability mass function be [eq24]. In order to verify whether $X$ and $Y$ are independent, we first need to derive the marginal probability mass functions of $X$ and $Y$. The support of $X$ is [eq25] and the support of $Y$ is [eq26]. We need to compute the probability of each element of the support of $X$: [eq27]. Thus, the probability mass function of $X$ is [eq28]. We need to compute the probability of each element of the support of $Y$: [eq29]. Thus, the probability mass function of $Y$ is [eq30]. The product of the marginal probability mass functions is [eq31], which is different from [eq32]. Therefore, $X$ and $Y$ are not independent.
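The same procedure, deriving the marginals from the joint pmf and comparing their product with the joint, is easy to automate. The sketch below uses a hypothetical joint pmf (not the one in the example above, whose exact values are not shown) under which the two variables turn out to be dependent.

```python
# A hypothetical joint pmf, assumed for illustration:
# support {(0,0), (0,1), (1,0)} with the probabilities below.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5}

# Marginal pmfs, obtained by summing the joint pmf over the other variable.
p_X, p_Y = {}, {}
for (x, y), p in joint.items():
    p_X[x] = p_X.get(x, 0.0) + p
    p_Y[y] = p_Y.get(y, 0.0) + p

# Independence requires p(x, y) = p_X(x) * p_Y(y) for every couple (x, y),
# including couples outside the joint support (where p(x, y) = 0).
independent = all(
    abs(joint.get((x, y), 0.0) - p_X[x] * p_Y[y]) < 1e-12
    for x in p_X for y in p_Y
)
print(independent)  # False: e.g. p(1,1) = 0 but p_X(1)*p_Y(1) = 0.125
```

Note the check runs over the full Cartesian product of the two supports: a point with zero joint probability but positive marginal probabilities is enough to rule out independence.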

Independence between continuous random variables

When the two variables, taken together, form a continuous random vector, independence can also be verified by means of the following proposition.

Proposition Two random variables $X$ and $Y$, forming a continuous random vector, are independent if and only if $f_{X,Y}(x,y) = f_X(x) \, f_Y(y)$ for all $x, y$, where $f_{X,Y}(x,y)$ is their joint probability density function and $f_X(x)$ and $f_Y(y)$ are their marginal probability density functions.

The following example illustrates how this criterion can be used.

Example Let the joint probability density function of $X$ and $Y$ be [eq37]. Its marginals are [eq38] and [eq39]. Verifying that [eq40] is straightforward. When [eq41] or [eq42], then [eq43]. When [eq44] and [eq45], then [eq46].
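In the continuous case the marginals are obtained by integrating the other variable out of the joint density, which can also be done numerically. The sketch below assumes a hypothetical joint density, $f(x,y) = e^{-x-y}$ for $x, y \geq 0$ (not the density of the example above), recovers the marginal of $X$ by a midpoint Riemann sum, and compares it with the exact value $e^{-x}$.

```python
import math

# Hypothetical joint density, assumed for illustration:
# f(x, y) = exp(-x - y) for x, y >= 0, and 0 elsewhere.
def f_XY(x, y):
    return math.exp(-x - y) if x >= 0 and y >= 0 else 0.0

# Recover the marginal f_X(x) numerically by integrating y out of the
# joint density (midpoint rule on [0, 40], which carries essentially
# all of the probability mass).
def marginal_X(x, dy=0.001, y_max=40.0):
    n = int(y_max / dy)
    return sum(f_XY(x, (k + 0.5) * dy) for k in range(n)) * dy

# At x = 1 the exact marginal is exp(-1); the numerical value is close,
# and f_XY(x, y) = f_X(x) * f_Y(y) holds, so X and Y are independent.
approx = marginal_X(1.0)
exact = math.exp(-1.0)
print(abs(approx - exact) < 1e-3)  # True
```

Once both marginals are in hand, the factorization check proceeds exactly as in the discrete case, but with densities in place of probability mass functions.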

More details

The following subsections contain more details about statistical independence.

Mutually independent random variables

The definition of mutually independent random variables extends the definition of mutually independent events to random variables.

Definition We say that $n$ random variables $X_1, \ldots, X_n$ are mutually independent (or jointly independent) if and only if $P(X_{i_1} \in A_1, \ldots, X_{i_k} \in A_k) = P(X_{i_1} \in A_1) \cdots P(X_{i_k} \in A_k)$ for any sub-collection of $k$ random variables $X_{i_1}, \ldots, X_{i_k}$ (where $k \leq n$) and for any collection of events $\{X_{i_1} \in A_1\}, \ldots, \{X_{i_k} \in A_k\}$, where $A_j \subseteq \mathbb{R}$ for each $j$.

In other words, $n$ random variables are mutually independent if the events related to those random variables are mutually independent events.

Denote by $X$ a random vector whose components are $X_1, \ldots, X_n$. The above condition for mutual independence can be replaced by any of the following:

  1. in general, by a condition on the joint distribution function of $X$: $F_X(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n)$ for all $(x_1, \ldots, x_n)$;

  2. for discrete random variables, by a condition on the joint probability mass function of $X$: $p_X(x_1, \ldots, x_n) = p_{X_1}(x_1) \cdots p_{X_n}(x_n)$;

  3. for continuous random variables, by a condition on the joint probability density function of $X$: $f_X(x_1, \ldots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)$.
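Mutual independence is strictly stronger than pairwise independence. The sketch below uses a classic counterexample (assumed here for illustration, not taken from the lecture): $X_1$ and $X_2$ are independent fair coin flips and $X_3 = X_1 \oplus X_2$ (XOR). Every pair factorizes, yet the joint pmf of all three does not.

```python
from itertools import product

# Joint pmf: the four equally likely outcomes (x1, x2, x1 XOR x2).
joint = {}
for x1, x2 in product((0, 1), repeat=2):
    joint[(x1, x2, x1 ^ x2)] = 0.25  # all other triples have probability 0

def pmf(triple):
    return joint.get(triple, 0.0)

# Each marginal is Bernoulli(0.5).
marg = {0: 0.5, 1: 0.5}

# Pairwise check: p(xi = a, xj = b) = 0.25 = 0.5 * 0.5 for every pair.
def pair_pmf(i, j, a, b):
    return sum(p for t, p in joint.items() if t[i] == a and t[j] == b)

pairwise = all(
    abs(pair_pmf(i, j, a, b) - marg[a] * marg[b]) < 1e-12
    for i, j in ((0, 1), (0, 2), (1, 2))
    for a, b in product((0, 1), repeat=2)
)

# Mutual check fails: p(0, 0, 1) = 0, but the product of the three
# marginals is 0.5 ** 3 = 0.125.
mutual = all(
    abs(pmf((a, b, c)) - marg[a] * marg[b] * marg[c]) < 1e-12
    for a, b, c in product((0, 1), repeat=3)
)
print(pairwise, mutual)  # True False
```

This is why the definition quantifies over every sub-collection of the $n$ variables: checking couples alone is not enough.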

Mutual independence via expectations

It can be proved that $n$ random variables $X_1, \ldots, X_n$ are mutually independent if and only if $\mathrm{E}\left[g_1(X_1) \cdots g_n(X_n)\right] = \mathrm{E}\left[g_1(X_1)\right] \cdots \mathrm{E}\left[g_n(X_n)\right]$ for any $n$ functions $g_1, \ldots, g_n$ such that the above expected values exist and are well-defined.
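The expectations criterion can be verified exactly in a small discrete setting. The sketch below assumes two hypothetical independent variables (the bivariate case, $n = 2$): $X_1$ and $X_2$ uniform on $\{1, 2, 3\}$, with $g_1(x) = x^2$ and $g_2(x) = 2x + 1$ chosen arbitrarily.

```python
from itertools import product

# Hypothetical setup, assumed for illustration: X1, X2 independent and
# uniform on {1, 2, 3}, so the joint pmf is the product of the marginals.
support = (1, 2, 3)
p = 1.0 / 3.0  # uniform marginal probability

g1 = lambda x: x ** 2      # arbitrary test function
g2 = lambda x: 2 * x + 1   # arbitrary test function

# E[g1(X1) g2(X2)], computed from the (product) joint pmf ...
lhs = sum(g1(x1) * g2(x2) * p * p for x1, x2 in product(support, repeat=2))
# ... equals E[g1(X1)] * E[g2(X2)], the product of the expectations.
rhs = sum(g1(x) * p for x in support) * sum(g2(x) * p for x in support)
print(abs(lhs - rhs) < 1e-12)  # True
```

For dependent variables, such as those of the discrete example above, a suitable choice of $g_1$ and $g_2$ would make the two sides differ.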

Independence and zero covariance

If two random variables $X_1$ and $X_2$ are independent, then their covariance is zero: $\mathrm{Cov}\left[X_1, X_2\right] = 0$.

Proof

This is an immediate consequence of the fact that, if $X_1$ and $X_2$ are independent, then $\mathrm{E}\left[g_1(X_1) \, g_2(X_2)\right] = \mathrm{E}\left[g_1(X_1)\right] \mathrm{E}\left[g_2(X_2)\right]$ (see the Mutual independence via expectations property above). When $g_1$ and $g_2$ are identity functions ($g_1(x) = x$ and $g_2(x) = x$), this yields $\mathrm{E}\left[X_1 X_2\right] = \mathrm{E}\left[X_1\right] \mathrm{E}\left[X_2\right]$. Therefore, by the covariance formula, $\mathrm{Cov}\left[X_1, X_2\right] = \mathrm{E}\left[X_1 X_2\right] - \mathrm{E}\left[X_1\right] \mathrm{E}\left[X_2\right] = 0$.

The converse is not true: two random variables that have zero covariance are not necessarily independent.
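A standard counterexample (assumed here for illustration) makes the failure of the converse concrete: take $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$. The covariance is zero, yet the joint pmf does not factorize.

```python
# X uniform on {-1, 0, 1}, Y = X**2: zero covariance, but not independent.
support = (-1, 0, 1)
p = 1.0 / 3.0  # probability of each value of X

E_X = sum(x * p for x in support)            # = 0
E_Y = sum(x ** 2 * p for x in support)       # = 2/3
E_XY = sum(x * x ** 2 * p for x in support)  # = E[X**3] = 0 by symmetry

cov = E_XY - E_X * E_Y
print(abs(cov) < 1e-12)  # True: covariance is zero

# Yet X and Y are dependent: P(X=0, Y=0) = 1/3, while
# P(X=0) * P(Y=0) = (1/3) * (1/3) = 1/9.
p_joint = p       # X = 0 is the only outcome with Y = 0
p_product = p * p
print(abs(p_joint - p_product) > 1e-12)  # True: factorization fails
```

Intuitively, $Y$ is a deterministic function of $X$, so knowing $X$ pins down $Y$ completely even though the two are uncorrelated.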

Independent random vectors

The above notions are easily generalized to the case in which $X$ and $Y$ are two random vectors, of dimensions $K_X \times 1$ and $K_Y \times 1$ respectively. Denote their joint distribution functions by $F_X(x)$ and $F_Y(y)$ and the joint distribution function of $X$ and $Y$ together by $F_{X,Y}(x,y)$. Also, if the two vectors are discrete or continuous, replace $F$ with $p$ or $f$ to denote the corresponding probability mass or density functions.

Definition Two random vectorsX andY are independent if and only if one of the following equivalent conditions is satisfied:

  1. Condition 1: $P(X \in A, Y \in B) = P(X \in A) \, P(Y \in B)$ for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}^{K_X}$ and $B \subseteq \mathbb{R}^{K_Y}$;

  2. Condition 2: $F_{X,Y}(x,y) = F_X(x) \, F_Y(y)$ for any $x \in \mathbb{R}^{K_X}$ and $y \in \mathbb{R}^{K_Y}$ (replace $F$ with $p$ or $f$ when the distributions are discrete or continuous respectively);

  3. Condition 3: $\mathrm{E}\left[g_1(X) \, g_2(Y)\right] = \mathrm{E}\left[g_1(X)\right] \mathrm{E}\left[g_2(Y)\right]$ for any functions $g_1: \mathbb{R}^{K_X} \rightarrow \mathbb{R}$ and $g_2: \mathbb{R}^{K_Y} \rightarrow \mathbb{R}$ such that the above expected values exist and are well-defined.

Mutually independent random vectors

The definition of mutual independence also extends in a straightforward manner to random vectors.

Definition We say that $n$ random vectors $X_1, \ldots, X_n$ are mutually independent (or jointly independent) if and only if $P(X_{i_1} \in A_1, \ldots, X_{i_k} \in A_k) = P(X_{i_1} \in A_1) \cdots P(X_{i_k} \in A_k)$ for any sub-collection of $k$ random vectors $X_{i_1}, \ldots, X_{i_k}$ (where $k \leq n$) and for any collection of events $\{X_{i_1} \in A_1\}, \ldots, \{X_{i_k} \in A_k\}$.

All the equivalent conditions for the joint independence of a set of random variables (see above) also apply, with obvious modifications, to random vectors.

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Consider two random variables $X$ and $Y$ having marginal distribution functions [eq73]. If $X$ and $Y$ are independent, what is their joint distribution function?

Solution

For $X$ and $Y$ to be independent, their joint distribution function must be equal to the product of their marginal distribution functions: [eq74]

Exercise 2

Let [eq75] be a discrete random vector with support [eq76]. Let its joint probability mass function be [eq77]. Are $X$ and $Y$ independent?

Solution

In order to verify whether $X$ and $Y$ are independent, we first need to derive the marginal probability mass functions of $X$ and $Y$. The support of $X$ is [eq78] and the support of $Y$ is [eq79]. We need to compute the probability of each element of the support of $X$: [eq80]. Thus, the probability mass function of $X$ is [eq81]. We need to compute the probability of each element of the support of $Y$: [eq82]. Thus, the probability mass function of $Y$ is [eq83]. The product of the marginal probability mass functions is [eq84], which is equal to [eq32]. Therefore, $X$ and $Y$ are independent.

Exercise 3

Let [eq86] be a continuous random vector with support [eq87] and let its joint probability density function be [eq88]. Are $X$ and $Y$ independent?

Solution

The support of $Y$ is [eq89]. When [eq90], the marginal probability density function of $Y$ is $0$, while, when [eq91], the marginal probability density function of $Y$ is [eq92]. Thus, summing up, the marginal probability density function of $Y$ is [eq93]. The support of $X$ is [eq94]. When [eq95], the marginal probability density function of $X$ is $0$, while, when [eq96], it is [eq97]. Thus, the marginal probability density function of $X$ is [eq98]. Verifying that [eq40] is straightforward. When [eq95] or [eq101], then [eq43]. When [eq103] and [eq104], then [eq105]. Thus, $X$ and $Y$ are independent.

How to cite

Please cite as:

Taboga, Marco (2021). "Independent random variables", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/independent-random-variables.
