Topics: Common Probability Distributions and Moments
Joint, Marginal, and Conditional Distributions; Probability Functions
Moments: Conditional and Marginal Expectations and Variances
Random Sums: Expectation and Variance
STAT 4382: Practice Version of Exam 1
Answers and Solutions
Yuly Koshevnik
Problem 1 [20 points]
A professor resides on campus. He walks from his residence to his office in the morning and returns in the evening.
Assume that he has four umbrellas and wants to ensure that he will not get wet during his walks on campus. Suppose that the probability of rain on each walk, morning or evening, is a constant q, where 0 < q < 1, and that rain occurs independently of the state of the process X defined below.
Consider the process $X = \{X(n) : n = 0, 1, 2, \ldots\}$, where X(n) is the number of umbrellas in the professor's residence at time n.
When it rains, the professor carries an umbrella. However, if all four umbrellas are gathered in one place (home or office), he carries one umbrella regardless of the rain, to ensure that there will be at least one left in the other place.
1. In the space provided on the next page, present the transition probability matrix P for the Markov
chain, X.
2. Derive the stationary distribution, $\pi = (\pi_1, \pi_2, \pi_3, \pi_4)$, of the process X. (The letter $\pi$ is used for the stationary probabilities to avoid confusion with the rain probability q.)
Recall that if the initial state, X(0), has this distribution, then for any n, the value of X(n) will also have the same distribution.
3. Evaluate the probability that, when you happen to meet the professor, he is carrying an umbrella.
4. Evaluate the limiting expected number of umbrellas as $n \to \infty$.
Solution
1. The transition probability matrix has the elements described in class:
$$P_{1,3} = P_{1,4} = P_{2,4} = P_{3,1} = P_{4,1} = P_{4,2} = 0,$$
$$P_{1,1} = q + (1-q)^2, \quad P_{1,2} = (1-q)\,q,$$
$$P_{2,1} = q\,(1-q) = P_{2,3}, \quad P_{2,2} = q^2 + (1-q)^2,$$
$$P_{3,2} = P_{3,4} = q\,(1-q), \quad P_{3,3} = q^2 + (1-q)^2,$$
$$P_{4,3} = 1-q, \quad P_{4,4} = q.$$
2. If the stationary distribution $\pi = (\pi_1, \pi_2, \pi_3, \pi_4)$ exists, it satisfies the system of equations
$$\pi P = \pi,$$
where P is the transition probability matrix. Take into consideration that these four equations are linearly dependent, so replace one of them with the normalization
$$\sum_{i=1}^{4} \pi_i = 1.$$
Then solve the system to obtain
$$\pi_1 = \pi_2 = \pi_3 \quad \text{and} \quad \pi_4 = q\,\pi_3.$$
The last equation (the stationary probabilities sum to 1) leads to
$$\pi_1 = \pi_2 = \pi_3 = \frac{1}{3+q} \quad \text{and} \quad \pi_4 = \frac{q}{3+q}.$$
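As a cross-check of this derivation (an addition, not part of the original solution), the balance equations can be solved symbolically; the sketch below assumes sympy is available and introduces variable names pi1, ..., pi4 for illustration.

```python
# Symbolic check of the stationary distribution (a sketch, not the
# original derivation); assumes sympy is available.
import sympy as sp

q = sp.symbols('q', positive=True)          # rain probability, 0 < q < 1
P = sp.Matrix([
    [q + (1-q)**2, (1-q)*q,         0,                0      ],
    [q*(1-q),      q**2 + (1-q)**2, q*(1-q),          0      ],
    [0,            q*(1-q),         q**2 + (1-q)**2,  q*(1-q)],
    [0,            0,               1 - q,            q      ],
])

pi = sp.symbols('pi1:5', positive=True)     # stationary probabilities
eqs = list(sp.Matrix([pi]) * P - sp.Matrix([pi])) + [sum(pi) - 1]
sol = sp.solve(eqs, pi, dict=True)[0]
print([sp.simplify(sol[p]) for p in pi])    # [1/(q+3)]*3 and q/(q+3)
```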
3. The probability that the professor carries an umbrella when you happen to meet him is obtained as follows. Let R be the event that it is raining when the professor leaves his location, so $P[R] = q$ and $P[\bar{R}] = 1-q$. Consider the event C that an umbrella is carried. Then $P[C \mid R] = 1$, while on $\bar{R}$ he carries an umbrella only when all four umbrellas are at his location, which in the stationary regime has probability $\pi_4$. Obtain:
$$P[C] = P[R] + P[\bar{R}]\,\pi_4 = q + (1-q)\,\frac{q}{3+q} = \frac{4q}{3+q}.$$
4. Assuming that the number of umbrellas, X(n), follows the stationary distribution, the expected number of umbrellas at time n, as $n \to \infty$, will be:
$$\lim_{n \to \infty} E[X(n)] = 1\,\pi_1 + 2\,\pi_2 + 3\,\pi_3 + 4\,\pi_4 = \frac{1 + 2 + 3 + 4q}{3+q} = \frac{6+4q}{3+q}.$$
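A numerical check of parts 2-4 (added here as a sketch, not part of the original solution): for an illustrative value q = 0.5, a high power of P recovers the stationary distribution, the carry probability, and the limiting mean.

```python
# Numerical check of parts 2-4 for an illustrative q (not in the original).
import numpy as np

q = 0.5
P = np.array([
    [q + (1-q)**2, (1-q)*q,         0.0,              0.0    ],
    [q*(1-q),      q**2 + (1-q)**2, q*(1-q),          0.0    ],
    [0.0,          q*(1-q),         q**2 + (1-q)**2,  q*(1-q)],
    [0.0,          0.0,             1 - q,            q      ],
])

pi = np.linalg.matrix_power(P, 200)[0]      # rows converge to pi
print(pi, [1/(3+q)]*3 + [q/(3+q)])          # stationary distribution
print(q + (1-q)*pi[3], 4*q/(3+q))           # carry probability
print(pi @ np.arange(1, 5), (6+4*q)/(3+q))  # limiting E[X(n)]
```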
Problem 2 [15 points]
A pair (X,Y ) is observable. Its joint density is
$$f_{X,Y}(x,y) = f(x,y) = 25\,e^{-5y}, \quad 0 < x < y < \infty.$$
1. Find the marginal densities, $f_X(x)$ and $f_Y(y)$.
2. Find the conditional densities, $f_{X|Y}(x \mid y)$ and $f_{Y|X}(y \mid x)$.
3. Evaluate the conditional and marginal expectations.
For each function below, specify its domain restrictions, please!
Solution
1. Marginal densities for the components are obtained by integration:
$$f_X(x) = \int_x^{\infty} f(x,y)\,dy = 25 \int_x^{\infty} e^{-5y}\,dy = -5\,e^{-5y}\Big|_x^{\infty} = 5\,e^{-5x},$$
valid for x > 0. Similarly,
$$f_Y(y) = \int_0^{y} f(x,y)\,dx = 25\,e^{-5y}\cdot x\Big|_0^{y} = 25\,y\,e^{-5y},$$
valid for y > 0.
These densities can be summarized as follows:
$$X \sim \text{Gamma}\left[a = 1,\ b = \tfrac{1}{5}\right] \quad \text{and} \quad Y \sim \text{Gamma}\left[a = 2,\ b = \tfrac{1}{5}\right].$$
2. Conditional densities can be evaluated by the formula:
$$f_{X|Y}(x \mid y) = \frac{f(x,y)}{f_Y(y)} = \frac{25\,e^{-5y}}{25\,y\,e^{-5y}} = \frac{1}{y},$$
valid for $0 < x < y < \infty$. Notice that $(X \mid Y = y) \sim \text{Unif}[0, y]$. Then, acting similarly, obtain:
$$f_{Y|X}(y \mid x) = \frac{f(x,y)}{f_X(x)} = \frac{25\,e^{-5y}}{5\,e^{-5x}} = 5\,e^{-5(y-x)},$$
valid for $0 < x < y < \infty$. Thus $(Y \mid X = x)$ has the exponential density shifted x units to the right.
3. Conditional and marginal expectations are:
$$E[X \mid Y = y] = \frac{y}{2},$$
valid for y > 0. Similarly,
$$E[Y \mid X = x] = x + \frac{1}{5},$$
valid for x > 0. The established density of $(Y \mid X = x)$ implies that
$$E[Y - x \mid X = x] = \frac{1}{5}, \quad \text{hence} \quad E[Y \mid X = x] = x + E[Y - x \mid X = x].$$
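The marginals and conditional means above can be verified symbolically; the following is a minimal sketch (an addition, not part of the original solution), assuming sympy is available.

```python
# Symbolic check of the marginals and conditional means of Problem 2.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 25 * sp.exp(-5 * y)                     # joint density on 0 < x < y

fX = sp.integrate(f, (y, x, sp.oo))         # -> 5*exp(-5*x)
fY = sp.integrate(f, (x, 0, y))             # -> 25*y*exp(-5*y)
print(sp.simplify(fX), sp.simplify(fY))

print(sp.integrate(x * f / fY, (x, 0, y)))                   # E[X|Y=y] = y/2
print(sp.simplify(sp.integrate(y * f / fX, (y, x, sp.oo))))  # E[Y|X=x] = x + 1/5
```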
Problem 3 [15 points]
Suppose that a random variable Q has density
$$f_Q(q) = 2q \quad \text{for } 0 < q < 1,$$
and a discrete random variable N has conditional probability mass function
$$f_{N|Q}(n \mid q) = P[N = n \mid Q = q] = \frac{3!}{n!\,(3-n)!}\,q^n (1-q)^{3-n},$$
valid for n = 0, 1, 2, 3.
1. Derive the marginal expected value, E[N].
2. Evaluate the variance, Var[N].
3. Given N = 1, find the conditional probability density, $f_{Q|N}(q \mid 1)$.
Use the formulas below:
$$E[N] = E[E[N \mid Q]] \quad \text{and} \quad Var[N] = E[Var[N \mid Q]] + Var[E[N \mid Q]].$$
Solution
Start with the moments of Q and $(N \mid Q)$:
$$E[Q] = \int_0^1 q \cdot 2q\,dq = \frac{2}{3} \quad \text{and} \quad E[Q^2] = \int_0^1 q^2 \cdot 2q\,dq = \frac{1}{2}.$$
Hence conclude that
$$Var[Q] = \frac{1}{2} - \left(\frac{2}{3}\right)^2 = \frac{1}{18}.$$
Since the conditional distribution of $(N \mid Q = q)$ is binomial, conclude:
$$E[N \mid Q = q] = 3q \quad \text{and} \quad Var[N \mid Q = q] = 3q(1-q).$$
1. The marginal expected value of N is
$$E[N] = E[E[N \mid Q]] = E[3Q] = 3 \cdot \frac{2}{3} = 2.$$
2. The variance of N, according to the recommended formula, is:
$$Var[N] = E[3Q(1-Q)] + Var[3Q] = 3 \cdot \frac{2}{3} - 3 \cdot \frac{1}{2} + 3^2 \cdot \frac{1}{18} = 2 - \frac{3}{2} + \frac{1}{2} = 1.$$
3. Given N = 1, the conditional probability density is:
$$f_{Q|N}(q \mid 1) = \frac{1}{P[N=1]}\,f_Q(q)\,f_{N|Q}(1 \mid q) = \frac{1}{P[N=1]}\,\big[3q(1-q)^2\big] \cdot 2q.$$
The last expression can be converted into $f_{Q|N}(q \mid 1) = A\,q^2 (1-q)^2$, valid for 0 < q < 1, where the factor A does not contain q. Thus $(Q \mid N = 1)$ has the distribution Beta[3, 3], and its density function is
$$f_{Q|N}(q \mid 1) = \frac{5!}{2!\,2!}\,q^2 (1-q)^2 = 30\,q^2 (1-q)^2.$$
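A symbolic verification of E[N], Var[N], and the Beta[3, 3] posterior; this sketch is an addition for checking, not part of the original solution, and assumes sympy is available.

```python
# Symbolic check of Problem 3 (an added verification, not the original work).
import sympy as sp

q = sp.symbols('q', positive=True)
fQ = 2 * q                                       # density of Q on (0, 1)

EQ = sp.integrate(q * fQ, (q, 0, 1))             # E[Q]   = 2/3
EQ2 = sp.integrate(q**2 * fQ, (q, 0, 1))         # E[Q^2] = 1/2
print(3 * EQ)                                    # E[N] = 2
print(sp.integrate(3*q*(1 - q) * fQ, (q, 0, 1))
      + 9 * (EQ2 - EQ**2))                       # Var[N] = 1

# Posterior kernel of Q given N = 1: f_Q(q) * f_{N|Q}(1 | q).
kernel = fQ * 3 * q * (1 - q)**2
post = kernel / sp.integrate(kernel, (q, 0, 1))
print(sp.factor(post))                           # 30*q**2*(q - 1)**2
```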
Problem 4 [15 points]
Assume that a loaded coin shows tails twice as often as heads, and let N be the count of tails preceding the first head. Independent random variables $\{X_k : k = 1, 2, \ldots\}$ do not depend on N and have the common density function
$$g(x) = 4x\,e^{-2x} \quad \text{for } x > 0, \quad \text{and } g(x) = 0 \text{ elsewhere}.$$
A random sum is defined as
$$S = \sum_{k=1}^{2N+1} X_k.$$
1. Evaluate the expected value of S.
2. Determine the variance of S.
Solution
Start with recognition of the distributions and their moments. N has a geometric distribution (starting at 0) with success rate $q = \frac{1}{3}$, therefore
$$E[N] = \frac{1-q}{q} = 2 \quad \text{and} \quad Var[N] = \frac{1-q}{q^2} = 6.$$
The X variables follow $\text{Gamma}[a = 2,\ b = 0.5]$, therefore
$$\mu = E[X] = a\,b = 1 \quad \text{and} \quad \sigma^2 = Var[X] = a\,b^2 = \frac{1}{2}.$$
Since the sum is taken from 1 to 2N + 1, conclude that
$$\nu = E[2N+1] = 5 \quad \text{and} \quad Var[2N+1] = 2^2\,Var[N] = 24.$$
1. The expected value of S is
$$E[S] = \nu\,\mu = 5 \cdot 1 = 5.$$
2. The variance of S is evaluated as
$$Var[S] = \mu^2\,Var[2N+1] + \nu\,\sigma^2 = 1^2 \cdot 24 + 5 \cdot \frac{1}{2} = 24 + 2.5 = 26.5.$$
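As a Monte Carlo sanity check (an addition, not part of the original solution), one can simulate the random sum directly; the sketch below assumes numpy and uses an arbitrary seed.

```python
# Monte Carlo check of E[S] and Var[S] (an added sketch).
import numpy as np

rng = np.random.default_rng(0)
reps = 100_000

# P(head) = 1/3; numpy's geometric counts trials up to and including the
# first success, so the number of tails before the first head is one less.
N = rng.geometric(1/3, size=reps) - 1

# X_k ~ Gamma(shape a = 2, scale b = 1/2), i.e. density 4x e^{-2x}.
S = np.array([rng.gamma(2.0, 0.5, size=2*n + 1).sum() for n in N])
print(S.mean(), S.var())         # ~ 5 and ~ 26.5
```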
Problem 5 [15 points]
Consider a three-state Markov chain with states denoted $S = \{0, 1, 2\}$ and one-step transition probabilities described as follows.
From state 0, transitions may occur to states 0 and 1, with probability 0.5 each, while a transition to state 2 is impossible.
From state 1, transitions may occur to states 0 and 2, with probability 0.5 each.
State 2 is absorbing, hence transitions from 2 to 0 or 1 are impossible.
1. Find the conditional probability, $P[X_3 = 1 \mid X_0 = 0]$.
2. Find the conditional probability, $P[X_5 = 1 \mid X_2 = 1]$.
3. Given that X1 has the discrete uniform distribution,
P [X1 = 0] = P [X1 = 1] = P [X1 = 2] =
1
3
,
determine the probability that X2 = 1.
Solution
We need to evaluate the second and third powers of the transition matrix P described verbally in this problem. Using the notation $P_{i,j} = P[X_{n+1} = j \mid X_n = i]$, present:
$$P_{0,0} = P_{0,1} = \frac{1}{2}, \quad P_{0,2} = 0,$$
$$P_{1,0} = \frac{1}{2}, \quad P_{1,1} = 0, \quad P_{1,2} = \frac{1}{2}, \quad \text{and}$$
$$P_{2,0} = P_{2,1} = 0, \quad P_{2,2} = 1.$$
The matrix $P^2$ has rows:
$$(P^2)_{0,0} = \frac{1}{2}, \quad (P^2)_{0,1} = (P^2)_{0,2} = \frac{1}{4},$$
$$(P^2)_{1,0} = (P^2)_{1,1} = \frac{1}{4}, \quad (P^2)_{1,2} = \frac{1}{2}, \quad \text{and}$$
$$(P^2)_{2,0} = (P^2)_{2,1} = 0, \quad (P^2)_{2,2} = 1.$$
Proceed with $P^3$ to obtain
$$(P^3)_{0,0} = \frac{3}{8}, \quad (P^3)_{0,1} = \frac{2}{8}, \quad (P^3)_{0,2} = \frac{3}{8},$$
$$(P^3)_{1,0} = \frac{2}{8}, \quad (P^3)_{1,1} = \frac{1}{8}, \quad (P^3)_{1,2} = \frac{5}{8}, \quad \text{and}$$
$$(P^3)_{2,0} = (P^3)_{2,1} = 0, \quad (P^3)_{2,2} = 1.$$
1. The conditional probability $P[X_3 = 1 \mid X_0 = 0]$ is $(P^3)_{0,1} = \frac{2}{8} = \frac{1}{4}$.
2. By time homogeneity, the conditional probability $P[X_5 = 1 \mid X_2 = 1]$ coincides with $(P^3)_{1,1} = \frac{1}{8}$.
3. Given that $X_1$ has the discrete uniform distribution, the probability that the system will be in state 1 is the product of the row vector $\left(\frac{1}{3}, \frac{1}{3}, \frac{1}{3}\right)$ by the column of the transition matrix P that corresponds to state 1, which becomes
$$P[X_2 = 1] = E\big[P[X_2 = 1 \mid X_1]\big] = \frac{1}{3}\left(\frac{1}{2} + 0 + 0\right) = \frac{1}{6}.$$
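The three answers can be confirmed by computing the matrix powers numerically; a minimal sketch (an addition, not part of the original solution), assuming numpy is available.

```python
# Matrix-power check of the three answers of Problem 5.
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0]])

P3 = np.linalg.matrix_power(P, 3)
print(P3[0, 1])                  # P[X3 = 1 | X0 = 0] = 0.25
print(P3[1, 1])                  # P[X5 = 1 | X2 = 1] = 0.125

p1 = np.array([1, 1, 1]) / 3     # discrete uniform distribution of X1
print((p1 @ P)[1])               # P[X2 = 1] = 1/6 ~ 0.1667
```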
Problem 6 [10 points]
You are given the following information.
A random variable, W, has the density
$$f_W(w) = f(w) = 3\,e^{-3w} \quad \text{for } w > 0, \quad \text{and } f(w) = 0 \text{ elsewhere}.$$
A variable, N, given W, has a conditional Poisson distribution,
$$P[N = n \mid W = w] = \frac{w^n}{n!}\,e^{-w} \quad \text{for } n = 0, 1, \ldots$$
1. Determine the marginal distribution of N.
2. Given N = n, derive the conditional density of W.
Solution
1. For the marginal distribution of N, start with the joint probability function, which is the product of the marginal density of W and the conditional mass function of $(N \mid W)$. Since $\int_0^{\infty} w^n e^{-4w}\,dw = n!/4^{n+1}$, the marginal distribution of N is:
$$P[N = n] = \int_0^{\infty} \left[\frac{w^n}{n!}\,e^{-w}\right] 3\,e^{-3w}\,dw = \frac{3}{4^{n+1}} = \frac{3}{4}\left(\frac{1}{4}\right)^n,$$
valid for n = 0, 1, 2, .... This is the geometric distribution with success rate q = 0.75.
2. Given N = n, the conditional density of W is obtained by dividing the joint probability function (which was under the integral for P[N = n]) by P[N = n]; hence the conditional density is
$$f_{W|N}(w \mid N = n) = \frac{\dfrac{3\,w^n}{n!}\,e^{-4w}}{\dfrac{3}{4^{n+1}}} = \frac{4^{n+1}}{n!}\,w^n\,e^{-4w} \quad \text{for } w > 0.$$
According to the notation in the textbook and notes, $(W \mid N = n)$ has the distribution $\text{Gamma}\left[n+1,\ \tfrac{1}{4}\right]$.
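The marginal P[N = n] can be checked symbolically for the first few values of n; the sketch below is an added verification (not part of the original solution), assuming sympy.

```python
# Check of the marginal P[N = n] for the first few n.
import sympy as sp

w = sp.symbols('w', positive=True)

for n in range(5):
    # Joint function in w for fixed n: Poisson(w) pmf times 3*exp(-3w).
    joint = (w**n / sp.factorial(n)) * sp.exp(-w) * 3 * sp.exp(-3 * w)
    P_n = sp.integrate(joint, (w, 0, sp.oo))
    print(n, P_n, sp.Rational(3, 4) * sp.Rational(1, 4)**n)   # equal columns
```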
Problem 7 [10 points]
A random variable, W, has density
$$f_W(w) = f(w) = 3\,e^{-3w} \quad \text{for } w > 0, \quad \text{and } f(w) = 0 \text{ elsewhere}.$$
A variable, N, given W, has a conditional Poisson distribution,
$$P[N = n \mid W = w] = \frac{w^n}{n!}\,e^{-w} \quad \text{for } n = 0, 1, \ldots$$
1. Evaluate the marginal expectation, E[N].
2. Determine the conditional expected value, $E[W \mid N = n]$.
Solution
1. The marginal expectation can be found from the distribution of N obtained in Problem 6 as follows. N has a geometric distribution, starting at 0, with success rate q = 0.75, therefore
$$E[N] = \frac{1-q}{q} = \frac{0.25}{0.75} = \frac{1}{3}.$$
2. The conditional expected value of $(W \mid N = n)$ can be found using the distribution of $(W \mid N = n)$ established in Problem 6. Since this was $\text{Gamma}[a = n+1,\ b = 0.25]$, conclude that
$$E[W \mid N = n] = a\,b = \frac{n+1}{4} = 0.25\,(n+1).$$
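Both answers lend themselves to a quick simulation check; this sketch is an addition (not part of the original solution), assuming numpy and an arbitrary seed.

```python
# Simulation check of E[N] and E[W | N = n] for Problem 7.
import numpy as np

rng = np.random.default_rng(1)
W = rng.exponential(scale=1/3, size=500_000)  # density 3*exp(-3w), w > 0
N = rng.poisson(W)                            # N | W = w ~ Poisson(w)

print(N.mean())                               # ~ E[N] = 1/3
for n in range(3):
    print(n, W[N == n].mean(), (n + 1) / 4)   # ~ E[W | N = n] = (n+1)/4
```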