4470-5470 Probability and Statistics I

Textbook exercises

Reading assignment from

Robert V. Hogg, Joseph W. McKean, and Allen T. Craig, Introduction to Mathematical Statistics, 7th ed. [6th ed.], Prentice Hall, NJ.

is an integral part of your course work and supplements the material covered in class. It is also important that you work on some of the exercises together. Numbers in brackets [ ] indicate the corresponding problem numbers in the 6th edition.

For each section, the reading assignment is given first, followed by the recommended exercises.

1.2 "Set Theory" up to Example 1.2.17.
    Exercises: 1.2.1, 1.2.2(a), and 1.2.3.

1.3 The Probability Set Function
    Exercises: 1.3.2, 1.3.3, 1.3.4, 1.3.10(a)-(b), 1.3.11, and 1.3.13(b) [1.3.11(a)-(b), 1.3.12, and 1.3.14(b)].

1.4 Conditional Probability and Independence
    Exercises: 1.4.4, 1.4.6, 1.4.8, 1.4.9, 1.4.20, 1.4.25, and 1.4.30. To get the idea of 1.4.30, visit here.

1.5 Random Variables
    Exercises: 1.5.3, 1.5.4(c), and 1.5.5. An extra question: what do you call the distribution in 1.5.5(a)?

1.6 "Discrete Random Variables" before 1.6.1 "Transformations"
    Exercises: 1.6.2 and 1.6.4.

1.7 "Continuous Random Variables" before 1.7.1 "Transformations"
    Exercises: 1.7.6, 1.7.12(a)-(b), 1.7.14, 1.7.10, 1.7.11(a)-(b), and 1.7.19.

2.1 "Distributions of Two Random Variables" before 2.1.1 "Expectation"
    Exercises: 2.1.1, 2.1.9, 2.1.12, and 2.1.13.

2.3 "Conditional Distributions and Expectations" before the conditional expectation is introduced on page 95.
    Exercises: 2.3.2 and 2.3.3. In 2.3.3, find the conditional density of X1 given X2 = x2; no need to complete (a)-(c).

1.8 Expectation of a Random Variable
    Exercises: 1.8.2, 1.8.3, 1.8.5, 1.8.6, 1.8.7, and 1.8.11(a) [1.8.3, 1.8.4, 1.8.6, 1.8.7, 1.8.8, and 1.8.14(a)].

1.9 Some Special Expectations
    Exercises: 1.9.1(a)-(b), 1.9.2, 1.9.7, and 1.9.26 [1.9.25].

3.1 "The Binomial and Related Distributions" up to Example 3.1.3, and then Example 3.1.6 for the introduction of the geometric and negative binomial distributions.
    Exercises: 3.1.1, 3.1.3, 3.1.9, 3.1.11, 3.1.14, and 3.1.18.

3.2 The Poisson Distribution
    Exercises: 3.2.1, 3.2.10, and 3.2.14.

3.3 "The Gamma and Chi-Square Distributions" up to Example 3.3.6
    Exercises: 3.3.1 and 3.3.3; here $ \lambda = 1/\beta$ with scale parameter $ \beta$ (see the note after this list).

3.4 "The Normal Distribution" before Theorem 3.4.2
    Exercises: 3.4.2-3.4.5, 3.4.12, and 3.4.13.

3.3 "The Gamma and Chi-Square Distributions" up to Example 3.3.6
    Exercises: 3.3.2, 3.3.6(a) [3.3.6], and 3.3.7 together with 1.7.8(c).

3.4 "The Normal Distribution" before Theorem 3.4.2
    Exercises: 3.4.8 and 3.4.10.

3.5 Examples 3.5.1, 3.5.2, and 3.5.3; Section 15 of note01.pdf.
    Exercises: 3.5.1, 3.5.2, 3.5.5, 3.5.7, and 3.5.10* (exercises of Section 15.5).

2.3 "Conditional Distributions and Expectations"
    Exercises: 2.3.6.

1.7.1 "Transformations"
    Exercises: 1.7.20 and 1.7.22.

2.2 "Transformations: Bivariate Random Variables"
    Exercises: 2.2.3 and 2.2.4.

Remarks.

(*) Complete 3.5.10 by answering the following questions:

  1. Let $ (X,Y)$ be a bivariate normal random vector with parameters $ (\mu_x,\mu_y, \sigma_x^2, \sigma_y^2, \rho)$. Using Definition 2.1.2 in the textbook, calculate the mgf

    $\displaystyle M_{(X,Y)}(s,t) = \exp\left[\mu_x s + \mu_y t + \frac{1}{2}\left(\sigma_x^2 s^2 + 2 \rho\sigma_x\sigma_y s t + \sigma_y^2 t^2\right)\right]$

    (This is in fact a special case of Theorem 3.5.2 in the textbook; a sketch of the calculation is given after this list.)
  2. In Problem 3.5.10, show that $ Z$ has the mgf below (a derivation sketch also follows this list):

    $\displaystyle M_Z(t) = \exp\left[\frac{1}{2}(a^2+b^2+2\rho a b)t^2\right]$
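
Sketch for question 1 (one possible route, not the only one; it uses the conditional distribution of $ Y$ given $ X$ developed in Section 2.3): write

    $\displaystyle M_{(X,Y)}(s,t) = E\left[e^{sX+tY}\right] = E\left[e^{sX}\, E\!\left(e^{tY}\mid X\right)\right],$

where $ Y \mid X = x$ is normal with mean $ \mu_y + \rho(\sigma_y/\sigma_x)(x-\mu_x)$ and variance $ \sigma_y^2(1-\rho^2)$. The inner expectation is then a univariate normal mgf, the outer expectation reduces to the mgf of $ X$ evaluated at $ s + t\rho\sigma_y/\sigma_x$, and collecting terms yields the displayed formula.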
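
Sketch for question 2, assuming (as the stated mgf suggests) that $ Z = aX + bY$ where $ (X,Y)$ is bivariate normal with zero means, unit variances, and correlation $ \rho$: apply question 1 with $ \mu_x = \mu_y = 0$ and $ \sigma_x = \sigma_y = 1$ to get

    $\displaystyle M_Z(t) = E\left[e^{t(aX+bY)}\right] = M_{(X,Y)}(at, bt) = \exp\left[\frac{1}{2}\left(a^2 + 2\rho a b + b^2\right)t^2\right],$

which is the mgf of a $ N(0,\, a^2+b^2+2\rho ab)$ random variable.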


© TTU Mathematics