Ebook Probability and Mathematical Statistics: Part 2
Part 2 of the book "Probability and Mathematical Statistics" covers some special discrete bivariate distributions, simple linear regression and correlation analysis, sequences of random variables and order statistics, some techniques for finding interval estimators of parameters, and other topics.

Chapter 11
SOME SPECIAL DISCRETE BIVARIATE DISTRIBUTIONS

In this chapter, we shall examine some bivariate discrete probability density functions. Ever since the first statistical use of the bivariate normal distribution (which will be treated in Chapter 12) by Galton and Dickson in 1886, attempts have been made to develop families of bivariate distributions to describe non-normal variations. In many textbooks, only the bivariate normal distribution is treated. This is partly due to the dominant role the bivariate normal distribution has played in statistical theory. Recently, however, other bivariate distributions have started appearing in probability models and statistical sampling problems. This chapter will focus on some well-known bivariate discrete distributions whose marginal distributions are well-known univariate distributions. The book by Mardia gives an excellent exposition on various bivariate distributions.

Bivariate Bernoulli Distribution

We define a bivariate Bernoulli random variable by specifying the form of the joint probability distribution.

Definition. A discrete bivariate random variable $(X, Y)$ is said to have the bivariate Bernoulli distribution if its joint probability density is of the form
\[
f(x, y) =
\begin{cases}
\dfrac{1}{x!\, y!\, (1 - x - y)!}\; p_1^{x}\, p_2^{y}\, (1 - p_1 - p_2)^{1 - x - y}, & \text{if } x, y = 0, 1 \\[1ex]
0, & \text{otherwise,}
\end{cases}
\]
where $0 < p_1$, $0 < p_2$, $p_1 + p_2 < 1$, and $x + y \le 1$. We denote a bivariate Bernoulli random variable by writing $(X, Y) \sim \mathrm{BER}(p_1, p_2)$.

In the following theorem, we present the expected values and the variances of $X$ and $Y$, the covariance between $X$ and $Y$, and their joint moment generating function. Recall that the joint moment generating function of $X$ and $Y$ is defined as $M(s, t) := E\left(e^{sX + tY}\right)$.

Theorem. Let $(X, Y) \sim \mathrm{BER}(p_1, p_2)$, where $p_1$ and $p_2$ are parameters. Then
\[
\begin{aligned}
E(X) &= p_1 \\
E(Y) &= p_2 \\
Var(X) &= p_1 (1 - p_1) \\
Var(Y) &= p_2 (1 - p_2) \\
Cov(X, Y) &= -\,p_1 p_2.
\end{aligned}
\]
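As a quick numerical illustration (not part of the original text), the following Python sketch enumerates the three support points $(0,0)$, $(1,0)$, $(0,1)$ of $\mathrm{BER}(p_1, p_2)$ and checks the moments stated in the theorem by direct summation; the parameter values $p_1 = 0.3$, $p_2 = 0.5$ are arbitrary illustrative choices.

```python
from math import factorial, exp

def ber_pmf(x, y, p1, p2):
    """Joint pmf of the bivariate Bernoulli BER(p1, p2): support is
    x, y in {0, 1} with x + y <= 1, i.e. the points (0,0), (1,0), (0,1)."""
    if x in (0, 1) and y in (0, 1) and x + y <= 1:
        return (factorial(1) / (factorial(x) * factorial(y) * factorial(1 - x - y))
                * p1**x * p2**y * (1 - p1 - p2)**(1 - x - y))
    return 0.0

p1, p2 = 0.3, 0.5                      # arbitrary illustrative parameters
support = [(0, 0), (1, 0), (0, 1)]

EX   = sum(x * ber_pmf(x, y, p1, p2) for x, y in support)
EY   = sum(y * ber_pmf(x, y, p1, p2) for x, y in support)
EXY  = sum(x * y * ber_pmf(x, y, p1, p2) for x, y in support)
VarX = sum(x**2 * ber_pmf(x, y, p1, p2) for x, y in support) - EX**2
VarY = sum(y**2 * ber_pmf(x, y, p1, p2) for x, y in support) - EY**2

print(EX, p1)                          # E(X)      = p1
print(EY, p2)                          # E(Y)      = p2
print(VarX, p1 * (1 - p1))             # Var(X)    = p1 (1 - p1)
print(VarY, p2 * (1 - p2))             # Var(Y)    = p2 (1 - p2)
print(EXY - EX * EY, -p1 * p2)         # Cov(X, Y) = -p1 p2

# Joint mgf M(s, t) = E[e^{sX + tY}], evaluated by summing over the support;
# summing the three terms gives the closed form 1 - p1 - p2 + p1 e^s + p2 e^t.
s, t = 0.2, -0.1
M = sum(exp(s * x + t * y) * ber_pmf(x, y, p1, p2) for x, y in support)
print(M, 1 - p1 - p2 + p1 * exp(s) + p2 * exp(t))
```

Because the support contains only three points, every expectation reduces to a three-term sum, which is why each closed-form expression in the theorem can be verified by direct enumeration as above.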