Wackerly, Mathematical Statistics with Applications, 7th ed. - Chapter 9: Properties of Point Estimators and Methods of Estimation

In this chapter: Relative Efficiency; Consistency; Sufficiency; The Rao–Blackwell Theorem and Minimum-Variance Unbiased Estimation; The Method of Moments; The Method of Maximum Likelihood; Some Large-Sample Properties of Maximum-Likelihood Estimators (Optional); References and Further Readings.

In Chapter 8, we presented some intuitive estimators for parameters that arise frequently in practical problems. An estimator θ̂ for a target parameter θ is a function of the random variables observed in a sample and therefore is itself a random variable. Consequently, an estimator has a probability distribution, the sampling distribution of the estimator. We noted earlier that, if E(θ̂) = θ, then the estimator has the (sometimes) desirable property of being unbiased. In this chapter, we undertake a more formal and detailed examination of some of the properties of point estimators, particularly the notions of efficiency, consistency, and sufficiency. We present a result, the Rao–Blackwell theorem, that provides a link between sufficient statistics and unbiased estimators for parameters. Generally speaking, an unbiased estimator with small variance is, or can be made to be, a function of a sufficient statistic. We also demonstrate a method that can be used to find minimum-variance unbiased estimators for parameters of interest. We then offer two other useful methods for deriving estimators: the method of moments and the method of maximum likelihood. Some properties of the estimators derived by these methods are also discussed.

Relative Efficiency

It usually is possible to obtain more than one unbiased estimator for the same target parameter θ. We mentioned in Chapter 8 that if θ̂1 and θ̂2 denote two unbiased estimators for the same parameter θ, we prefer to use the one with the smaller variance. That is, if both estimators are unbiased, θ̂1 is relatively more efficient than θ̂2 if V(θ̂2) > V(θ̂1). In fact, we use the ratio V(θ̂2)/V(θ̂1) to define the relative efficiency of two unbiased estimators.

Given two unbiased estimators θ̂1 and θ̂2 of a parameter θ, with variances V(θ̂1) and V(θ̂2), respectively, the efficiency of θ̂1 relative to θ̂2, denoted eff(θ̂1, θ̂2), is defined to be the ratio

    eff(θ̂1, θ̂2) = V(θ̂2) / V(θ̂1).

If θ̂1 and θ̂2 are unbiased estimators for θ, the efficiency of θ̂1 relative to θ̂2, eff(θ̂1, θ̂2), is greater than 1 only if V(θ̂2) > V(θ̂1). In this case, θ̂1 is a better unbiased estimator than θ̂2: V(θ̂2) = eff(θ̂1, θ̂2) · V(θ̂1) exceeds V(θ̂1), so θ̂1 is preferred to θ̂2. Similarly, if eff(θ̂1, θ̂2) is less than 1, say .73, then V(θ̂2) = (.73)V(θ̂1), and θ̂2 is preferred to θ̂1.

Let us consider an example involving two estimators for a population mean. Suppose that we wish to estimate the mean of a normal population. Let θ̂1 be the sample median, the middle observation when the sample measurements are ordered according to magnitude (n odd) or the average of the two middle observations (n even). Let θ̂2 be the sample mean. Although the proof is omitted here, it can be shown that the variance of the sample median, for large n, is V(θ̂1) = (1.2533)²(σ²/n), whereas the variance of the sample mean is V(θ̂2) = σ²/n. The efficiency of the sample median relative to the sample mean is therefore

    eff(θ̂1, θ̂2) = V(θ̂2) / V(θ̂1) = (σ²/n) / [(1.2533)²(σ²/n)] = 1 / (1.2533)² = .6366.

That is, the variance of the sample mean is approximately 64% of the variance of the sample median. Therefore, we would prefer to use the sample mean as the estimator for the population mean.
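To make the median-versus-mean comparison concrete, the following is a minimal Monte Carlo sketch, not taken from the text, that estimates eff(θ̂1, θ̂2) = V(θ̂2)/V(θ̂1) by simulating repeated normal samples and comparing the empirical variances of the sample median and the sample mean. The sample size, replication count, and seed are arbitrary choices made for illustration.

```python
# Monte Carlo estimate of eff(median, mean) = V(mean) / V(median) for normal samples.
# Illustrative sketch only; n, the number of replications, and the seed are arbitrary.
import numpy as np

rng = np.random.default_rng(seed=0)

n = 101           # observations per sample (odd, so the median is the middle order statistic)
n_reps = 20_000   # number of simulated samples
mu, sigma = 0.0, 1.0

samples = rng.normal(loc=mu, scale=sigma, size=(n_reps, n))
medians = np.median(samples, axis=1)   # theta_hat_1 computed on each sample
means = samples.mean(axis=1)           # theta_hat_2 computed on each sample

# eff(theta_hat_1, theta_hat_2) = V(theta_hat_2) / V(theta_hat_1)
eff = means.var(ddof=1) / medians.var(ddof=1)
print(f"estimated eff(median, mean): {eff:.3f}")
print(f"large-sample value 2/pi:     {2 / np.pi:.4f}")
```

For moderate to large n the estimated ratio should land near 2/π ≈ .6366, consistent with the large-sample variance formula for the median quoted above; for small n the ratio is somewhat larger.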
