[Figure 2.1: A diagram illustrating the calculation of conditional probability P(A | B). Once we know that the outcome is in B, the probability of A becomes P(A ∩ B) / P(B).]

2.1.2 Conditional probability and independence

Sometimes we have partial knowledge about the outcome of an experiment, and that naturally influences what experimental outcomes are possible. We capture this knowledge through the notion of conditional probability. This is the updated probability of an event given some knowledge. The probability of an event before we consider our additional knowledge is called the prior probability of the event, while the new probability that results from using our additional knowledge is referred to as the posterior probability of the event. Returning to example 1 (the chance of getting 2 heads when tossing 3 coins): if the first coin has been tossed and is a head, then of the 4 remaining possible basic outcomes, 2 result in 2 heads, and so the probability of getting 2 heads now becomes 1/2.

The conditional probability of an event A, given that an event B has occurred (P(B) > 0), is:

(2.2)  P(A | B) = P(A ∩ B) / P(B)

Even if P(B) = 0 we have that:

(2.3)  P(A ∩ B) = P(B) P(A | B) = P(A) P(B | A)    [the multiplication rule]

We can do the conditionalization either way because set intersection is symmetric (A ∩ B = B ∩ A). One can easily visualize this result by looking at the diagram in figure 2.1.

The generalization of this rule to multiple events is a central result that will be used throughout this book, the chain rule:

(2.4)  P(A₁ ∩ … ∩ Aₙ) = P(A₁) P(A₂ | A₁) P(A₃ | A₁ ∩ A₂) ⋯ P(Aₙ | ⋂ᵢ₌₁ⁿ⁻¹ Aᵢ)

The chain rule is used in many places in Statistical NLP, such as working out the properties of Markov models in chapter 9.

Two events A, B are independent of each other if P(A ∩ B) = P(A) P(B). Unless P(B) = 0, this is equivalent to saying that P(A) = P(A | B), i.e., knowing that B is the case does not affect the probability of A. This …
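The three-coin calculation above can be checked by brute-force enumeration of the sample space. The following is a minimal sketch (the variable names A, B, and the helper P are our own labels for the events and the probability function, not notation from the text):

```python
from itertools import product

# All 8 equiprobable outcomes of tossing 3 fair coins.
outcomes = list(product("HT", repeat=3))

A = [o for o in outcomes if o.count("H") == 2]   # event A: exactly 2 heads
B = [o for o in outcomes if o[0] == "H"]         # event B: first coin is a head
A_and_B = [o for o in A if o in B]               # A ∩ B

P = lambda event: len(event) / len(outcomes)     # uniform probability measure

# Definition (2.2): P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = P(A_and_B) / P(B)

print(P(A))          # prior probability of 2 heads: 3/8 = 0.375
print(p_A_given_B)   # posterior, given first coin is a head: 2/4 = 0.5
```

The prior 3/8 is updated to the posterior 1/2 once the first toss is known, exactly as in the text.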
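The chain rule (2.4) can likewise be verified numerically on the same sample space. This sketch uses three arbitrarily chosen events (A1, A2, A3 are illustrative names of our own) and checks that the joint probability equals the product of successive conditionals:

```python
from itertools import product
from math import isclose

outcomes = list(product("HT", repeat=3))

# Probability of an event given as a predicate over outcomes.
P = lambda ev: sum(1 for o in outcomes if ev(o)) / len(outcomes)

A1 = lambda o: o[0] == "H"            # first coin is a head
A2 = lambda o: o.count("H") >= 2      # at least 2 heads
A3 = lambda o: o[2] == "T"            # third coin is a tail

# Intersection of events, as a combined predicate.
both = lambda *evs: (lambda o: all(e(o) for e in evs))

# Left side of (2.4): the joint probability P(A1 ∩ A2 ∩ A3).
lhs = P(both(A1, A2, A3))

# Right side: P(A1) · P(A2|A1) · P(A3|A1 ∩ A2), each conditional via (2.2).
rhs = (P(A1)
       * P(both(A1, A2)) / P(A1)
       * P(both(A1, A2, A3)) / P(both(A1, A2)))

print(lhs, rhs, isclose(lhs, rhs))   # 0.125 0.125 True
```

Algebraically the conditionals telescope, which is why the chain rule holds for any ordering of the events.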
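The independence criterion P(A ∩ B) = P(A) P(B) can also be tested by enumeration. This sketch uses two fair dice rather than coins (a standard illustration, not an example from the text): "first die is even" is independent of "total is 7" but not of "total is 8":

```python
from itertools import product
from math import isclose

dice = list(product(range(1, 7), repeat=2))   # 36 equiprobable rolls
P = lambda ev: sum(1 for o in dice if ev(o)) / len(dice)

A = lambda o: o[0] % 2 == 0        # first die is even
B = lambda o: sum(o) == 7          # total is 7
C = lambda o: sum(o) == 8          # total is 8
inter = lambda e, f: (lambda o: e(o) and f(o))

# Independence test: P(E ∩ F) == P(E) · P(F).
independent = lambda e, f: isclose(P(inter(e, f)), P(e) * P(f))

print(independent(A, B))  # True:  P(A ∩ B) = 3/36 = (1/2)(6/36)
print(independent(A, C))  # False: P(A ∩ C) = 3/36 ≠ (1/2)(5/36)
```

Equivalently, P(B | A) = (3/36)/(1/2) = 1/6 = P(B), so knowing A does not change the probability of B, matching the P(A) = P(A | B) formulation above.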