Probability
Inclusion-Exclusion Rule
Inclusion-Exclusion Rule
For any two events \(A\) and \(B\), the probability that either \(A\) or \(B\) will occur is given by:
\[\mathbb{P}(A \cup B) = \mathbb{P}(A) + \mathbb{P}(B) - \mathbb{P}(A \cdot B)\]
where \(\mathbb{P}(A \cdot B)\), also written \(\mathbb{P}(A \cap B)\), is the probability of the intersection of the two events, i.e., the probability that both \(A\) and \(B\) occur.
If events \(A\) and \(B\) are mutually exclusive, then \(\mathbb{P}(A \cdot B) = 0\), and we get:
\[\mathbb{P}(A \cup B) = \mathbb{P}(A) + \mathbb{P}(B)\]
The rule can be extended to unions of an arbitrary number of events. For example, for three events \(A\), \(B\), and \(C\), we get:
\[\mathbb{P}(A \cup B \cup C) = \mathbb{P}(A) + \mathbb{P}(B) + \mathbb{P}(C) - \mathbb{P}(A \cdot B) - \mathbb{P}(A \cdot C) - \mathbb{P}(B \cdot C) + \mathbb{P}(A \cdot B \cdot C)\]
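As a quick sanity check, the two-event rule can be verified by simulation. The Python sketch below is not part of the original notes; the events are chosen arbitrarily for illustration, with \(A\) = "the roll is even" and \(B\) = "the roll is greater than 3" for a single roll of a fair die:

```python
import random

random.seed(0)
N = 100_000

# Roll a fair six-sided die N times; A = "roll is even", B = "roll > 3".
count_A = count_B = count_AB = count_A_or_B = 0
for _ in range(N):
    roll = random.randint(1, 6)
    in_A = roll % 2 == 0
    in_B = roll > 3
    count_A += in_A
    count_B += in_B
    count_AB += in_A and in_B
    count_A_or_B += in_A or in_B

p_A, p_B = count_A / N, count_B / N
p_AB, p_A_or_B = count_AB / N, count_A_or_B / N

# Inclusion-exclusion: P(A u B) = P(A) + P(B) - P(A . B)
print(f"direct estimate of P(A u B):  {p_A_or_B:.4f}")
print(f"P(A) + P(B) - P(A . B):       {p_A + p_B - p_AB:.4f}")
print(f"exact value 3/6 + 3/6 - 2/6:  {3/6 + 3/6 - 2/6:.4f}")
```

Here \(A \cup B = \{2, 4, 5, 6\}\), so both estimates should be close to \(4/6 \approx 0.6667\).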
Event Complement
For every event defined on a space of elementary outcomes, \(S\), we can define a counterpart event called its complement. The complement \(A^c\) of an event \(A\) consists of all outcomes that are in \(S\) but are not in \(A\). Events \(A\) and \(A^c\) are mutually exclusive by definition. Consequently:
Complementary Events
For any pair of complementary events \(A\) and \(A^c\):
\[\mathbb{P}(A) + \mathbb{P}(A^c) = 1, \qquad \text{i.e.,} \qquad \mathbb{P}(A^c) = 1 - \mathbb{P}(A)\]
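The complement rule is often the easiest route to a probability. A minimal Python sketch with an illustrative example of my own (not from the original text): the probability of rolling at least one six in four rolls of a fair die is \(1 - (5/6)^4\), since the complement "no six in four rolls" has probability \((5/6)^4\):

```python
import random

random.seed(1)
N = 200_000

# P(at least one six in four rolls) via the complement rule:
# P(A) = 1 - P(A^c), where A^c = "no six appears in four rolls".
hits = sum(
    any(random.randint(1, 6) == 6 for _ in range(4))
    for _ in range(N)
)

print(f"simulated P(at least one six): {hits / N:.4f}")
print(f"complement rule 1 - (5/6)^4:   {1 - (5 / 6) ** 4:.4f}")
```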
Conditional Probability
Conditional Probability
Given \(\mathbb{P}(B) > 0\), the conditional probability of an event \(A\) given that \(B\) has occurred is:
\[\mathbb{P}(A | B) = \frac{\mathbb{P}(A \cdot B)}{\mathbb{P}(B)}\]
Figure 1 shows a graphical description of a conditional probability \(\mathbb{P}(A | B)\). Once event \(A\) is conditioned on \(B\), \(B\) "becomes the sample space" and \(B\)'s Venn diagram expands by a factor of \(\frac{1}{\mathbb{P}(B)}\). The intersection \(AB\) in the expanded \(B\) becomes the event \(A|B\).
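Both readings of the definition (the ratio formula, and "\(B\) becomes the sample space") can be checked by exhaustive enumeration. A small Python sketch with illustrative events of my own choosing, \(A\) = "the two dice sum to 8" and \(B\) = "the first die is even":

```python
from itertools import product

# Enumerate the full sample space of two fair dice (36 equally likely outcomes).
S = list(product(range(1, 7), repeat=2))

A = {(d1, d2) for (d1, d2) in S if d1 + d2 == 8}  # "sum equals 8"
B = {(d1, d2) for (d1, d2) in S if d1 % 2 == 0}   # "first die is even"

p_B = len(B) / len(S)
p_AB = len(A & B) / len(S)

# Definition: P(A | B) = P(A . B) / P(B)
print(f"P(A | B) via the ratio formula: {p_AB / p_B:.4f}")

# Equivalent view: restrict the sample space to B and count A inside it.
print(f"P(A | B) by counting within B:  {len(A & B) / len(B):.4f}")
```

Both lines print \(3/18 \approx 0.1667\), since only \((2,6)\), \((4,4)\), and \((6,2)\) lie in \(A \cdot B\).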
Prior and Posterior Probabilities
The probability of a single event (\(\mathbb{P}(A)\) or \(\mathbb{P}(B)\)) is called a prior probability because it is assigned to an event before any information about related events is taken into account.
A conditional probability (e.g., \(\mathbb{P}(A|B)\)) is called a posterior probability because it is assigned after some information about a possibly related event is already known.
Independence
If \(A\) and \(B\) are independent, then we have:
\[\mathbb{P}(A \cdot B) = \mathbb{P}(A) \, \mathbb{P}(B)\]
Equivalently (when \(\mathbb{P}(B) > 0\)), \(A\) and \(B\) are independent if:
\[\mathbb{P}(A | B) = \mathbb{P}(A)\]
that is, knowing that \(B\) occurred does not change the probability of \(A\).
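A classic pair of independent events (chosen here for illustration, not taken from the original text): with two fair dice, \(A\) = "the first die is even" and \(B\) = "the sum is 7". A short Python check of both characterizations:

```python
from itertools import product

# Sample space of two fair dice.
S = list(product(range(1, 7), repeat=2))

A = {(d1, d2) for (d1, d2) in S if d1 % 2 == 0}   # "first die is even"
B = {(d1, d2) for (d1, d2) in S if d1 + d2 == 7}  # "sum equals 7"

p_A = len(A) / len(S)
p_B = len(B) / len(S)
p_AB = len(A & B) / len(S)

# Independence: P(A . B) should equal P(A) P(B),
# and equivalently P(A | B) should equal P(A).
print(f"P(A . B) = {p_AB:.4f},  P(A) P(B) = {p_A * p_B:.4f}")
print(f"P(A | B) = {p_AB / p_B:.4f},  P(A) = {p_A:.4f}")
```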
Chain Rule
Chain Rule
The probability that events \(A\) and \(B\) will both occur is:
\[\mathbb{P}(A \cdot B) = \mathbb{P}(A) \, \mathbb{P}(B | A) = \mathbb{P}(B) \, \mathbb{P}(A | B)\]
Generalizing the chain rule to \(n\) events:
\[\mathbb{P}(A_1 \cdot A_2 \cdots A_n) = \mathbb{P}(A_1) \, \mathbb{P}(A_2 | A_1) \, \mathbb{P}(A_3 | A_1 \cdot A_2) \cdots \mathbb{P}(A_n | A_1 \cdot A_2 \cdots A_{n-1})\]
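The chain rule is the natural tool for sequential experiments without replacement. As an illustration of my own (not the notes' example), the probability of drawing two aces in a row from a standard 52-card deck is \(\frac{4}{52} \cdot \frac{3}{51}\); the Python sketch below compares this with a simulation:

```python
import random

random.seed(2)
N = 200_000

# A standard deck: 4 aces among 52 cards; draw two cards without replacement.
# Chain rule: P(both aces) = P(first ace) * P(second ace | first ace)
#                          = (4/52) * (3/51)
deck = ["ace"] * 4 + ["other"] * 48

hits = 0
for _ in range(N):
    draw = random.sample(deck, 2)  # two cards, no replacement
    hits += draw[0] == "ace" and draw[1] == "ace"

print(f"simulated P(two aces):   {hits / N:.5f}")
print(f"chain rule (4/52)(3/51): {(4 / 52) * (3 / 51):.5f}")
```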
Rule of Total Probability and Hypotheses
The rule of total probability expresses the probability of an event \(A\) as the weighted average of its conditional probabilities. The events that \(A\) is conditioned upon need to be mutually exclusive and must partition the sample space \(S\).
Events \(H_1\), \(H_2\), \(\ldots\), \(H_n\) form a partition of the sample space \(S\) iff:
- They are mutually exclusive (\(H_i \cdot H_j = \emptyset, \ i \neq j\))
- Their union is the sample space \(S\), i.e., \(\cup^n_{i = 1} H_i = S\).
The events \(H_1\), \(\ldots\), \(H_n\) are called hypotheses. By definition, we have:
\[\mathbb{P}(H_1) + \mathbb{P}(H_2) + \cdots + \mathbb{P}(H_n) = 1\]
Total Probability
Let the event of interest \(A\) happen under any of the hypotheses \(H_i\) with a known (conditional) probability \(\mathbb{P}(A|H_i)\). Assume, in addition, that the probabilities of the hypotheses \(H_1\), \(\ldots\), \(H_n\) are known. Then:
\[\mathbb{P}(A) = \mathbb{P}(A|H_1) \, \mathbb{P}(H_1) + \mathbb{P}(A|H_2) \, \mathbb{P}(H_2) + \cdots + \mathbb{P}(A|H_n) \, \mathbb{P}(H_n)\]
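A standard illustration (the numbers below are invented for this sketch, not taken from the original text): three machines \(H_1\), \(H_2\), \(H_3\) produce all items in a factory, so "which machine made the item" partitions the sample space, and the overall defect rate is the weighted average of the per-machine defect rates:

```python
# Hypotheses H1, H2, H3 partition "which machine made the item".
# Their probabilities are the machines' shares of production and sum to 1.
p_H = {"H1": 0.5, "H2": 0.3, "H3": 0.2}             # P(H_i)
p_A_given_H = {"H1": 0.01, "H2": 0.02, "H3": 0.05}  # P(defective | H_i)

# Rule of total probability: P(A) = sum_i P(A | H_i) P(H_i)
p_A = sum(p_A_given_H[h] * p_H[h] for h in p_H)
print(f"P(defective item) = {p_A:.4f}")
# 0.5*0.01 + 0.3*0.02 + 0.2*0.05 = 0.021
```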