Collection of notes for various classes I've taken.
The Addition Rule of Probability is a fundamental principle for finding the probability of the union of two events, $A$ and $B$. It states that the probability of either event $A$ or event $B$ occurring is the sum of their individual probabilities minus the probability of their intersection. This subtraction is crucial because it prevents double-counting the outcomes that are common to both events.
\[P(A \cup B) = P(A) + P(B) - P(A \cap B)\]
De Morgan’s Laws provide a way to relate the union and intersection of events and their complements. The first law states that the complement of the union of two events is the intersection of their complements; the second states that the complement of the intersection of two events is the union of their complements. Applying the second law:
\[P(A^{C}\cup B^{C}) = P((A\cap B)^{C}) = 1 - P(A \cap B)\]
Permutations and Combinations are methods for counting the number of possible arrangements of objects. The key difference lies in whether the order of selection matters.
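Both identities can be checked by brute force on a small finite sample space. A minimal sketch using two fair dice (the specific events $A$ and $B$ below are illustrative choices, not from the notes), with `Fraction` for exact arithmetic:

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 equally likely outcomes of rolling two fair dice.
omega = list(product(range(1, 7), repeat=2))
A = {o for o in omega if o[0] + o[1] == 7}   # event: the sum is 7
B = {o for o in omega if o[0] == 1}          # event: the first die shows 1

def pr(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event), len(omega))

# Addition Rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert pr(A | B) == pr(A) + pr(B) - pr(A & B)

# De Morgan: P(A^C ∪ B^C) = P((A ∩ B)^C) = 1 - P(A ∩ B)
complement_of_intersection = {o for o in omega if not (o in A and o in B)}
assert pr(complement_of_intersection) == 1 - pr(A & B)
```

Here $P(A) = P(B) = 6/36$ and $P(A \cap B) = 1/36$ (only the outcome $(1,6)$ lies in both), so the union has probability $11/36$, which the double-counting subtraction delivers exactly.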
Combinations ($C(n,k)$): The number of ways to choose $k$ items from a set of $n$ items, where the order does not matter. This is used for committees, groups, or selections.
\[C(n,k) = \binom{n}{k} = \frac{n!}{k!(n-k)!}\]
Permutations ($P(n, k)$): The number of ways to arrange $k$ items from a set of $n$ items, where the order does matter. This is used for rankings, codes, or ordered arrangements.
\[P(n, k) = \frac{n!}{(n-k)!}\]
Conditional Probability is the probability of an event occurring given that another event has already occurred. It is denoted as $P(A \mid B)$, read as “the probability of $A$ given $B$”, and defined (for $P(B) > 0$) as
\[P(A \mid B) = \frac{P(A \cap B)}{P(B)}\]
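A quick sketch checking the counting formulas with the standard library, plus one conditional probability computed from counts (the scenarios are illustrative, not from the notes):

```python
from fractions import Fraction
from itertools import product
from math import comb, perm

# Combinations: committees of 2 chosen from 5 people (order irrelevant).
print(comb(5, 2))   # 10 = 5!/(2!·3!)
# Permutations: gold/silver finish among 5 runners (order matters).
print(perm(5, 2))   # 20 = 5!/3!

# Conditional probability on two fair dice: P(sum = 7 | first die = 1).
omega = list(product(range(1, 7), repeat=2))
B = [o for o in omega if o[0] == 1]           # given: first die is 1
A_and_B = [o for o in B if o[0] + o[1] == 7]  # ...and the sum is 7
p_given = Fraction(len(A_and_B), len(omega)) / Fraction(len(B), len(omega))
print(p_given)  # 1/6: only (1, 6) works among the six outcomes with a 1 first
```

Note that every permutation count is a combination count times $k!$, since each unordered selection of $k$ items can be ordered $k!$ ways.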
Independence is a special case where the occurrence of one event does not affect the probability of the other. Two events $A$ and $B$ are independent if and only if one of the following equivalent conditions is met:
$P(A \cap B) = P(A) \cdot P(B)$
$P(A \mid B) = P(A)$ (when $P(B) > 0$)
Corollary: If $A$ and $B$ are independent, so are $A$ and $B^{C}$, $A^{C}$ and $B$, and $A^{C}$ and $B^{C}$.
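A minimal sketch verifying the product condition and the corollary about complements, again on two fair dice; events defined on different dice are independent by construction (the specific events are illustrative choices):

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))
A = {o for o in omega if o[0] % 2 == 0}  # first die even:      P(A) = 1/2
B = {o for o in omega if o[1] >= 5}      # second die is 5 or 6: P(B) = 1/3

def pr(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event), len(omega))

# Product condition for independence: P(A ∩ B) = P(A)·P(B)
assert pr(A & B) == pr(A) * pr(B)

# Complements inherit independence
Ac, Bc = omega - A, omega - B
assert pr(A & Bc) == pr(A) * pr(Bc)
assert pr(Ac & B) == pr(Ac) * pr(B)
assert pr(Ac & Bc) == pr(Ac) * pr(Bc)
```

Here $P(A \cap B) = 6/36 = 1/6 = \tfrac12 \cdot \tfrac13$, and the same product test passes for every pairing of complements.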