Nate's Notes

Collection of notes for various classes I've taken.


September 1

Set Operations and Probability

The Addition Rule of Probability is a fundamental principle for finding the probability of the union of two events, $A$ and $B$. It states that the probability of either event $A$ or event $B$ occurring is the sum of their individual probabilities minus the probability of their intersection. This subtraction is crucial because it prevents double-counting the outcomes that are common to both events.

\[P(A \cup B) = P(A) + P(B) - P(A \cap B)\]
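As a sketch, the Addition Rule can be verified by counting on a small sample space. The sample space (one roll of a fair die) and the events $A$ = "even" and $B$ = "at least 4" are illustrative choices, not from the notes above:

```python
# Verifying the Addition Rule on a fair-die sample space.
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}          # sample space: one roll of a fair die
A = {2, 4, 6}                        # event A: roll is even
B = {4, 5, 6}                        # event B: roll is at least 4

def P(event):
    """Probability under the uniform (fair-die) model."""
    return Fraction(len(event), len(omega))

lhs = P(A | B)                       # P(A ∪ B) directly
rhs = P(A) + P(B) - P(A & B)         # addition rule
print(lhs, rhs)                      # both equal 2/3
```

Here $P(A) = P(B) = 1/2$ and $P(A \cap B) = 1/3$, so both sides come out to $2/3$; the subtraction removes the outcomes $\{4, 6\}$ that would otherwise be counted twice.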

De Morgan’s Laws for Probability

De Morgan’s Laws relate the union and intersection of events to their complements. The first law states that the complement of the union of two events is the intersection of their complements. The second law states that the complement of the intersection of two events is the union of their complements.

\[P(A^{C}\cup B^{C}) = P((A\cap B)^{C}) = 1 - P(A \cap B)\]
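As a quick check, the second law can be confirmed on a small sample space. The die-roll events below are illustrative choices:

```python
# Checking De Morgan's law: P(Aᶜ ∪ Bᶜ) = 1 − P(A ∩ B).
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}           # one roll of a fair die
A = {2, 4, 6}                         # roll is even
B = {4, 5, 6}                         # roll is at least 4

def P(event):
    return Fraction(len(event), len(omega))

Ac = omega - A                        # complement of A
Bc = omega - B                        # complement of B
print(P(Ac | Bc), 1 - P(A & B))       # both equal 2/3
```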

Permutations and Combinations

Permutations and Combinations are methods for counting the number of possible arrangements of objects. The key difference lies in whether the order of selection matters: a permutation is an ordered arrangement of $k$ objects chosen from $n$, counted by $P(n, k) = \frac{n!}{(n-k)!}$, while a combination is an unordered selection, counted by $\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$.

Conditional Probability and Independence

Conditional Probability is the probability of an event occurring given that another event has already occurred. It is denoted as $P(A \mid B)$, which is read as “the probability of $A$ given $B$”.
\[P(A|B) = \frac{P(A \cap B)}{P(B)} , \quad P(B) > 0\]
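As a sketch, conditioning amounts to restricting the sample space to $B$ and counting within it. The die-roll events below are illustrative choices:

```python
# Conditional probability by counting: P(A|B) = P(A ∩ B) / P(B).
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}           # one roll of a fair die
A = {2, 4, 6}                         # roll is even
B = {4, 5, 6}                         # roll is at least 4

def P(event):
    return Fraction(len(event), len(omega))

p_a_given_b = P(A & B) / P(B)
print(p_a_given_b)                    # 2/3: of {4, 5, 6}, the even outcomes are {4, 6}
```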

Independence is a special case where the occurrence of one event does not affect the probability of the other. Two events $A$ and $B$ are independent if and only if any of the following equivalent conditions holds:

\[P(A \cap B) = P(A)\,P(B)\]
\[P(A \mid B) = P(A), \quad P(B) > 0\]
\[P(B \mid A) = P(B), \quad P(A) > 0\]

Corollary: If $A$ and $B$ are independent, then so are the pairs $A$ and $B^{C}$, $A^{C}$ and $B$, and $A^{C}$ and $B^{C}$.
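As a sketch, independence and the corollary can be checked by counting on two fair coin flips, where the outcome of one flip is genuinely unrelated to the other. The events below (first flip heads, second flip heads) are illustrative choices:

```python
# Checking independence P(A ∩ B) = P(A)P(B) on two fair coin flips,
# and verifying that the complements are independent as well.
from fractions import Fraction
from itertools import product

omega = set(product("HT", repeat=2))   # {(H,H), (H,T), (T,H), (T,T)}
A = {w for w in omega if w[0] == "H"}  # first flip is heads
B = {w for w in omega if w[1] == "H"}  # second flip is heads

def P(event):
    return Fraction(len(event), len(omega))

print(P(A & B) == P(A) * P(B))         # True: A and B are independent

# Corollary: the complements are independent too.
Ac, Bc = omega - A, omega - B
print(P(Ac & Bc) == P(Ac) * P(Bc))     # True
print(P(A & Bc) == P(A) * P(Bc))       # True
```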