# Introduction to Probability and Statistics: Chapter 2

### Terminology

• Experiment: a repeatable procedure with well-defined possible outcomes.
• Sample space: the set of all possible outcomes, usually denoted by $\Omega$ or $S$.
• Event: a subset of the sample space.
• Probability function: a function that assigns a probability to each outcome.
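
The terms above can be made concrete with a small example (assumed here, not from the text): the experiment "roll two fair dice", whose sample space is the set of ordered pairs of faces.

```python
import itertools

# Experiment: roll two fair dice.
# Sample space: all ordered pairs of faces (1-6, 1-6).
sample_space = set(itertools.product(range(1, 7), repeat=2))
assert len(sample_space) == 36

# Event: "the sum of the faces is 7" -- a subset of the sample space.
event = {outcome for outcome in sample_space if sum(outcome) == 7}
# event contains the 6 outcomes (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)
```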

### Definition of a discrete sample space

Definition: A discrete sample space is one whose outcomes are listable; it can be finite or infinite.

### A more precise definition of probability function

For a discrete sample space $S$, a probability function $P$ assigns to each outcome $\omega$ a probability $P(\omega)$ that must conform to the following two rules.

• $0 \leq P(\omega) \leq 1$ for every outcome $\omega$
• $\sum_{j} P(\omega_j) = 1$, where the sum runs over all outcomes of $S$

Note: The probability of an event $E$ is then $P(E) = \sum_{\omega \in E} P(\omega)$, the sum of the probabilities of the outcomes in $E$.
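
As a sketch of these rules (assuming a single fair die as the example, which is not specified in the text), a probability function can be represented as a mapping from outcomes to probabilities, and an event's probability as a sum over its outcomes:

```python
from fractions import Fraction

# Probability function for one fair die: each outcome gets 1/6.
P = {omega: Fraction(1, 6) for omega in range(1, 7)}

# First rule: every probability lies in [0, 1].
assert all(0 <= p <= 1 for p in P.values())
# Second rule: the probabilities of all outcomes sum to 1.
assert sum(P.values()) == 1

# The probability of an event E is the sum of P(omega) over omega in E.
E = {2, 4, 6}  # "roll an even number"
prob_E = sum(P[omega] for omega in E)  # 3/6 = 1/2
```

Using exact `Fraction` arithmetic avoids floating-point rounding when checking that the probabilities sum to exactly 1.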

### Rules of probability

• Rule 1: $P(A^c) = 1 - P(A)$
• Rule 2: If $L$ and $R$ are disjoint, then $P(L \cup R) = P(L) + P(R)$
• Rule 3: For any $L$ and $R$ (not necessarily disjoint), $P(L \cup R) = P(L) + P(R) - P(L \cap R)$

Note: Rule 3 is a more general version of Rule 2: if $L$ and $R$ are disjoint, then $L \cap R = \emptyset$, so $P(L \cap R) = 0$ and Rule 3 reduces to Rule 2.
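
The three rules can be checked numerically on a small example (a single fair die is assumed here; the events below are illustrative, not from the text):

```python
from fractions import Fraction

# Single fair die: under the uniform distribution, an event's
# probability is |event| / |S|.
S = set(range(1, 7))

def P(event):
    return Fraction(len(event), len(S))

A = {1, 2, 3}   # "roll at most 3"
L = {2, 4, 6}   # "roll an even number"
R = {5, 6}      # "roll at least 5"

# Rule 1: complement.
assert P(S - A) == 1 - P(A)
# Rule 3: inclusion-exclusion (L and R overlap: 6 is in both).
assert P(L | R) == P(L) + P(R) - P(L & R)
# Rule 2: the disjoint special case, where P(L ∩ R) = 0.
D1, D2 = {1, 2}, {5, 6}
assert P(D1 & D2) == 0
assert P(D1 | D2) == P(D1) + P(D2)
```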