Introduction to Probability Models
Lecture 23
Qi Wang, Department of Statistics
Oct 17, 2018
Last Three Discrete Random Variables
- Poisson
- Geometric
- Negative Binomial
Poisson Distribution
- $X \sim Poisson(\lambda)$
- The definition of $X$: the number of successes per $\underline{\hspace{1cm}}$, where $\underline{\hspace{1cm}}$ can be a unit of time, length, space, and so on
- Support: $\{0, 1, 2, \cdots\}$
- Parameter: $\lambda$, the average success rate per $\underline{\hspace{1cm}}$
- PMF: $P_X(x) = \frac{e^{-\lambda} \lambda^x}{x!}$
- Expected Value: $E[X] = \lambda$
- Variance: $Var(X) = \lambda$
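The PMF, mean, and variance above can be checked numerically; a minimal sketch, with $\lambda = 3$ as a hypothetical rate:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**x / factorial(x)

lam = 3.0  # hypothetical: an average of 3 successes per unit

# Truncate the infinite support {0, 1, 2, ...} far into the tail
support = range(100)
total = sum(poisson_pmf(x, lam) for x in support)            # should be ~1
mean = sum(x * poisson_pmf(x, lam) for x in support)         # should be ~lambda
var = sum((x - mean)**2 * poisson_pmf(x, lam) for x in support)  # ~lambda
```

The truncation at 100 terms is harmless here because the Poisson tail decays faster than geometrically.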
Geometric Distribution
- $X \sim Geom(p)$
- The definition of $X$ : the number of trials to get the first success
- Support: $\{1, 2, \cdots\}$, NOTE: NO ZERO!
- Parameter: $p$, the probability of success in one trial
- PMF: $P_X(x) = p(1-p)^{x - 1}$
- Expected Value: $E[X] = \frac{1}{p}$
- Variance: $Var(X) = \frac{1 - p}{p^2}$
- Tail Probability formula: $P(X > k) = (1 - p)^k$
- Memoryless Property: $P(X > s + t| X > s) = P(X > t)$ and $P(X < s + t| X > s) = P(X < t)$
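Both the tail probability formula and the memoryless property can be verified directly from the PMF; a small check, with $p = 0.25$, $k = 5$, and $s = 4$, $t = 3$ chosen as hypothetical values:

```python
def geom_pmf(x, p):
    """P(X = x) for X ~ Geom(p); support starts at 1, not 0."""
    return p * (1 - p)**(x - 1)

def geom_tail(k, p):
    """P(X > k) = (1 - p)^k."""
    return (1 - p)**k

p = 0.25

# Tail formula matches summing the PMF over {k+1, k+2, ...}
k = 5
tail_by_sum = sum(geom_pmf(x, p) for x in range(k + 1, 300))

# Memoryless: P(X > s + t | X > s) = P(X > s + t) / P(X > s) = P(X > t)
s, t = 4, 3
cond = geom_tail(s + t, p) / geom_tail(s, p)
```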
Negative Binomial Distribution
- $X \sim NegBin(r, p)$ or $X \sim NB(r, p)$
- The definition of $X$: the number of trials to get the $r^{\text{th}}$ success
- Support: $\{r, r + 1, r + 2, \cdots\}$
- Parameters:
- $p$: the probability of success in one trial
- $r$: the number of successes of interest
- PMF: $P_X(x) = \binom{x-1}{r-1}p^r(1-p)^{x - r}$
- Expected Value: $E[X] = \frac{r}{p}$
- Variance: $Var(X) = \frac{r(1 - p)}{p^2}$
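As with the Poisson case, the PMF and mean can be checked by summing over a truncated support; a sketch with the hypothetical choice $r = 3$, $p = 0.4$:

```python
from math import comb

def negbin_pmf(x, r, p):
    """P(X = x) for X ~ NB(r, p): the (r-1) earlier successes can fall
    anywhere among the first (x-1) trials, and trial x is a success."""
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

r, p = 3, 0.4  # hypothetical parameters

# Support starts at r (need at least r trials for r successes)
support = range(r, 400)
total = sum(negbin_pmf(x, r, p) for x in support)      # should be ~1
mean = sum(x * negbin_pmf(x, r, p) for x in support)   # should be ~r/p
```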
Properties of Expected Value and Variance
$X$ and $Y$ are random variables; $c$ and $d$ are constants.
- $E[c] = c$
- $E[cX] = cE[X]$
- $E[X + Y] = E[X] + E[Y]$
- $E[cX + dY] = cE[X] + dE[Y]$
- $Var(X) = E[(X - E[X])^2] = E[X^2]-E[X]^2$
- $Var(c) = 0$
- $Var(cX) = c^2 Var(X)$
- If X and Y are independent, $Var(X + Y) = Var(X) + Var(Y)$
- If X and Y are independent, $Var(cX + dY) = c^2Var(X) + d^2Var(Y)$
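The last identity can be confirmed exactly on small finite distributions (the two distributions and the constants below are hypothetical examples, not from the lecture):

```python
# Two independent discrete random variables, given as {value: probability}
X = {0: 0.5, 1: 0.5}          # a fair coin
Y = {1: 0.2, 2: 0.3, 3: 0.5}

def E(dist):
    """Expected value of a finite discrete distribution."""
    return sum(v * p for v, p in dist.items())

def Var(dist):
    """Variance via Var = E[(X - E[X])^2]."""
    mu = E(dist)
    return sum((v - mu)**2 * p for v, p in dist.items())

c, d = 2.0, -3.0

# Build the distribution of cX + dY; independence means joint
# probabilities multiply
combo = {}
for vx, px in X.items():
    for vy, py in Y.items():
        key = c * vx + d * vy
        combo[key] = combo.get(key, 0.0) + px * py

lhs = Var(combo)                        # Var(cX + dY)
rhs = c**2 * Var(X) + d**2 * Var(Y)     # c^2 Var(X) + d^2 Var(Y)
```

Setting $c = d = 1$ recovers the plain sum rule $Var(X + Y) = Var(X) + Var(Y)$.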