Discrete random variable

Geometric distribution

The geometric distribution describes the number of failures
of independent Bernoulli trials with success probability $p$
before the first success.
It has the probability mass function
$$
\mathbb{P}(X = x) = (1-p)^x p
$$
for $x \in \{0, 1, 2, \dots\}$.
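As a quick sanity check (an illustrative Python sketch, not part of the note; the parameter $p = 0.3$ is arbitrary), the PMF sums to one:

```python
# Sanity check: the geometric PMF P(X = x) = (1 - p)^x * p
# should sum to 1 over x = 0, 1, 2, ...
p = 0.3

def pmf(x: int) -> float:
    """Probability of exactly x failures before the first success."""
    return (1 - p) ** x * p

# Truncate the infinite sum; the tail beyond x = 500 is negligible here.
total = sum(pmf(x) for x in range(500))
assert abs(total - 1.0) < 1e-9
```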

This is related to the [[negative binomial distribution]],
which is the distribution of a sum of i.i.d. geometric variables.
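The relationship can be checked by discrete convolution (an illustrative Python sketch with an arbitrary $p = 0.4$): the PMF of the sum of two i.i.d. geometric variables matches the negative binomial PMF with $r = 2$ successes.

```python
from math import comb

p = 0.4  # arbitrary success probability for the check
q = 1 - p

def geom_pmf(x: int) -> float:
    """P(X = x): x failures before the first success."""
    return q ** x * p

def negbin_pmf(x: int, r: int) -> float:
    """P(X = x): x failures before the r-th success."""
    return comb(x + r - 1, x) * p ** r * q ** x

# PMF of X1 + X2 via convolution of the geometric PMF with itself.
for x in range(20):
    conv = sum(geom_pmf(k) * geom_pmf(x - k) for k in range(x + 1))
    assert abs(conv - negbin_pmf(x, 2)) < 1e-12
```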

Properties

Let $q = 1 - p$
and let $X$ be geometrically distributed with success probability $p$.

  1. Expectation: $\Ex[X] = \dfrac{q}{p}$ ^p1
  2. Variance: $\operatorname{Var}[X] = \dfrac{q}{p^2}$ ^p2
  3. Moment-generating function:

$$
\begin{align*}
M_{X} : (-\infty, -\ln q) &\to \mathbb{R} \\
t &\mapsto \frac{p}{1-q\mathrm{e}^t}
\end{align*}
$$
^p3
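The three properties above admit a quick numerical sanity check (an illustrative Python sketch, not part of the note; `p = 0.25` and `t = 0.1` are arbitrary choices with $t < -\ln q$): truncated series for the mean, variance, and MGF are compared against the closed forms.

```python
from math import exp, log

# Arbitrary parameters for the check (any 0 < p < 1 and t < -ln q work).
p, t = 0.25, 0.1
q = 1 - p
assert t < -log(q)  # t must lie inside the MGF's domain

xs = range(3000)                 # truncation; tail terms are negligible here
pmf = [q ** x * p for x in xs]   # P(X = x) = (1 - p)^x p

mean = sum(x * w for x, w in zip(xs, pmf))
var = sum((x - mean) ** 2 * w for x, w in zip(xs, pmf))
mgf = sum(exp(t * x) * w for x, w in zip(xs, pmf))

assert abs(mean - q / p) < 1e-9                 # E[X] = q/p = 3
assert abs(var - q / p ** 2) < 1e-9             # Var[X] = q/p^2 = 12
assert abs(mgf - p / (1 - q * exp(t))) < 1e-9   # M_X(t) = p/(1 - q e^t)
```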

> [!check]- Proof of 1
>
> Invoking the expansion for a [[geometric series]],
> $$
> \begin{align*}
> \Ex[X] &= \sum_{x=0}^\infty x\,\mathbb{P}(X=x)
> = \sum_{x=0}^\infty x(1-p)^x p \\
> &= (1-p)p\sum_{x=0}^\infty x (1-p)^{x-1} \\
> &= (1-p) p\sum_{x=0}^\infty - \frac{d}{dp}\left[(1-p)^x\right] \\
> &= (p-1)p\, \frac{d}{dp} \sum_{x=0}^\infty(1-p)^x \\
> &= (p-1)p \,\frac{d}{dp} \frac{1}{1-(1-p)} \\
> &= (p-1)p \,\frac{d}{dp} p^{-1}
> = (1-p)p^{-1}
> \end{align*}
> $$
> as claimed, proving [[#^p1|^P1]].
>
> Alternatively, we may invoke [[conditional expected value]].
> Let $S$ denote the event that the first trial is successful.
> Then noting that $(X \mid S^c) \sim X + 1$ and $\Ex[X \mid S] = 0$, we have
> $$
> \begin{align*}
> \Ex[X] &= \Ex[X \mid S]\, p + \Ex[X \mid S^c]\, q \\
> &= \Ex[1+X]\, q = q(1+\Ex[X])
> \end{align*}
> $$
> whence
> $$
> \Ex[X] = \frac{q}{p}
> $$
> proving [[#^p1|^P1]]. <span class="QED"/>

---

#state/develop | #lang/en | #SemBr