2.4 Important discrete distributions
We're now going to quickly go through a few important discrete probability distributions. By discrete we mean the sample space is countable. The sample space is $\Omega = \{\omega_1, \omega_2, \cdots\}$ and $p_i = P(\{\omega_i\})$.
Definition (Bernoulli distribution). Suppose we toss a coin. Then $\Omega = \{H, T\}$ and $p \in [0, 1]$. The Bernoulli distribution, denoted $B(1, p)$, has
$$P(H) = p,\quad P(T) = 1 - p.$$
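For example, a minimal Python sketch that simulates Bernoulli trials, with $p = 0.3$ chosen arbitrarily:

import random

def bernoulli_trial(p):
    # Returns 1 (heads) with probability p, and 0 (tails) otherwise.
    return 1 if random.random() < p else 0

p = 0.3
samples = [bernoulli_trial(p) for _ in range(100_000)]
print(sum(samples) / len(samples))  # sample mean is roughly p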
Definition (Binomial distribution). Suppose we toss a coin $n$ times, each with probability $p$ of getting heads. Then
$$P(HHTT\cdots T) = pp(1 - p)\cdots(1 - p).$$
So
$$P(\text{two heads}) = \binom{n}{2}p^2(1 - p)^{n - 2}.$$
In general,
$$P(k\text{ heads}) = \binom{n}{k}p^k(1 - p)^{n - k}.$$
We call this the binomial distribution and write it as B(n, p).
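A minimal Python sketch of these probabilities, with $n = 10$ and $p = 0.5$ as arbitrary example values:

import math

def binomial_pmf(k, n, p):
    # P(k heads) = C(n, k) * p^k * (1 - p)^(n - k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.5
print(binomial_pmf(2, n, p))                             # P(two heads)
print(sum(binomial_pmf(k, n, p) for k in range(n + 1)))  # probabilities sum to 1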
Definition (Geometric distribution). Suppose we toss a coin with probability $p$ of getting heads. The probability of having a head after $k$ consecutive tails is
$$p_k = (1 - p)^k p.$$
This is the geometric distribution. We say it is memoryless because the number of tails we've got in the past does not give us any information about how long we'll have to wait until we get a head.
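Since $P(\text{at least } k \text{ tails before the first head}) = (1 - p)^k$, the memoryless property can be checked directly; a minimal Python sketch, with $p = 0.25$, $m = 3$, $k = 5$ chosen arbitrarily:

def geometric_pmf(k, p):
    # P(exactly k tails before the first head) = (1 - p)^k * p
    return (1 - p)**k * p

def at_least(k, p):
    # P(at least k tails before the first head) = (1 - p)^k
    return (1 - p)**k

p, m, k = 0.25, 3, 5
# Memorylessness: P(at least m + k tails | at least m tails) = P(at least k tails).
print(at_least(m + k, p) / at_least(m, p))  # equals (1 - p)^k
print(at_least(k, p))                       # same value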
Definition (Hypergeometric distribution). Suppose we have an urn with $n_1$ red balls and $n_2$ black balls. We choose $n$ balls. The probability that there are $k$ red balls is
$$P(k \text{ red}) = \frac{\binom{n_1}{k}\binom{n_2}{n - k}}{\binom{n_1 + n_2}{n}}.$$
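A minimal Python sketch of this formula, with $n_1 = 5$, $n_2 = 7$, $n = 4$ as arbitrary example values:

import math

def hypergeometric_pmf(k, n1, n2, n):
    # P(k red) = C(n1, k) * C(n2, n - k) / C(n1 + n2, n)
    return math.comb(n1, k) * math.comb(n2, n - k) / math.comb(n1 + n2, n)

n1, n2, n = 5, 7, 4
print(hypergeometric_pmf(2, n1, n2, n))                             # P(2 red)
print(sum(hypergeometric_pmf(k, n1, n2, n) for k in range(n + 1)))  # sums to 1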
Definition (Poisson distribution). The Poisson distribution, denoted $P(\lambda)$, is
$$p_k = \frac{\lambda^k}{k!}e^{-\lambda}$$
for $k \in \mathbb{N}$.
What is this weird distribution? It is a distribution used to model rare events. Suppose that an event happens at a rate of $\lambda$. We can think of this as there being a lot of trials, say $n$ of them, and each has a probability $\lambda/n$ of succeeding. As we take the limit $n \to \infty$, we obtain the Poisson distribution.
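A minimal Python sketch of the Poisson probabilities, with $\lambda = 2$ as an arbitrary example rate:

import math

def poisson_pmf(k, lam):
    # p_k = lam^k / k! * e^(-lam)
    return lam**k / math.factorial(k) * math.exp(-lam)

lam = 2.0
print(poisson_pmf(0, lam), poisson_pmf(1, lam), poisson_pmf(2, lam))
print(sum(poisson_pmf(k, lam) for k in range(50)))  # approximately 1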
Theorem (Poisson approximation to binomial). Suppose $n \to \infty$ and $p \to 0$ such that $np = \lambda$. Then
$$q_k = \binom{n}{k}p^k(1 - p)^{n - k} \to \frac{\lambda^k}{k!}e^{-\lambda}.$$
Proof.
$$q_k = \binom{n}{k}p^k(1 - p)^{n - k} = \frac{1}{k!}\cdot\frac{n(n - 1)\cdots(n - k + 1)}{n^k}(np)^k\left(1 - \frac{np}{n}\right)^{n - k} \to \frac{1}{k!}\lambda^k e^{-\lambda},$$
since $(1 - a/n)^n \to e^{-a}$.
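This convergence can also be checked numerically; the following Python sketch compares $q_k$ with the Poisson limit for increasing $n$, where $\lambda = 3$ and $k = 2$ are arbitrary example values:

import math

def binomial_pmf(k, n, p):
    # q_k = C(n, k) * p^k * (1 - p)^(n - k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # lam^k / k! * e^(-lam)
    return lam**k / math.factorial(k) * math.exp(-lam)

lam, k = 3.0, 2
for n in (10, 100, 10_000):
    print(n, binomial_pmf(k, n, lam / n))  # approaches the Poisson value below
print(poisson_pmf(k, lam))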