
Probability

Probability is the branch of mathematics which studies the possible outcomes of given events together with their relative likelihoods and distributions. In common usage, the word ``probability'' means the chance that a particular event (or set of events) will occur, expressed on a linear scale from 0 (impossibility) to 1 (certainty), or equivalently as a Percentage between 0% and 100%. The analysis of events governed by probability is called Statistics.


There are several competing interpretations of the actual ``meaning'' of probabilities. Frequentists view probability simply as a measure of the long-run frequency of outcomes (the more conventional interpretation), while Bayesians treat probability more subjectively, as a degree of belief; on the Bayesian view, statistical inference endeavors to estimate the parameters of an underlying distribution based on the observed distribution.
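The contrast between the two interpretations can be illustrated with a small sketch. The biased coin, the true bias of 0.7, and the uniform Beta(1, 1) prior below are all hypothetical choices for illustration, not part of the original text: the frequentist estimate is the observed frequency of heads, while the Bayesian estimate is the posterior mean after updating the prior with the same data.

```python
import random

random.seed(42)

# Hypothetical example: a coin with true P(heads) = 0.7.
true_p = 0.7
flips = [random.random() < true_p for _ in range(10_000)]
heads = sum(flips)
n = len(flips)

# Frequentist view: probability as the long-run frequency of heads.
freq_estimate = heads / n

# Bayesian view: start from a uniform Beta(1, 1) prior on the bias and
# update it with the observed flips; the posterior is
# Beta(heads + 1, tails + 1), whose mean is (heads + 1) / (n + 2).
bayes_posterior_mean = (heads + 1) / (n + 2)

print(freq_estimate, bayes_posterior_mean)
```

With this much data the two estimates nearly coincide; they differ mainly when observations are scarce and the prior still carries weight.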


A properly normalized function which assigns a probability ``density'' to each possible outcome within some interval is called a Probability Function, and its cumulative value (integral for a continuous distribution or sum for a discrete distribution) is called a Distribution Function.
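The relation between the two functions can be checked numerically. A minimal Python sketch, using the exponential distribution with rate 2 as a hypothetical example: integrating the probability function (density) from 0 to $x$ recovers the distribution function at $x$.

```python
import math

# Hypothetical example: the exponential distribution with rate lam = 2.
lam = 2.0
pdf = lambda x: lam * math.exp(-lam * x)   # probability function (density)
cdf = lambda x: 1.0 - math.exp(-lam * x)   # distribution function

# Integrate the density with the trapezoid rule and compare with the
# closed-form distribution function.
def integrate(f, a, b, steps=100_000):
    h = (b - a) / steps
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, steps))
    return total * h

x = 1.5
print(integrate(pdf, 0.0, x), cdf(x))
```

Proper normalization means the integral over the whole range is 1, which the same routine confirms over a long enough interval.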


Probabilities are defined to obey certain assumptions, called the Probability Axioms. Let a Sample Space contain the Union ($\cup$) of all possible events $E_i$, so

\begin{displaymath}
S\equiv \bigcup_{i=1}^N E_i,
\end{displaymath} (1)

and let $E$ and $F$ denote subsets of $S$. Further, let $F'=\hbox{not-}F$ be the complement of $F$, so that
\begin{displaymath}
F\cup F'=S.
\end{displaymath} (2)

Then the set $E$ can be written as
\begin{displaymath}
E = E\cap S = E\cap(F\cup F') = (E\cap F)\cup (E\cap F'),
\end{displaymath} (3)

where $\cap$ denotes the intersection. Then
\begin{eqnarray*}
P(E) &=& P(E\cap F)+P(E\cap F')-P[(E\cap F)\cap (E\cap F')]\\
&=& P(E\cap F)+P(E\cap F')-P[(F\cap F')\cap (E\cap E)]\\
&=& P(E\cap F)+P(E\cap F')-P(\emptyset \cap E)\\
&=& P(E\cap F)+P(E\cap F')-P(\emptyset)\\
&=& P(E\cap F)+P(E\cap F'),
\end{eqnarray*} (4)

where $\emptyset$ is the Empty Set.
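As a concrete check of (4), here is a minimal Python sketch on a finite sample space. The fair six-sided die and the particular events $E$ and $F$ are hypothetical choices for illustration; exact fractions keep the arithmetic honest.

```python
from fractions import Fraction

# Hypothetical finite example: a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}
P = lambda A: Fraction(len(A & S), len(S))   # uniform probability measure

E = {2, 4, 6}          # "roll is even"
F = {1, 2, 3}          # "roll is at most 3"
F_c = S - F            # complement F' of F

# Equation (4): E splits into its parts inside and outside F.
print(P(E), P(E & F) + P(E & F_c))
```

Here $P(E\cap F)=1/6$ and $P(E\cap F')=2/6$, so both sides equal $1/2$.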


Let $P(E\vert F)$ denote the Conditional Probability of $E$ given that $F$ has already occurred. Then

\begin{eqnarray*}
P(E) &=& P(E\vert F)P(F)+P(E\vert F')P(F') \quad (5)\\
&=& P(E\vert F)P(F)+P(E\vert F')[1-P(F)] \quad (6)\\
P(A\cap B) &=& P(A)P(B\vert A) \quad (7)\\
&=& P(B)P(A\vert B) \quad (8)\\
P(A'\cap B) &=& P(A')P(B\vert A') \quad (9)\\
P(E\vert F) &=& {P(E\cap F)\over P(F)}. \quad (10)
\end{eqnarray*}
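These identities can be verified on the same kind of finite example. In the sketch below the fair die and the events $E$, $F$ are again hypothetical; conditional probability is computed directly from definition (10), and the code checks the total-probability identity (5) and the multiplication rules (7)-(8).

```python
from fractions import Fraction

# Hypothetical finite example: a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}
P = lambda A: Fraction(len(A & S), len(S))
cond = lambda A, B: P(A & B) / P(B)          # P(A | B), definition (10)

E, F = {2, 4, 6}, {1, 2, 3}
F_c = S - F

# Law of total probability, equation (5):
total = cond(E, F) * P(F) + cond(E, F_c) * P(F_c)
print(P(E) == total)

# Multiplication rule, equations (7) and (8):
print(P(E & F) == P(E) * cond(F, E) == P(F) * cond(E, F))
```

Here $P(E\vert F)=1/3$ and $P(E\vert F')=2/3$, so the weighted sum in (5) is $1/3\cdot 1/2 + 2/3\cdot 1/2 = 1/2 = P(E)$.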

A very important result states that
\begin{displaymath}
P(E\cup F)=P(E)+P(F)-P(E\cap F),
\end{displaymath} (11)

which can be generalized to
\begin{displaymath}
P\left({\,\bigcup_{i=1}^n A_i}\right)= \sum_i P(A_i)-\sum_{i<j} P(A_i\cap A_j)+\sum_{i<j<k} P(A_i\cap A_j\cap A_k)-\ldots+(-1)^{n-1}P\left({\,\bigcap_{i=1}^n A_i}\right).
\end{displaymath} (12)
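Inclusion-exclusion is easy to verify by brute force on a finite sample space. In this sketch the fair die and the three overlapping events are hypothetical choices; the right-hand side of (12) is built from alternating sums of intersections over all non-empty subsets of the events.

```python
from fractions import Fraction
from functools import reduce
from itertools import combinations

# Hypothetical example: three overlapping events on a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}
P = lambda A: Fraction(len(A & S), len(S))

events = [{1, 2, 3}, {2, 3, 4}, {3, 4, 5}]

# Left side of (12): probability of the union, computed directly.
union = set().union(*events)

# Right side of (12): alternating sums of intersection probabilities over
# all non-empty subsets of the events.
rhs = Fraction(0)
for k in range(1, len(events) + 1):
    sign = (-1) ** (k - 1)
    for subset in combinations(events, k):
        rhs += sign * P(reduce(set.intersection, subset))

print(P(union), rhs)
```

For these events the union is $\{1,2,3,4,5\}$, and both sides come out to $5/6$: the singleton terms contribute $3/2$, the pairwise intersections subtract $5/6$, and the triple intersection adds back $1/6$.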

See also Bayes' Formula, Conditional Probability, Distribution, Distribution Function, Likelihood, Probability Axioms, Probability Function, Probability Inequality, Statistics




© 1996-9 Eric W. Weisstein
1999-05-26