WOLFRAM|DEMONSTRATIONS PROJECT

Expectation Maximization for Gaussian Mixture Distributions

Controls (snapshot settings):
first distribution: μ1 = 8, σ1 = 5
second distribution: μ2 = -12, σ2 = 3
probability: p = 0.3
number of samples: 100
number of iterations of EM algorithm: 3
random seed: 326
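As an illustration only, here is a minimal Wolfram Language sketch of how a sample with these control settings could be generated; the variable names (mu1, sigma1, data, and so on) are illustrative assumptions, not the Demonstration's actual source code.

    (* sketch: draw each point from the first Gaussian with probability p,
       otherwise from the second; values taken from the controls above *)
    SeedRandom[326];
    {mu1, sigma1, mu2, sigma2, p, n} = {8, 5, -12, 3, 0.3, 100};
    data = Table[
       If[RandomReal[] < p,
        RandomVariate[NormalDistribution[mu1, sigma1]],
        RandomVariate[NormalDistribution[mu2, sigma2]]],
       {n}];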
Estimated mixture parameters (probability of drawing from each distribution in the left column):

probability        μ            σ
0.763407       -12.2123      3.0401
0.236593         7.6106      5.90646
This Demonstration shows an implementation of the expectation-maximization (EM) algorithm for Gaussian mixture distributions. Given a sample dataset drawn from two Gaussians, where each data point comes from the first Gaussian distribution with probability p and from the second Gaussian with probability 1 - p, the EM algorithm iteratively computes maximum-likelihood estimates of the mean and variance of each Gaussian distribution and of the mixing parameter p. The plot shows the random sample from the distributions together with the final estimates of the distributions. The table lists the estimated parameters of each distribution, with the probability of drawing from that distribution in the left column.
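For reference, a minimal sketch of one EM update for this two-component mixture, continuing from the sample data generated above; the function name emStep and the starting guess are illustrative assumptions, not the Demonstration's implementation.

    (* E-step: responsibility of the first component for each point;
       M-step: re-estimate mixing probability, means, and standard deviations *)
    emStep[{pEst_, m1_, s1_, m2_, s2_}] :=
      Module[{w1, w2, r, newM1, newM2, newS1, newS2},
       w1 = pEst PDF[NormalDistribution[m1, s1], data];
       w2 = (1 - pEst) PDF[NormalDistribution[m2, s2], data];
       r = w1/(w1 + w2);                         (* E-step: responsibilities *)
       newM1 = r.data/Total[r];                  (* M-step: weighted means *)
       newM2 = (1 - r).data/Total[1 - r];
       newS1 = Sqrt[r.((data - newM1)^2)/Total[r]];
       newS2 = Sqrt[(1 - r).((data - newM2)^2)/Total[1 - r]];
       {Mean[r], newM1, newS1, newM2, newS2}];

    (* run 3 iterations, matching the control above, from a rough starting guess *)
    Nest[emStep, {0.5, Max[data], 1., Min[data], 1.}, 3]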