A Limit Theorem from Information Theory
How many distinct sequences can be created by rearranging $n$ binary symbols? If the binary symbols are, say, 0 and 1, and if the fraction of 1's is denoted $p$, then the number of possible sequences is $\binom{n}{pn}$. When $p$ is zero or one there is only one possible sequence for any value of $n$, but when $0 < p < 1$, the number of possible sequences increases exponentially with $n$. The logarithm of the number of possible sequences, expressed on a per-symbol basis, is $\frac{1}{n}\log_2\binom{n}{pn}$, and the limit $f(p) = -p\log_2 p - (1-p)\log_2(1-p)$ can be interpreted as the average number of bits needed per symbol to describe a long binary sequence with symbol probabilities $p$ and $1-p$. This Demonstration shows $\frac{1}{n}\log_2\binom{n}{pn}$ for increasing $n$ together with the limit $f(p)$.
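The convergence the Demonstration plots can be checked numerically; here is a minimal Python sketch (function names are illustrative, not part of the Demonstration):

```python
from math import comb, log2

def per_symbol_bits(n, p):
    """(1/n) * log2 C(n, pn): bits per symbol needed to index
    one sequence among all rearrangements with a fraction p of 1's."""
    k = round(p * n)  # number of 1's in a length-n sequence
    return log2(comb(n, k)) / n

def binary_entropy(p):
    """The limit f(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # only one possible sequence, so zero bits per symbol
    return -p * log2(p) - (1 - p) * log2(1 - p)

# The per-symbol count approaches the entropy limit as n grows.
for n in (10, 100, 1000, 10000):
    print(n, per_symbol_bits(n, 0.3), binary_entropy(0.3))
```

For $p = 0.3$ the gap between $\frac{1}{n}\log_2\binom{n}{pn}$ and $f(p) \approx 0.881$ shrinks roughly like $\frac{\log_2 n}{2n}$, the correction term from Stirling's approximation.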
With $p = k/n$, where $k$ is the number of 1's, and $0 < p < 1$, the limit is

\[
f(p) \;\equiv\; \lim_{n\to\infty}\frac{1}{n}\log_2\binom{n}{pn}
\;=\; -p\log_2 p \;-\; (1-p)\log_2(1-p).
\]