
# Entropy rate

Publisher: World Heritage Encyclopedia


In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate $H(X)$ is the limit of the joint entropy of $n$ members of the process $X_k$, divided by $n$, as $n$ tends to infinity:

$$H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)$$

when the limit exists. An alternative, related quantity is:

$$H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)$$

For strongly stationary stochastic processes, $H(X) = H'(X)$. The entropy rate can be thought of as a general property of stochastic sources; for stationary ergodic sources, the statement that $-\frac{1}{n}\log p(X_1, \dots, X_n)$ converges to the entropy rate is the asymptotic equipartition property.
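The limit defining $H(X)$ can be checked numerically in a case where the joint distribution is easy to write down. The sketch below (with a hypothetical Bernoulli parameter `p = 0.3`) enumerates the joint pmf of $n$ i.i.d. draws and confirms that the block entropy per symbol is the single-symbol entropy for every $n$:

```python
from itertools import product
from math import log2, prod

def entropy(pmf):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in pmf if p > 0)

def iid_block_pmf(marginal, n):
    """Joint pmf of n independent draws from `marginal`:
    each outcome's probability is the product of its symbols'."""
    return [prod(probs) for probs in product(marginal, repeat=n)]

p = 0.3  # hypothetical Bernoulli parameter, chosen only for illustration
marginal = [p, 1 - p]

for n in (1, 4, 8):
    rate_n = entropy(iid_block_pmf(marginal, n)) / n
    # for an i.i.d. source, H(X_1,...,X_n)/n = H(X_1) exactly
```

For dependent processes the per-symbol block entropy is not constant in $n$, and the limit is what the definition above captures.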

## Entropy rates for Markov chains

Since a stochastic process defined by an irreducible, aperiodic Markov chain has a unique stationary distribution, the entropy rate is independent of the initial distribution.

For example, for such a Markov chain $Y_k$ defined on a countable number of states, with transition matrix $P_{ij}$, $H(Y)$ is given by:

$$H(Y) = - \sum_{ij} \mu_i P_{ij} \log P_{ij}$$

where $\mu_i$ is the stationary distribution of the chain.
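The formula above translates directly into code: approximate $\mu$ by power iteration, then sum $-\mu_i P_{ij} \log P_{ij}$ over all transitions with nonzero probability. A minimal sketch, using a hypothetical two-state chain (base-2 logarithm, so the rate is in bits):

```python
from math import log2

def stationary(P, iters=1000):
    """Approximate the stationary distribution mu by power iteration:
    repeatedly apply the transition matrix to a uniform start."""
    n = len(P)
    mu = [1.0 / n] * n
    for _ in range(iters):
        mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    return mu

def markov_entropy_rate(P):
    """Entropy rate in bits: -sum_ij mu_i P_ij log2(P_ij)."""
    mu = stationary(P)
    n = len(P)
    return -sum(mu[i] * P[i][j] * log2(P[i][j])
                for i in range(n) for j in range(n) if P[i][j] > 0)

# hypothetical 2-state chain: state 0 is "sticky" (stays put w.p. 0.9)
P = [[0.9, 0.1],
     [0.5, 0.5]]
```

For this chain the stationary distribution is $\mu = (5/6, 1/6)$, and the rate is the $\mu$-weighted average of the row entropies of $P$.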

A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.