The Comparison between Arbitrary Information Sources and Nonhomogeneous Markov Information Sources and the Small Deviations Theorems
Wen LIU(1), WeiGuo YANG(2)
Acta Mathematica Sinica, Chinese Series
1997, 40 (1): 22-36.
DOI: 10.12386/A1997sxxb0004
Let {X_n, n ≥ 0} be a sequence of measurable functions taking values in the alphabet S = {1, 2, …, N}. Let P, Q be two probability measures on the measurable space such that {X_n, n ≥ 0} is Markovian under Q. Let h(P\Q) = limsup_{n→∞} (1/n) log[P(X_0, …, X_n)/Q(X_0, …, X_n)] be the sample divergence-rate distance of P relative to Q. In this paper, a class of small deviations theorems for the averages of functions of two variables of an arbitrary information source is established by using the concept h(P\Q), and, as a corollary, a small deviations theorem for the entropy densities of arbitrary information sources is obtained. Finally, an extension of the Shannon-McMillan theorem to the case of nonhomogeneous Markov information sources is given.
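The quantity h(P\Q) can be illustrated numerically. The sketch below (a hypothetical example, not from the paper) simulates a sample path from a Markov measure P and evaluates the normalized log-likelihood ratio (1/n) log[P(X_0, …, X_n)/Q(X_0, …, X_n)] against a second Markov measure Q; the transition matrices and initial distribution are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transition matrices on the alphabet S = {0, 1}
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # the "true" measure generating the data
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])   # the reference Markov measure
pi0 = np.array([0.5, 0.5])   # common initial distribution

def sample_chain(T, pi0, n):
    """Draw X_0, ..., X_n from the Markov measure with transition matrix T."""
    x = [rng.choice(2, p=pi0)]
    for _ in range(n):
        x.append(rng.choice(2, p=T[x[-1]]))
    return x

def log_ratio_rate(x, P, Q, pi0):
    """(1/n) log [P(X_0,...,X_n) / Q(X_0,...,X_n)] for two Markov measures."""
    n = len(x) - 1
    s = 0.0  # initial distributions coincide, so their log-ratio is 0
    for a, b in zip(x, x[1:]):
        s += np.log(P[a, b]) - np.log(Q[a, b])
    return s / n

x = sample_chain(P, pi0, 20_000)
print(log_ratio_rate(x, P, Q, pi0))
```

For long paths this average settles near the stationary per-step Kullback-Leibler divergence of P from Q (about 0.31 nats for the matrices above), giving a concrete sense of the divergence-rate distance the paper's small deviations theorems are stated in terms of.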