On the capacity of Markov sources over noisy channels

A. Kavcic - GLOBECOM'01: IEEE Global Telecommunications Conference, 2001 - ieeexplore.ieee.org
We present an expectation-maximization method for optimizing Markov process transition probabilities to increase the mutual information rate achievable when the Markov process is transmitted over a noisy finite-state machine channel. The method provides a tight lower bound on the achievable information rate of a Markov process over a noisy channel, and it is conjectured that it in fact maximizes this information rate. The conjecture is supported by empirical evidence (not shown in this paper) obtained through brute-force optimization of low-order Markov processes. The proposed expectation-maximization procedure can be used to find tight lower bounds on the capacities of finite-state machine channels (e.g., partial-response channels) or on the noisy capacities of constrained (e.g., run-length-limited) sequences, with the bounds becoming arbitrarily tight as the memory length of the input Markov process approaches infinity. The method links the Arimoto-Blahut algorithm to Shannon's noise-free entropy maximization by introducing the noisy adjacency matrix.
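The noise-free endpoint the abstract refers to is classical: for a constraint graph with adjacency matrix A, Shannon's result gives the capacity as log2 of the Perron root of A, and the entropy-maximizing Markov chain has transition probabilities P_ij = A_ij v_j / (lambda v_i), where v is the right Perron eigenvector. The following is a minimal sketch of that baseline, not the paper's algorithm; the (1,3) run-length-limited constraint graph and its state labeling are assumed here as a standard illustrative example.

```python
import numpy as np

# (d, k) = (1, 3) run-length-limited constraint; state i counts the
# zeros emitted since the last one (standard RLL state labeling).
A = np.array([
    [0., 1., 0., 0.],  # state 0: must emit a 0 (at least d = 1 zero)
    [1., 0., 1., 0.],  # state 1: emit 1 -> state 0, or 0 -> state 2
    [1., 0., 0., 1.],  # state 2: emit 1 -> state 0, or 0 -> state 3
    [1., 0., 0., 0.],  # state 3: must emit a 1 (at most k = 3 zeros)
])

eigvals, eigvecs = np.linalg.eig(A)
idx = np.argmax(eigvals.real)            # Perron root is real and largest
lam = eigvals[idx].real
v = np.abs(eigvecs[:, idx].real)         # right Perron eigenvector (positive)

# Maxentropic transition probabilities: P_ij = A_ij * v_j / (lam * v_i)
P = A * v[np.newaxis, :] / (lam * v[:, np.newaxis])

print(f"noise-free capacity log2(lambda) = {np.log2(lam):.4f} bits/symbol")
print("maxentropic transition matrix:")
print(np.round(P, 4))
assert np.allclose(P.sum(axis=1), 1.0)   # each row is a distribution
```

The row-stochasticity check follows from (A v)_i = lambda v_i, so the sketch's P is a valid Markov chain achieving the constraint's noise-free entropy rate.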
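The other endpoint is the Arimoto-Blahut algorithm, which the paper's procedure generalizes from memoryless inputs to Markov inputs over finite-state machine channels. Below is a minimal sketch of the classical Blahut-Arimoto fixed-point iteration for a discrete memoryless channel; the function name, the tolerance, and the binary symmetric channel example are assumptions for illustration, not the paper's construction.

```python
import numpy as np

def blahut_arimoto(W, iters=200, tol=1e-12):
    """Capacity of a DMC with transition matrix W[x, y] = P(y | x)."""
    n_x = W.shape[0]
    p = np.full(n_x, 1.0 / n_x)          # input distribution, start uniform
    for _ in range(iters):
        r = p @ W                        # induced output distribution r(y)
        # d[x] = D( W(.|x) || r ) in bits, with the 0 log 0 = 0 convention
        with np.errstate(divide="ignore", invalid="ignore"):
            d = np.where(W > 0, W * np.log2(W / r), 0.0).sum(axis=1)
        c = 2.0 ** d
        p_new = p * c / (p @ c)          # multiplicative fixed-point update
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    r = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        d = np.where(W > 0, W * np.log2(W / r), 0.0).sum(axis=1)
    return p @ d, p                      # mutual information at fixed point

# Binary symmetric channel, crossover 0.1: capacity = 1 - h(0.1) ~ 0.531
eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
C, p_opt = blahut_arimoto(W)
print(f"capacity ~ {C:.4f} bits/use, optimal input distribution {p_opt}")
```

The abstract's contribution can be read against this sketch: instead of reweighting a memoryless input distribution, the proposed expectation-maximization step reweights Markov transition probabilities, with the noisy adjacency matrix playing the role that the plain adjacency matrix plays in the noise-free case above.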