Some information-theoretic aspects of uniquely decodable codes

VC da Rocha Jr - this volume, 2000 - researchgate.net
ABSTRACT A source code C containing K codewords with lengths w_i, 1 ≤ i ≤ K, is
considered, where K is a non-negative integer. The capacity C of a source code is …
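
The snippet breaks off before defining the capacity of a source code, but the codeword lengths w_i of any uniquely decodable code must satisfy the Kraft–McMillan inequality. A minimal sketch of that standard check (the function names are my own, not taken from the paper):

```python
# Kraft-McMillan check: any uniquely decodable D-ary code with codeword
# lengths w_1, ..., w_K satisfies sum_i D**(-w_i) <= 1 (McMillan's theorem);
# conversely, lengths meeting the bound admit a prefix code (Kraft).

def kraft_mcmillan_sum(lengths, D=2):
    """Sum of D**(-w) over the given codeword lengths."""
    return sum(D ** (-w) for w in lengths)

def may_be_uniquely_decodable(lengths, D=2):
    """Necessary (not sufficient) condition for unique decodability."""
    return kraft_mcmillan_sum(lengths, D) <= 1.0

print(kraft_mcmillan_sum([1, 2, 3, 3]))      # 1.0 -> admissible lengths
print(may_be_uniquely_decodable([1, 1, 2]))  # False: no UD code has these lengths
```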

On unique decodability, McMillan's theorem and the expected length of codes

M Dalai, R Leonardi - Technical Report RT 200801-58, Department of …, 2008 - iris.unibs.it
In this paper, we propose a revisitation of the topic of unique decodability and of some of the
related fundamental theorems. It is widely believed that, for any discrete source X, every …

On unique decodability

M Dalai, R Leonardi - IEEE transactions on information theory, 2008 - ieeexplore.ieee.org
In this paper, we propose a revisitation of the topic of unique decodability and of some
fundamental theorems of lossless coding. It is widely believed that, for any discrete source X …
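
Whether a given finite code is uniquely decodable can be decided with the classical Sardinas–Patterson test; the sketch below is that standard textbook algorithm, offered for context, and is not the construction studied by Dalai and Leonardi.

```python
# Sardinas-Patterson test: decides unique decodability of a finite code.

def is_uniquely_decodable(code):
    """code: a set of distinct, nonempty codeword strings."""
    code = set(code)
    # S1: dangling suffixes obtained from pairs of distinct codewords.
    suffixes = {v[len(u):] for u in code for v in code
                if u != v and v.startswith(u)}
    seen = set()
    while suffixes:
        if suffixes & code:          # a dangling suffix equals a codeword
            return False
        seen |= suffixes
        nxt = set()
        for s in suffixes:
            for c in code:
                if c.startswith(s):
                    nxt.add(c[len(s):])
                if s.startswith(c):
                    nxt.add(s[len(c):])
        nxt.discard("")              # defensive; covered by the check above
        suffixes = nxt - seen
    return True

print(is_uniquely_decodable({"0", "01", "11"}))  # True  (a suffix-free code)
print(is_uniquely_decodable({"0", "01", "10"}))  # False ("010" has two parsings)
```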

New upper bounds on the rate of a code via the Delsarte-MacWilliams inequalities

R McEliece, E Rodemich, H Rumsey… - IEEE transactions on …, 1977 - ieeexplore.ieee.org
IEEE Transactions on Information Theory, vol. IT-23, no. 2, March 1977, p. 157 …

Once more on the word-length of optimal source codes

G Longo, T Nemetz - Trans. of the Ninth Prague Conf. on …, 1983 - books.google.com
The source coding theorem assigns an operational meaning to the entropy, i.e., to the
expectation of the self-information. It states, roughly speaking, that the entropy of a discrete …
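
As a reminder of the quantity involved, the entropy is exactly the expectation of the self-information -log2 p(x). A small sketch with an illustrative dyadic distribution of my own choosing:

```python
import math

def self_information(p):
    """Self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Entropy = expected self-information of the source."""
    return sum(p * self_information(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # dyadic, so an optimal code meets H exactly
print(entropy(probs))               # 1.75 bits per source symbol
```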

On Variable to Fixed length codes, Source Coding and Rényi's Entropy

V Aggarwal, RK Bansal - B. Tech. Project Report, Department of …, 2005 - researchgate.net
The Shannon information measure has a very concrete operational interpretation: it roughly
equals the minimum number of binary digits needed, on the average, to encode the …
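
For context, the Rényi entropy of order α generalizes the Shannon measure and recovers it in the limit α → 1. A generic sketch (the distribution is illustrative, not from the report):

```python
import math

def renyi_entropy(probs, alpha):
    """Renyi entropy of order alpha in bits; alpha = 1 gives the Shannon entropy."""
    if alpha == 1.0:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

probs = [0.5, 0.25, 0.25]
for a in (0.5, 1.0, 2.0):
    print(a, renyi_entropy(probs, a))   # non-increasing in alpha for a fixed distribution
```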

How suboptimal is the Shannon code?

H Narimani, M Khosravifard… - IEEE transactions on …, 2012 - ieeexplore.ieee.org
In order to determine how suboptimal the Shannon code is, one should compare its
performance with that of the optimal code, i.e., the corresponding Huffman code, in some …
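
One way to make such a comparison concrete is to compute the average length of the Shannon code, which assigns lengths ⌈-log2 p_i⌉, against the Huffman optimum for the same distribution. The sketch below uses an arbitrary example distribution and does not reproduce the bounds derived in the paper:

```python
# Average codeword length of the Shannon code (lengths ceil(-log2 p_i))
# versus the optimal Huffman code, for one illustrative distribution.
import heapq
import math

def shannon_avg_length(probs):
    return sum(p * math.ceil(-math.log2(p)) for p in probs)

def huffman_avg_length(probs):
    # Average length = sum of the combined probabilities over all merges.
    heap = [(p, 0.0) for p in probs]      # (subtree probability, accumulated cost)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, c1 = heapq.heappop(heap)
        p2, c2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, c1 + c2 + p1 + p2))
    return heap[0][1]

probs = [0.49, 0.25, 0.13, 0.13]
print(shannon_avg_length(probs))   # ~2.26 bits/symbol
print(huffman_avg_length(probs))   # ~1.77 bits/symbol: Huffman is never worse
```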

On the average codeword length of optimal binary codes for extended sources (Corresp.)

B Montgomery, B Kumar - IEEE transactions on information …, 1987 - ieeexplore.ieee.org
Although optimal binary source coding using symbol blocks of increasing length must
eventually yield a code having an average codeword length arbitrarily close to the source …
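
The convergence referred to here can be observed numerically by Huffman-coding the n-th extension of a memoryless source and dividing the average codeword length by n. The sketch below only illustrates that general fact with a toy source of my own; it does not reproduce the paper's specific results:

```python
# Average Huffman codeword length per source symbol for the n-th extension
# of a memoryless source, compared with the entropy H(X).
import heapq
import itertools
import math

def huffman_avg_length(probs):
    heap = [(p, 0.0) for p in probs]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, c1 = heapq.heappop(heap)
        p2, c2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, c1 + c2 + p1 + p2))
    return heap[0][1]

source = [0.9, 0.1]                                    # a skewed binary source
H = -sum(p * math.log2(p) for p in source)             # ~0.469 bits/symbol
for n in (1, 2, 4):
    block_probs = [math.prod(t) for t in itertools.product(source, repeat=n)]
    print(n, huffman_avg_length(block_probs) / n)      # 1.0, 0.645, ~0.49 -> approaching H
```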

On one-to-one codes for memoryless cost channels

SA Savari - IEEE transactions on information theory, 2008 - ieeexplore.ieee.org
A new twist to the classical problem of compressing a discrete source over noiseless
channel alphabets where different code symbols may have different transmission costs is …

Development of two new mean codeword lengths

O Parkash, P Kakkar - Information Sciences, 2012 - Elsevier
Two new mean codeword lengths L(α, β) and L(β) are defined and it is shown that these
lengths satisfy desirable properties as a measure of typical codeword lengths. Consequently …