Abstract

In his famous 1948 paper, Shannon defined channel capacity
as the ultimate rate at which information can be
transmitted over a communication channel with an error probability
that vanishes as the blocklength tends to infinity. While this result
is of tremendous theoretical importance, practical systems look quite
different: no communication system can tolerate the infinite delay
caused by an extremely large blocklength, nor can it handle the
computational complexity of decoding such huge codewords. On the other
hand, the error probability need not be exactly zero either; a small
but finite value will suffice.

Therefore, the question arises as to what can be achieved by a
practical scheme. In particular, what is the maximal rate at which
information can be transmitted over a communication channel for a
given fixed maximum blocklength (i.e., a fixed maximum delay) if a
certain maximal probability of error is allowed? In this project, we
have started to study these questions.

Block codes with very short blocklength over the most general binary
memoryless channel, the binary asymmetric channel (BAC), are
investigated. It is shown that for only two possible messages,
flip-flop codes are optimal; however, depending on the blocklength and
the channel parameters, the optimal code is not necessarily the linear
flip-flop code. Further, it is shown that the optimal decoding rule is
a threshold rule. Some fundamental dependencies of the best code on
the channel are given.
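The threshold behaviour can be illustrated with a small sketch (an
assumption-laden illustration, not code from the report): for the
all-zero codeword and its complement over a BAC with eps0 = P(1|0) and
eps1 = P(0|1), both strictly between 0 and 1, the log-likelihood ratio
is linear in the number of ones in the received word, so ML decoding
reduces to comparing that count against a channel-dependent threshold.

```python
from math import log

def ml_decide(y, eps0, eps1):
    """Exhaustive ML decision between the all-zero and all-one codewords
    over a BAC with eps0 = P(1|0) and eps1 = P(0|1); ties go to message 0."""
    p0 = 1.0  # likelihood of y given the all-zero codeword
    p1 = 1.0  # likelihood of y given the all-one codeword
    for yi in y:
        p0 *= eps0 if yi == 1 else 1 - eps0
        p1 *= (1 - eps1) if yi == 1 else eps1
    return 0 if p0 >= p1 else 1

def threshold_decide(y, eps0, eps1):
    """Equivalent threshold rule: the log-likelihood ratio
    (n-k) * log((1-eps0)/eps1) + k * log(eps0/(1-eps1))
    is linear in k, the number of ones in y, so the decision only
    depends on whether k crosses a threshold (assumes 0 < eps0, eps1 < 1)."""
    n = len(y)
    k = sum(y)
    llr = (n - k) * log((1 - eps0) / eps1) + k * log(eps0 / (1 - eps1))
    return 0 if llr >= 0 else 1
```

For sensible channel parameters the two rules agree on every output
sequence, which is exactly the claim that ML decoding is a threshold
rule for this code.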

Block codes with a very small number of codewords are investigated for
two special binary memoryless channels: the binary symmetric channel
(BSC) and the Z-channel (ZC). The optimal (in the sense of minimum
average error probability, using maximum likelihood decoding) code
structure is derived for the cases of two, three, and four codewords
and an arbitrary blocklength. It is shown that for two possible
messages, on a BSC, the so-called flip codes of type t are optimal for
any t, while on a ZC, the flip code of type 0 is optimal. For codes
with three or four messages, it is shown that the so-called weak flip
codes of some given type are optimal, where the type depends on the
blocklength. For all these cases, an algorithm is presented that
recursively constructs an optimal code of blocklength n from an
optimal code of blocklength n-1. For two and four messages, the
optimal code is shown to be linear. For the ZC, a recursive optimal
code design is conjectured in the case of five possible messages.
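These claims can be checked numerically for small blocklengths. The
sketch below (an illustration under stated assumptions, not code from
the report) builds a flip code of type t — assumed here to consist of
a codeword with t ones followed by n-t zeros and its bitwise
complement — and computes the exact average ML error probability over
a BAC by enumerating all 2^n channel outputs; the BSC (eps0 = eps1)
and the ZC (eps0 = 0) are special cases.

```python
from itertools import product

def flip_code(n, t=0):
    """Flip code of type t and blocklength n: a codeword with t ones
    followed by n-t zeros, plus its bitwise complement (this layout is
    an assumption based on the description in the text)."""
    x = [1] * t + [0] * (n - t)
    return [x, [1 - b for b in x]]

def avg_error_prob(code, eps0, eps1):
    """Exact average error probability of ML decoding over a BAC with
    eps0 = P(1|0) and eps1 = P(0|1), assuming equiprobable messages,
    computed by enumerating all 2^n output sequences."""
    n = len(code[0])
    M = len(code)
    p_correct = 0.0
    for y in product((0, 1), repeat=n):
        likelihoods = []
        for x in code:
            p = 1.0
            for xi, yi in zip(x, y):
                if xi == 0:
                    p *= eps0 if yi == 1 else 1 - eps0
                else:
                    p *= eps1 if yi == 0 else 1 - eps1
            likelihoods.append(p)
        # The ML decoder picks the most likely message, so output y is
        # decoded correctly with probability max(likelihoods) / M.
        p_correct += max(likelihoods) / M
    return 1.0 - p_correct
```

As a sanity check: on a BSC with crossover probability 0.1 and n = 3,
the flip code of type 0 (all-zero and all-one codewords) fails exactly
when two or three bits flip, i.e. 3(0.1)^2(0.9) + (0.1)^3 = 0.028; on
a ZC (eps0 = 0), an error can only occur when the all-one codeword is
received as all zeros, giving eps1^3 / 2.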

The derivation of these optimal codes relies heavily on a new approach
that constructs and analyzes the code matrix not row-wise (codeword by
codeword), but column-wise. Moreover, these results also prove that
the minimum Hamming distance can be the wrong design criterion for
optimal codes, even for highly symmetric channels like the BSC.


Keywords

Channel capacity, binary asymmetric channel (BAC), error probability,
finite blocklength, maximum likelihood (ML) decoding, optimal codes,
Z-channel (ZC).



-||-   _|_ _|_     /    __|__   Stefan M. Moser
[-]     --__|__   /__\    /__   Senior Scientist, ETH Zurich, Switzerland
_|_     -- --|-    _     /  /   Adjunct Professor, National Yang Ming Chiao Tung University, Taiwan
/ \     []  \|    |_|   / \/    Web: https://moser-isi.ethz.ch/


Last modified: Tue Jan 31 17:24:31 UTC+8 2012