
In coding theory and information theory, a Z-channel or binary asymmetric channel is a communications channel used to model the behaviour of some data storage systems.
Definition
A Z-channel is a channel with binary input and binary output, where each 0 bit is transmitted correctly, but each 1 bit has probability ''p'' of being transmitted incorrectly as a 0, and probability 1 − ''p'' of being transmitted correctly as a 1. In other words, if ''X'' and ''Y'' are the random variables describing the probability distributions of the input and the output of the channel, respectively, then the crossovers of the channel are characterized by the conditional probabilities:
:<math>\begin{align}
\operatorname{Pr}[Y=0 \mid X=0] &= 1 \\
\operatorname{Pr}[Y=0 \mid X=1] &= p \\
\operatorname{Pr}[Y=1 \mid X=0] &= 0 \\
\operatorname{Pr}[Y=1 \mid X=1] &= 1 - p
\end{align}</math>
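As an illustration of this behaviour, here is a minimal Monte Carlo sketch of a Z-channel (the function name <code>z_channel</code> and the use of NumPy are choices made for this example, not notation from the article):

<syntaxhighlight lang="python">
import numpy as np

def z_channel(bits, p, rng=None):
    """Send an array of 0/1 bits through a Z-channel with 1 -> 0 crossover probability p.

    0s are always delivered correctly; each 1 is independently received as a 0
    with probability p and as a 1 with probability 1 - p.
    """
    rng = np.random.default_rng() if rng is None else rng
    bits = np.asarray(bits)
    flipped = (bits == 1) & (rng.random(bits.shape) < p)
    return np.where(flipped, 0, bits)

# Send a long run of 1s and check that roughly a fraction p of them arrive as 0s.
rng = np.random.default_rng(0)
received = z_channel(np.ones(100_000, dtype=int), p=0.2, rng=rng)
print(1 - received.mean())  # empirical 1 -> 0 crossover rate, close to 0.2
</syntaxhighlight>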
Capacity
The channel capacity <math>\mathsf{cap}(\mathsf{Z})</math> of the Z-channel <math>\mathsf{Z}</math> with the crossover 1 → 0 probability ''p'' is obtained by maximizing the mutual information between the input and the output over all input distributions. When the input random variable ''X'' is distributed according to the Bernoulli distribution with probability ''α'' for the occurrence of 0, the mutual information equals <math>\mathsf{H}((1-\alpha)(1-p)) - (1-\alpha)\mathsf{H}(p)</math>, and the capacity is given by the following equation:
:<math>\mathsf{cap}(\mathsf{Z}) = \max_{\alpha}\left\{ \mathsf{H}\bigl((1-\alpha)(1-p)\bigr) - (1-\alpha)\mathsf{H}(p) \right\} = \log_2\!\left(1 + (1-p)\,p^{p/(1-p)}\right) = \log_2\!\left(1 + 2^{-\mathsf{s}(p)}\right),</math>
where <math>\mathsf{s}(p) = \frac{\mathsf{H}(p)}{1-p}</math> for the binary entropy function <math>\mathsf{H}(\cdot)</math>.
This capacity is obtained when the input variable ''X'' has Bernoulli distribution with probability <math>\alpha^*</math> of having value 0 and <math>1 - \alpha^*</math> of value 1, where:
:<math>1 - \alpha^* = \frac{1}{(1-p)\left(1 + 2^{\mathsf{H}(p)/(1-p)}\right)} = \frac{1}{(1-p)\left(1 + 2^{\mathsf{s}(p)}\right)}.</math>
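As a numerical sanity check (a sketch only; the helper names <code>H</code>, <code>z_capacity</code> and <code>z_capacity_numeric</code> are introduced here for illustration), the closed form above can be compared against a brute-force maximization of the mutual information over ''α'':

<syntaxhighlight lang="python">
import numpy as np

def H(q):
    """Binary entropy function in bits, with the convention H(0) = H(1) = 0."""
    q = np.clip(q, 1e-15, 1 - 1e-15)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

def z_capacity(p):
    """Closed-form Z-channel capacity: log2(1 + (1 - p) * p**(p / (1 - p)))."""
    return np.log2(1 + (1 - p) * p ** (p / (1 - p)))

def z_capacity_numeric(p, grid=200_001):
    """Maximize I(X;Y) = H((1 - a)(1 - p)) - (1 - a) H(p) over a = Pr[X = 0]."""
    a = np.linspace(0.0, 1.0, grid)
    mi = H((1 - a) * (1 - p)) - (1 - a) * H(p)
    k = mi.argmax()
    return mi[k], a[k]

p = 0.5
cap_num, a_star = z_capacity_numeric(p)
print(z_capacity(p), cap_num)                            # both ~0.3219 bits per channel use
print(a_star)                                            # optimal Pr[X = 0], ~0.6
print(1 - 1 / ((1 - p) * (1 + 2 ** (H(p) / (1 - p)))))   # closed-form alpha*, also ~0.6
</syntaxhighlight>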
For small ''p'', the capacity is approximated by
:<math>\mathsf{cap}(\mathsf{Z}) \approx 1 - \tfrac{1}{2}\mathsf{H}(p),</math>
as compared to the capacity <math>1 - \mathsf{H}(p)</math> of the binary symmetric channel with crossover probability ''p''.
For any ''p'' > 0, <math>\alpha^* > \tfrac{1}{2}</math> (i.e. more 0s should be transmitted than 1s), because transmitting a 1 introduces noise. As ''p'' approaches 1, the limiting value of <math>\alpha^*</math> is <math>1 - \tfrac{1}{e} \approx 0.63</math>.
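To make the comparison concrete, the following sketch evaluates the exact capacity, the small-''p'' approximation, and the binary symmetric channel capacity for a few values of ''p'':

<syntaxhighlight lang="python">
import numpy as np

def H(q):
    """Binary entropy function in bits."""
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

for p in (0.01, 0.05, 0.1):
    z_exact = np.log2(1 + (1 - p) * p ** (p / (1 - p)))  # Z-channel capacity
    z_approx = 1 - H(p) / 2                               # small-p approximation
    bsc = 1 - H(p)                                        # binary symmetric channel capacity
    print(f"p={p}: Z exact={z_exact:.4f}  Z approx={z_approx:.4f}  BSC={bsc:.4f}")
</syntaxhighlight>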
Bounds on the size of an asymmetric-error-correcting code
Define the following distance function <math>\mathsf{d}_A(\mathbf{x}, \mathbf{y})</math> on the words <math>\mathbf{x}, \mathbf{y} \in \{0,1\}^n</math> of length ''n'' transmitted via a Z-channel:
:<math>\mathsf{d}_A(\mathbf{x}, \mathbf{y}) = \max\left\{ \left|\{ i : x_i = 0,\, y_i = 1 \}\right|,\ \left|\{ i : x_i = 1,\, y_i = 0 \}\right| \right\}.</math>
Define the sphere <math>V_t(\mathbf{x})</math> of radius ''t'' around a word <math>\mathbf{x} \in \{0,1\}^n</math> of length ''n'' as the set of all the words at distance ''t'' or less from <math>\mathbf{x}</math>, in other words,
:<math>V_t(\mathbf{x}) = \left\{ \mathbf{y} \in \{0,1\}^n : \mathsf{d}_A(\mathbf{x}, \mathbf{y}) \le t \right\}.</math>
A code <math>\mathcal{C}</math> of length ''n'' is said to be ''t''-asymmetric-error-correcting if for any two distinct codewords <math>\mathbf{c} \ne \mathbf{c}' \in \mathcal{C}</math>, one has <math>\mathsf{d}_A(\mathbf{c}, \mathbf{c}') \ge t + 1</math>. Denote by <math>M(n,t)</math> the maximum number of codewords in a ''t''-asymmetric-error-correcting code of length ''n''.
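The distance and the error-correction test translate directly into code. The sketch below (the function names <code>asymmetric_distance</code> and <code>is_t_asymmetric_error_correcting</code> are chosen here for illustration) checks the pairwise condition d_A(c, c′) ≥ t + 1 on a small example:

<syntaxhighlight lang="python">
from itertools import combinations

def asymmetric_distance(x, y):
    """d_A(x, y) = max(#{i : x_i = 0, y_i = 1}, #{i : x_i = 1, y_i = 0})."""
    n01 = sum(1 for a, b in zip(x, y) if a == 0 and b == 1)
    n10 = sum(1 for a, b in zip(x, y) if a == 1 and b == 0)
    return max(n01, n10)

def is_t_asymmetric_error_correcting(code, t):
    """True if every pair of distinct codewords is at asymmetric distance at least t + 1."""
    return all(asymmetric_distance(c, d) >= t + 1 for c, d in combinations(code, 2))

# The length-3 repetition code {000, 111} can correct a single 1 -> 0 error ...
print(asymmetric_distance((1, 1, 1), (0, 0, 0)))                      # 3
print(is_t_asymmetric_error_correcting([(0, 0, 0), (1, 1, 1)], t=1))  # True
# ... whereas 000 and 100 are at asymmetric distance 1, so this pair cannot.
print(is_t_asymmetric_error_correcting([(0, 0, 0), (1, 0, 0)], t=1))  # False
</syntaxhighlight>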
The Varshamov bound. For ''n'' ≥ 1 and ''t'' ≥ 1,
:<math>M(n,t) \le \frac{2^{n+1}}{\sum_{j=0}^{t}\left[\binom{\lfloor n/2 \rfloor}{j} + \binom{\lceil n/2 \rceil}{j}\right]}.</math>
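A small helper for evaluating this bound numerically (a sketch; the function name <code>varshamov_bound</code> is chosen here):

<syntaxhighlight lang="python">
from math import comb

def varshamov_bound(n, t):
    """Evaluate 2**(n+1) / sum_{j=0}^{t} [C(floor(n/2), j) + C(ceil(n/2), j)];
    M(n, t) is at most the floor of this value."""
    denom = sum(comb(n // 2, j) + comb((n + 1) // 2, j) for j in range(t + 1))
    return 2 ** (n + 1) / denom

for n in (4, 6, 8):
    print(n, varshamov_bound(n, 1))  # e.g. n = 4, t = 1: 32 / 6 ≈ 5.33, so M(4, 1) <= 5
</syntaxhighlight>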
The constant-weight code bound. For ''n'' > 2''t'' ≥ 2, let the sequence <math>B_0, B_1, \ldots, B_{n-2t-1}</math> be defined as
:<math>B_0 = 2, \qquad B_i = \min_{0 \le j < i}\left\{ B_j + A(n+t+i-j-1,\, 2t+2,\, t+i) \right\}</math>
for <math>i = 1, 2, \ldots, n - 2t - 1</math>, where <math>A(n, d, w)</math> denotes the maximum number of codewords in a binary constant-weight code of length ''n'', minimum Hamming distance ''d'' and constant weight ''w''. Then
:<math>M(n,t) \le B_{n-2t-1}.</math>