Some definitions:
assumptions about transmission channel: binary digits are sent one at a time, and no digits are lost or added in transmission, so a word of length n is received as a word of length n.
properties of binary symmetric channel (BSC): each digit is received correctly with probability p (the reliability, 1/2 < p < 1) and flipped with probability 1-p, independently of the other digits.
information rate of code (length n) = $\frac{1}{n} \log_{2} |C|$
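e.g. the length-3 code C = {000, 011, 101, 110} has |C| = 4, so its information rate is $\frac{1}{3} \log_{2} 4 = \frac{2}{3}$.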
Let $\phi_{p} (v,w)$ be the probability that, if word v is sent over a BSC with reliability p, word w is received.
$\phi_{p} (v, w) = p^{n-d} (1-p)^d$ if v and w disagree in exactly d positions.
if v₁ and w disagree in d₁ positions, and v₂ and w in d₂, then (for 1/2 < p < 1) $\phi_{p} (v_{1}, w) \leq \phi_{p} (v_{2}, w)$ iff d₁ ≥ d₂.
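A minimal Python sketch of the formula above (the function name `phi_p` is just illustrative), checking that fewer disagreements means a more probable received word when 1/2 < p < 1:

```python
def phi_p(p, n, d):
    """Probability of receiving a fixed word that disagrees with the sent word
    in exactly d of its n positions, over a BSC with reliability p."""
    return p ** (n - d) * (1 - p) ** d

# with p = 0.9 and n = 5: disagreeing in 1 position is more likely than in 2
print(phi_p(0.9, 5, 1))  # 0.9^4 * 0.1   ≈ 0.0656
print(phi_p(0.9, 5, 2))  # 0.9^3 * 0.1^2 ≈ 0.0073
```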
K = {0, 1}, $K^{n}$ = set of all binary words of length n
(Hamming) weight: number of ones
(Hamming) distance: number of differing digits between words
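A quick Python sketch of these two definitions (words as strings of '0'/'1'; the function names are mine):

```python
def weight(v):
    """Hamming weight: number of ones in v."""
    return v.count("1")

def distance(v, w):
    """Hamming distance: number of positions in which v and w differ."""
    return sum(a != b for a, b in zip(v, w))

print(weight("01101"))             # 3
print(distance("01101", "00111"))  # 2
```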
Complete (CMLD): decode a received word w as a codeword v ∈ C closest to w; if several codewords are equally close, arbitrarily choose one of them.
Incomplete (IMLD): decode w as v only if v is the unique closest codeword; if there is a tie, request retransmission.
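A small IMLD sketch under the definition above (Python; the function name `imld` and the sample code C are my own illustrative choices). It returns the unique nearest codeword, or None to signal a retransmission request:

```python
def distance(v, w):
    return sum(a != b for a, b in zip(v, w))

def imld(C, w):
    """Incomplete MLD: decode w as the unique closest codeword, else None (retransmit)."""
    dists = {v: distance(v, w) for v in C}
    best = min(dists.values())
    nearest = [v for v, d in dists.items() if d == best]
    return nearest[0] if len(nearest) == 1 else None

C = ["0000", "1010", "0111"]
print(imld(C, "0010"))  # '0000' and '1010' tie at distance 1 -> None (retransmit)
print(imld(C, "0110"))  # '0111' is the unique nearest codeword (distance 1)
```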
Reliability: the probability that, if v is sent over a BSC of reliability p, IMLD concludes that v was sent.
$\theta_{p} (C, v) = \sum_{w \in L(v)} \phi_{p} (v, w)$ where $L(v) = \lbrace w \in K^{n} \enspace | \enspace \text{IMLD concludes } v \text{ was sent when } w \text{ is received} \rbrace$
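A brute-force sketch of $\theta_{p} (C, v)$ (Python, illustrative names): enumerate every w ∈ Kⁿ, keep those that IMLD decodes back to v (i.e. w is strictly closer to v than to any other codeword), and sum $\phi_{p} (v, w)$ over them:

```python
from itertools import product

def distance(v, w):
    return sum(a != b for a, b in zip(v, w))

def phi_p(p, v, w):
    n, d = len(v), distance(v, w)
    return p ** (n - d) * (1 - p) ** d

def theta_p(p, C, v):
    """Probability that IMLD correctly concludes v was sent (sum over L(v))."""
    total = 0.0
    for w in ("".join(bits) for bits in product("01", repeat=len(v))):
        if distance(v, w) < min(distance(u, w) for u in C if u != v):  # w in L(v)
            total += phi_p(p, v, w)
    return total

C = ["000", "111"]
print(theta_p(0.9, C, "000"))  # 0.9^3 + 3 * 0.9^2 * 0.1 = 0.972
```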
error pattern is u = v + w (componentwise sum mod 2) if v ∈ C is sent and w ∈ Kⁿ is received.
error pattern u is detected if v + u is not a codeword for every v ∈ C. error patterns that can't be detected are exactly the sums of two codewords.
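A quick check of the detection criterion on a small example (Python; the code C here is just the length-3 even-weight code): u is detected iff v + u is not a codeword for every v ∈ C, and the undetected patterns come out as exactly the sums of two codewords:

```python
from itertools import product

def add(v, w):
    """Componentwise sum mod 2 of two binary words."""
    return "".join("1" if a != b else "0" for a, b in zip(v, w))

def detects(C, u):
    """C detects error pattern u iff v + u is not a codeword for every v in C."""
    return all(add(v, u) not in C for v in C)

C = {"000", "011", "101", "110"}
patterns = ["".join(b) for b in product("01", repeat=3)]
undetected = {u for u in patterns if not detects(C, u)}
sums_of_codewords = {add(v, w) for v in C for w in C}
print(undetected == sums_of_codewords)  # True
print(sorted(undetected))               # ['000', '011', '101', '110']
```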
distance of code: smallest d(v, w) over all pairs of distinct codewords v, w ∈ C.
Code of distance d detects all error patterns of weight ≤ d-1, and there is at least one error pattern of weight d it does not detect.
t-error-detecting code: detects all error patterns of weight at most t and does not detect at least one pattern of weight t+1. So a code of distance d is a (d-1)-error-detecting code; see the sketch below.
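Sketch of the distance/detection claims on the same example code (Python, illustrative names): compute d, then check that every pattern of weight 1..d-1 is detected and at least one pattern of weight d is not:

```python
from itertools import combinations, product

def add(v, w):
    return "".join("1" if a != b else "0" for a, b in zip(v, w))

def detects(C, u):
    return all(add(v, u) not in C for v in C)

C = {"000", "011", "101", "110"}
d = min(sum(a != b for a, b in zip(v, w)) for v, w in combinations(C, 2))
print(d)  # 2

patterns = ["".join(b) for b in product("01", repeat=3)]
print(all(detects(C, u) for u in patterns if 1 <= u.count("1") <= d - 1))  # True
print(any(not detects(C, u) for u in patterns if u.count("1") == d))       # True
```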
Code C corrects error pattern u if, ∀v ∈ C, v + u is closer to v than to any other codeword.
Code of distance d corrects all error patterns of weight $\leq \lfloor \frac{d-1}{2} \rfloor$, and at least one error pattern of weight $1 + \lfloor \frac{d-1}{2} \rfloor$ is not corrected.
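And a sketch of the correction claims (Python; the repetition code here is just an example): a code of distance d = 3 should correct every error pattern of weight ≤ ⌊(d-1)/2⌋ = 1, while at least one pattern of weight 2 is not corrected:

```python
from itertools import product

def distance(v, w):
    return sum(a != b for a, b in zip(v, w))

def add(v, w):
    return "".join("1" if a != b else "0" for a, b in zip(v, w))

def corrects(C, u):
    """C corrects u iff for every v in C, v + u is strictly closer to v than to any other codeword."""
    return all(distance(add(v, u), v) < min(distance(add(v, u), x) for x in C if x != v)
               for v in C)

C = ["000", "111"]  # distance 3, so floor((3-1)/2) = 1
patterns = ["".join(b) for b in product("01", repeat=3)]
print(all(corrects(C, u) for u in patterns if u.count("1") <= 1))      # True
print(any(not corrects(C, u) for u in patterns if u.count("1") == 2))  # True
```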