LEADER |
01604 am a22001693u 4500 |
001 |
137931 |
042 |
|
|
|a dc
|
100 |
1 |
0 |
|a Duffy, Ken R.
|e author
|
700 |
1 |
0 |
|a Li, Jiange
|e author
|
700 |
1 |
0 |
|a Médard, Muriel
|e author
|
245 |
0 |
0 |
|a Guessing noise, not code-words
|
260 |
|
|
|b IEEE,
|c 2021-11-09T15:46:02Z.
|
856 |
|
|
|z Get fulltext
|u https://hdl.handle.net/1721.1/137931
|
520 |
|
|
|a © 2018 IEEE. We introduce a new algorithm for Maximum Likelihood (ML) decoding for channels with memory. The algorithm is based on the principle that the receiver rank-orders noise sequences from most likely to least likely. Subtracting noise from the received signal in that order, the first instance that results in an element of the code-book is the ML decoding. In contrast to traditional approaches, this novel scheme has the desirable property that it becomes more efficient as the code-book rate increases. We establish that the algorithm is capacity-achieving for randomly selected code-books. When the code-book rate is less than capacity, we identify asymptotic error exponents as the block length becomes large. When the code-book rate is beyond capacity, we identify asymptotic success exponents. We determine properties of the complexity of the scheme in terms of the number of computations the receiver must perform per block symbol. Worked examples are presented for binary memoryless and Markovian noise. These demonstrate that block lengths that offer a good complexity-rate tradeoff are typically smaller than the reciprocal of the bit error rate.
|
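The 520 abstract describes decoding by guessing noise rather than searching code-words. As an illustrative sketch only (not the authors' implementation): for a binary symmetric channel with flip probability below 1/2, rank-ordering noise sequences from most to least likely amounts to enumerating error patterns by increasing Hamming weight, subtracting (XOR-ing) each from the received word, and stopping at the first result that lies in the code-book. The function name `grand_decode` and the toy code-book below are hypothetical.

```python
import itertools

def grand_decode(received, codebook, n):
    """Sketch of noise-guessing ML decoding for a BSC with p < 1/2.

    Error patterns are tried in order of increasing Hamming weight,
    which is the most-likely-first noise ordering for this channel.
    The first pattern whose removal yields a code-word gives the
    ML decoding.
    """
    for weight in range(n + 1):
        # all error patterns with exactly `weight` flipped positions
        for positions in itertools.combinations(range(n), weight):
            noise = [0] * n
            for i in positions:
                noise[i] = 1
            # subtract the guessed noise: over GF(2) this is XOR
            candidate = tuple(r ^ e for r, e in zip(received, noise))
            if candidate in codebook:
                return candidate
    return None  # exhausted all patterns (cannot happen if codebook is nonempty)

# Toy usage with a hypothetical two-word code-book of length 4:
codebook = {(0, 0, 0, 0), (1, 1, 1, 1)}
decoded = grand_decode((0, 1, 0, 0), codebook, 4)
# the single-bit-flip pattern on position 1 recovers (0, 0, 0, 0)
```

The abstract's efficiency claim is visible here: the work done is governed by how many noise guesses are needed before a hit, not by the size of the code-book, so a denser (higher-rate) code-book terminates the search sooner.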
546 |
|
|
|a en
|
655 |
7 |
|
|a Article
|
773 |
|
|
|t 10.1109/isit.2018.8437648
|