Error Detection and Error Correction

Training material of NHK-CTI
2007.7.24  NHK Communications Training Institute  Hideo Tsuji  [email protected]

Contents

  • Concept of error correction
  • Fundamentals of error correction
  • Various error correction codes
  • Applications in the broadcasting field

Concept of error correction

Program production to transmission for broadcasting

[Diagram: studio, location, and live field pick-up feed into recording and editing, then transmission, and finally reception]

Error correction on the channel

[Diagram: information (video/audio/data) → source coding (data reduction) → channel coding (convert to a signal suitable for transmission) → modulation → channel (satellite, terrestrial), where noise causes errors → demodulation → channel decoding → source decoding → received information]

Example of an error in a digital signal

[Waveform figure: an original signal 0 1 0 0 1 0 1 0 is sliced at a threshold level]

  • Small noise: received as 0 1 0 0 1 0 1 0; every bit stays on the correct side of the threshold.
  • Large noise: received as 0 1 0 1 1 0 1 0; one bit crosses the threshold and is flipped.

Advantage of digitization: only the distinction between 0 and 1 matters, so the signal is strong against noise and high quality is maintained, in combination with error correction technology.

Error factors generated on the channel

Random errors
  • Generated irregularly in the transmitted bit sequence
  • Caused by thermal noise and waveform distortion

Burst errors
  • Concentrated locally and occurring continuously
  • Caused by loss of synchronization, impulse noise, and persistent attenuation such as fading

Quality
  • Evaluated by BER (Bit Error Rate), e.g. 10^-3, 10^-6

Error control methods in communication systems

One way to reduce the influence of noise is to boost the transmission power, but this wastes energy and raises the probability of interference. Instead, redundancy is added to the transmitted data according to a rule. Roughly, there are two methods to control errors on the channel:

  • ARQ (Automatic Repeat Request): issue a repeat request when a transmission error is detected on the reception side.
  • FEC (Forward Error Correction): correct the error on the reception side while decoding.

ARQ & FEC

ARQ (Automatic Repeat Request): if the receiver detects an error, it requests retransmission from the transmitter. ARQ has difficulty dealing with a large number of receivers. It is used by packet-type communication, including TCP/IP.

FEC (Forward Error Correction): if the receiver detects an error, it corrects it using the redundant data. FEC is used by broadcasting, mobile communication, and storage media (DVD/CD-ROM).

Sequence of ARQ & FEC

ARQ: the transmitter (TX) sends data-1, data-2, data-3 in turn; the receiver (RX) returns ACK when a block arrives without error and NAK when an error is detected, in which case that block (here data-2) is retransmitted. Time is lost on every retransmission.

FEC: TX sends data-1, data-2, data-3 continuously; RX performs error correction on each block as it decodes, with no return channel and no retransmission delay.

Features of ARQ & FEC

ARQ
  Advantages:     high reliability; strong against unexpected errors; simple decoder
  Disadvantages:  a return circuit is needed; sometimes large delay; repeated
                  sending is necessary over a bad channel

FEC
  Advantages:     no return circuit is needed; easy to control as a system; small delay
  Disadvantages:  difficult to deal with very long burst errors; large redundancy;
                  complex decoder

Error correction scheme

Principle: error correction is performed by adding redundancy to the transmitted information.

Encoder: adds redundancy (parity check bits) according to some rule.
Decoder: detects and corrects errors using the parity check bits.

What is error detection and error correction?

Errors may occur during transmission, making the data impossible to decode:

  Coding: A → 0100    (external noise)    Decoding: 010? → ?

Adding redundant data before transmission makes it possible to detect and correct errors when decoding (error-correcting coding):

  A → 0100 → 0100 010   (transmit)   010? 010 → error correction → 0100 → A

Parity check

Add a parity bit to the information data according to the following rule:
  • if the number of "1"s in the data is odd, add a parity bit "1"
  • if the number of "1"s in the data is even, add a parity bit "0"

This makes it possible to detect a 1-bit error.

  Information sequence   Transmitted sequence   Received sequence
  10010001               10010001 1             100110011   ← odd number of "1"s: error detected
  00101110               00101110 0             101011000   ← even number of "1"s: the errors go undetected
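The parity rule above can be sketched in Python. This is a minimal illustration, not code from the slides; the names `add_parity` and `has_error` are hypothetical.

```python
def add_parity(bits):
    """Append an even-parity bit: 1 if the count of 1s is odd, else 0."""
    return bits + [sum(bits) % 2]

def has_error(received):
    """A parity violation (odd number of 1s) reveals a 1-bit error."""
    return sum(received) % 2 == 1
```

As the second row of the table shows, a 2-bit error preserves even parity, so `has_error` cannot see it.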

Example of a parity check code (ASCII)

ASCII (American Standard Code for Information Interchange) comprises 7-bit codes (0 to 127). Adding 1 parity check bit makes 8 bits. The character "A" is expressed in ASCII as (100 0001); with even parity it becomes (0100 0001), and with odd parity (1100 0001).

The receiver side checks the number of "1"s in every 8 bits.

ASCII code

  Character   Binary     Decimal   Hexadecimal
  A           100 0001   65        41
  B           100 0010   66        42
  C           100 0011   67        43
  D           100 0100   68        44
  E           100 0101   69        45
  F           100 0110   70        46

  a           110 0001   97        61
  b           110 0010   98        62
  c           110 0011   99        63
  d           110 0100   100       64
  e           110 0101   101       65
  f           110 0110   102       66

Example of a checksum (ISBN)

ISBN (International Standard Book Number): the last digit is a check digit.

Calculation (modulus 11, with weights 10 down to 2):
  4-87966-931-?
  4×10 + 8×9 + 7×8 + 9×7 + 6×6 + 6×5 + 9×4 + 3×3 + 1×2
    = 40 + 72 + 56 + 63 + 36 + 30 + 36 + 9 + 2 = 344
  344 ÷ 11 = 31 remainder 3
  11 − 3 = 8, so the check digit is 8
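The calculation above can be written as a short Python sketch (illustrative only; the function names are hypothetical, and a real ISBN writes a check value of 10 as 'X'):

```python
def isbn10_check_digit(digits9):
    """Weighted sum of the first 9 digits, weights 10 down to 2, mod 11."""
    s = sum(w * d for w, d in zip(range(10, 1, -1), digits9))
    return (11 - s % 11) % 11   # 10 would be written as 'X' in a real ISBN

def isbn10_is_valid(digits10):
    return isbn10_check_digit(digits10[:9]) == digits10[9]
```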

Example of an error-detecting code

Is "0-440-21861-5" a correct ISBN?
  0×10 + 4×9 + 4×8 + 0×7 + 2×6 + 1×5 + 8×4 + 6×3 + 1×2
    = 0 + 36 + 32 + 0 + 12 + 5 + 32 + 18 + 2 = 137
  137 ÷ 11 = 12 remainder 5
  11 − 5 = 6, so the calculated check digit is 6, not 5: this code is wrong!

Example of errors (VTR)

[Images: an expanded portion of a picture before and after error correction; black and white spots are errors (in the case of pixel-wise coding)]

Example of errors (broadcasting)

[Images: an expanded portion of a picture before and after error correction; errors appear as rectangular blocks (in the case of MPEG-2 TS video coding)]

Example of errors

After transmission or recording, the received message can differ from the original, spatially or temporally apart:

  Original message:  "This is a pen"   →   Received message:  "This is b pen"
  Original sequence: 1011101011        →   Received sequence: 1001101011

In digital form, an error flips a bit: 0→1 or 1→0.

Error detection (example of a sentence)

  Original message:   "This is a pen"
  Received message:   "This is a pbn"   ← error
  Recognized message: ?

"pbn" is not a word, so the error is detected, but it cannot be corrected: the candidates for "pbn" are pen, pan, pun, pin, etc., and there are several equally similar words.

Error correction (example of a sentence)

  Original message:   "This is a pen"
  Received message:   "This is b pen"   ← error
  Recognized message: "This is a pen"

The grammar error is detected and corrected: we can correct the sentence because we know the grammar.

Error in digital transmission

  Original sequence: 1001101011
  Received sequence: 1011101011   ← error

The error can be neither detected nor corrected: it is unknown whether the received sequence is right or not, because there is no rule to check it against.

Example of creating codewords (digital transmission)

Rule: add one bit to the original message so that the number of "1"s becomes even.

  Original sequence [number of "1"s]   Added bit   Codeword [number of "1"s]
  0101 [2 (even)]                      + 0         01010 [2 (even)]
  1011 [3 (odd)]                       + 1         10111 [4 (even)]

Error detection (digital transmission)

  Original sequence   Codeword sequence   Received sequence [number of "1"s]   Recognized sequence
  0101                01010               01110 [3 (odd)]                      ?
  1011                10111               10101 [3 (odd)]                      ?

An odd number of "1"s violates the rule (codewords have an even number of "1"s), so the errors are detected, but they cannot be corrected.

Example of creating codewords (digital transmission)

Rule: add the same two bits to the original one bit.

  Original bit   Added bits   Codeword sequence
  0              + 00         000
  1              + 11         111

Error correction (digital transmission)

  Original bit   Codeword   Received   Nearest codeword   Recognized bit
  0              000        001        000                0
  1              111        110        111                1
  0              000        100        000                0
  1              111        101        111                1
  1              111        011        111                1

A received word such as 001 is an invalid codeword, so the error is detected; decoding finds the nearest codeword (000 or 111) and corrects the error.
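The three-bit repetition code above can be sketched in a few lines of Python (illustrative only; the names `encode` and `decode` are not from the slides):

```python
def encode(bit):
    """Repeat each information bit three times."""
    return [bit] * 3

def decode(triple):
    """Majority vote: the nearest codeword (000 or 111) wins."""
    return 1 if sum(triple) >= 2 else 0
```

Majority voting is exactly "find the nearest codeword" for this code, since flipping one bit of 000 or 111 still leaves two matching bits.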

Example of error correction (two-dimensional parity)

  1. Decide each H so that the sum of its row is even.
  2. Decide each V so that the sum of its column is even.
  3. Decide P so that the sum of all H, V, and P is even.

  X11 X12 X13 | H1
  X21 X22 X23 | H2
  X31 X32 X33 | H3
  ------------+---
  V1  V2  V3  | P

If a single bit is flipped, exactly one row sum and one column sum become odd; the error lies at their intersection, so it can be located and corrected.
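The row/column intersection rule can be sketched as follows. This is a minimal illustration, not the slide's circuit; `locate_single_error` is a hypothetical name, and the example block in the usage is constructed for illustration.

```python
def locate_single_error(block):
    """block: rows of bits including the parity row and parity column.
    Returns (row, column) of a single flipped bit, or None if all parities hold."""
    bad_rows = [r for r, row in enumerate(block) if sum(row) % 2 == 1]
    bad_cols = [c for c in range(len(block[0]))
                if sum(row[c] for row in block) % 2 == 1]
    if len(bad_rows) == 1 and len(bad_cols) == 1:
        return bad_rows[0], bad_cols[0]   # the intersection pinpoints the error
    return None
```

Flipping the located bit back restores the original block.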

Key point

For error correction or detection, create codewords from the information bits (the original message) according to some rule.

Is an error-correcting code almighty?

Miscorrection (example of a sentence)

  Original message:   "This is a pen"
  Received message:   "This is a pbn"   ← error
  Recognized message: "This is a pun"

"pbn" is not a word, so the error is detected and the decoder tries to correct it. The candidates for "pbn" are pen, pan, pun, pin, etc.; forcing a correction to the nearest word can pick the wrong one.

Miscorrection (digital transmission)

  Original bit   Codeword   Errors   Received   Nearest codeword   Recognized bit
  0              000        2 bits   101        111                1   ← miscorrection
  1              111        1 bit    110        111                1   ← corrected properly
  0              000        1 bit    100        000                0   ← corrected properly
  1              111        1 bit    101        111                1   ← corrected properly
  1              111        1 bit    011        111                1   ← corrected properly

An invalid codeword is always detected, and decoding finds the nearest codeword (000 or 111); but when 2 bits of a 3-bit codeword are wrong, the nearest codeword is the wrong one, and miscorrection occurs.

Undetected error (example of a sentence)

  Original message:   "This is a pen"
  Received message:   "This is a pun"   ← error
  Recognized message: "This is a pun"

The grammar of the received sentence is correct, so the error is not even detected.

Undetected error (digital transmission)

  Original sequence   Codeword sequence   Received sequence [number of "1"s]   Recognized sequence
  0101                01010               11110 [4 (even)]                     1111
  1011                10111               10100 [2 (even)]                     1010

The number of "1"s is even, so the received sequences are themselves valid codewords: the errors go undetected.

Key point

The number of correctable errors is finite. If the number of errors exceeds the error correction capability, miscorrection or undetected errors may occur.

Fundamentals of error correction

Example of codewords and Hamming distance

Codewords by even parity:

  Information [number of "1"s]   Added bit   Codeword [number of "1"s]
  00 [0 (even)]                  + 0         000 [0 (even)]
  01 [1 (odd)]                   + 1         011 [2 (even)]
  10 [1 (odd)]                   + 1         101 [2 (even)]
  11 [2 (even)]                  + 0         110 [2 (even)]

Hamming distance dH: the number of bit positions in which two codewords differ.

  Codeword 1 0 1
  Codeword 1 1 0   → dH = 2
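These two definitions translate directly into Python (an illustrative sketch; the function names are not from the slides):

```python
from itertools import combinations

def hamming_distance(a, b):
    """Number of positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_distance(codewords):
    """Smallest Hamming distance over all pairs of codewords."""
    return min(hamming_distance(a, b) for a, b in combinations(codewords, 2))
```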

Minimum Hamming distance

The minimum Hamming distance is the smallest Hamming distance over all pairs of codewords.

  Codewords: 000, 011, 101, 110   → minimum Hamming distance = 2

All codewords differ in at least 2 bits, so a one-bit error never produces another codeword: a one-bit error can always be detected.

Geometrical allocation of codewords

[Figure: the eight 3-bit words placed on the corners of a cube with axes X, Y, Z; the codewords 000, 011, 101, 110 and the invalid codewords occupy alternating corners, each edge being Hamming distance dH = 1]

  • In case of 1 bad bit: the word at Hamming distance one is an invalid codeword.
  • In case of 2 bad bits: the word at Hamming distance two is another codeword.

Image of the code space

The code space is finite. In the case of the Hamming code (7,4), the total number of words is 128 (= 2^7), of which 16 (= 2^4) are codewords and 112 (= 128 − 16) are invalid codewords.

Minimum Hamming distance dmin: the smallest Hamming distance between any pair of code vectors in the code. It determines the error detecting and error correcting capability.

Error detection and minimum Hamming distance

Hamming code (7,4) codewords (information bits, parity bits):

  0000 000   0100 111   1000 101   1100 010
  0001 011   0101 100   1001 110   1101 001
  0010 110   0110 001   1010 011   1110 100
  0011 101   0111 010   1011 000   1111 111

The minimum Hamming distance is 3: no pattern of up to 2 bit errors can turn a codeword into another codeword, so 2-bit error detection is possible.

Error detection and minimum Hamming distance

For codewords Cn = (x1, x2, ...., xN), the error detecting capability s satisfies

  dmin ≥ s + 1

[Figure: spheres of radius s around the codewords C1, C2, C3; any error of at most s bits lands on an invalid codeword, so errors of up to s bits can be detected]

Error correction and minimum Hamming distance

For the Hamming code (7,4), with minimum Hamming distance 3, any single bit error leaves the received word nearest to the original codeword, so single error correction is possible.

Error correction and minimum Hamming distance

For codewords Cn = (x1, x2, ...., xN), the error correcting capability t satisfies

  dmin ≥ 2t + 1

[Figure: spheres of radius t around the codewords C1, C2; the spheres do not overlap when dmin ≥ 2t + 1, so errors of at most t bits can be corrected]
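Both bounds can be checked numerically on the Hamming (7,4) codeword list given earlier (an illustrative sketch; the names `CODEWORDS` and `dmin` are not from the slides):

```python
from itertools import combinations

# The 16 codewords of Hamming (7,4): 4 information bits followed by 3 parity bits.
CODEWORDS = [
    "0000000", "0001011", "0010110", "0011101",
    "0100111", "0101100", "0110001", "0111010",
    "1000101", "1001110", "1010011", "1011000",
    "1100010", "1101001", "1110100", "1111111",
]

def dmin(words):
    """Minimum Hamming distance over all pairs of codewords."""
    return min(sum(x != y for x, y in zip(a, b))
               for a, b in combinations(words, 2))
```

With dmin = 3, the bounds give s = dmin − 1 = 2 detectable bits and t = (dmin − 1) // 2 = 1 correctable bit.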

Various error correction codes

Classification of error correction codes

Block codes
  • Random error correction: Hamming code, BCH code, Golay code, difference-set cyclic code
  • Burst error correction: Reed-Solomon code, Fire code, Wyner-Ash code

Convolutional codes
  • Random error correction: sequential decoding; punctured code / Viterbi decoding; turbo code / soft-decision decoding
  • Burst error correction: Hagelbarger code, Iwadare code

Combined constructions: concatenated code, product code

Types of error correction code

Block code
  • The information message is divided into blocks of constant length, and each block is coded independently of the others.
  • The system can be based on algebra.
  • When t is the number of correctable errors, a block code is often written (n, k, t).

Convolutional code
  • Information coded in the past influences the current coding.
  • The error correction capability is higher than block coding.
  • Viterbi decoding is used for decoding.

Block code & convolutional code

Block codeword c(n, k): n total bits = k information bits + (n − k) check bits.
Key word: code rate = k/n.

Convolutional coder (key words: code rate 1/2, constraint length 7):

[Circuit figure: the input bit enters a chain of six delay elements D; two modulo-2 adders tap the shift register according to G1 = 171 (octal) to form the X output and G2 = 133 (octal) to form the Y output]
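A sketch of the rate-1/2, constraint-length-7 encoder with the generators named on the slide. This is an illustration, not the slide's circuit; it assumes the common convention that the newest input bit aligns with the most significant generator bit.

```python
G1, G2 = 0o171, 0o133   # generator polynomials in octal, as on the slide

def conv_encode(bits):
    """Rate-1/2 convolutional encoder, constraint length K = 7."""
    state = 0
    out = []
    for b in bits:
        state = (state >> 1) | (b << 6)             # newest bit enters at the left
        out.append(bin(state & G1).count("1") % 2)  # X output (taps G1)
        out.append(bin(state & G2).count("1") % 2)  # Y output (taps G2)
    return out
```

Feeding a single 1 followed by zeros reads the generator taps back out, which is a quick sanity check of the wiring.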

Expression as a polynomial

The coefficient of the n-th power of x in the polynomial expression shows the corresponding bit position in the sequence. A code comprising the n-bit sequence c = (c1, c2, c3, ....., cn) is expressed by the polynomial

  c(x) = c1 x^(n-1) + c2 x^(n-2) + ..... + c(n-1) x + cn

Example: c = (1,1,0,0,0,1)

  c(x) = x^5 + x^4 + 1

Operations for code processing

Operations are on 0, 1, and x^n (n = 0, 1, 2, 3, ....).

Calculation on 0 and 1 (modulo 2, exclusive OR):
  Addition:        0+0=0, 0+1=1, 1+0=1, 1+1=0
  Multiplication:  0×0=0, 0×1=0, 1×0=0, 1×1=1

Operations including x:
  Addition:        0+x=x, 1+x=1+x, x+x=0
  Multiplication:  0×x=0, 1×x=x, x×x=x^2

Encoding for single error detection

  Information bit sequence: i = (i1, i2, ....., ik)
  Codeword sequence:        c = (c1, c2, ....., ck, ck+1)   (ck+1 is the redundant bit)

Coding (ck+1 is the parity check bit):

  c1 = i1,  c2 = i2,  ....,  ck = ik
  ck+1 = c1 + c2 + ..... + ck

Parity check equation:

  c1 + c2 + ..... + ck + ck+1 = 0

The parity check equation is the necessary and sufficient condition for a sequence to be a codeword.

Hamming code

Hamming code (7,4)

  Information bit sequence: i = (i1, i2, i3, i4)
  Codeword sequence:        c = (c1, c2, c3, c4, c5, c6, c7)

Coding (c5, c6, c7 are the parity check bits):

  c1 = i1,  c2 = i2,  c3 = i3,  c4 = i4
  c5 = c1 + c2 + c3
  c6 = c2 + c3 + c4
  c7 = c1 + c2 + c4
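The encoder above is just three XORs; a minimal Python sketch (the function name is hypothetical):

```python
def hamming74_encode(i1, i2, i3, i4):
    """Hamming (7,4) encoder using the slide's parity equations (mod 2)."""
    c5 = (i1 + i2 + i3) % 2
    c6 = (i2 + i3 + i4) % 2
    c7 = (i1 + i2 + i4) % 2
    return [i1, i2, i3, i4, c5, c6, c7]
```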

How to find the parity check bits

Append three zero bits to (c1, c2, c3, c4) and divide by the generator bit sequence. A codeword must be divisible by the generator bit sequence, which yields the conditions

  c1 + c2 + c3 + c5 = 0
  c2 + c3 + c4 + c6 = 0
  c1 + c2 + c4 + c7 = 0

Therefore the parity check bits become

  c5 = c1 + c2 + c3
  c6 = c2 + c3 + c4
  c7 = c1 + c2 + c4

Parity check equations

The necessary and sufficient condition for a sequence to be a codeword is that the parity check equations equal 0. For Hamming (7,4):

  c1 + c2 + c3 + c5 = 0
  c2 + c3 + c4 + c6 = 0
  c1 + c2 + c4 + c7 = 0

Channel model

A noise source adds an error pattern e to the transmitted sequence c:

  transmitted c = (1,0,0,1,1),  error e = (0,0,1,0,0)  →  received r = c + e = (1,0,1,1,1)

  r = c + e = (c1 + e1, c2 + e2, ....., ck+1 + ek+1) = (r1, r2, ....., rk+1)

Syndrome of Hamming code (7,4)

The syndrome s is obtained by substituting the elements of the received sequence r into the parity check equations:

  s1 = r1 + r2 + r3 + r5 = (c1+e1) + (c2+e2) + (c3+e3) + (c5+e5) = e1 + e2 + e3 + e5
  s2 = r2 + r3 + r4 + r6 = (c2+e2) + (c3+e3) + (c4+e4) + (c6+e6) = e2 + e3 + e4 + e6
  s3 = r1 + r2 + r4 + r7 = (c1+e1) + (c2+e2) + (c4+e4) + (c7+e7) = e1 + e2 + e4 + e7

(since c itself satisfies each parity check equation, the c terms sum to 0)

Syndromes for single errors

Substituting each single-error pattern into the syndrome equations s1, s2, s3 gives a distinct syndrome pattern:

  Error pattern           Syndrome (s1 s2 s3)
  e1 = 1 (others 0)       1 0 1
  e2 = 1                  1 1 1
  e3 = 1                  1 1 0
  e4 = 1                  0 1 1
  e5 = 1                  1 0 0
  e6 = 1                  0 1 0
  e7 = 1                  0 0 1
  no error                0 0 0
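The syndrome table turns directly into a single-error decoder. A minimal sketch (illustrative names, bit positions 0-based):

```python
# Syndrome (s1, s2, s3) → 0-based position of the single flipped bit.
SYNDROME_TO_POSITION = {
    (1, 0, 1): 0, (1, 1, 1): 1, (1, 1, 0): 2, (0, 1, 1): 3,
    (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6,
}

def hamming74_correct(r):
    """Evaluate the three parity-check equations on the received word
    r = [r1..r7] and flip the bit the syndrome points at."""
    s = ((r[0] + r[1] + r[2] + r[4]) % 2,
         (r[1] + r[2] + r[3] + r[5]) % 2,
         (r[0] + r[1] + r[3] + r[6]) % 2)
    if s != (0, 0, 0):
        r = r.copy()
        r[SYNDROME_TO_POSITION[s]] ^= 1   # reverse the erroneous bit
    return r
```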

Example of Hamming code (7,4)

  Information code: a1 a2 a3 a4 = 1 0 1 1
  Parity code: p1 = a1 ⊕ a2 ⊕ a3 = 0,  p2 = a2 ⊕ a3 ⊕ a4 = 0,  p3 = a1 ⊕ a2 ⊕ a4 = 0
  Transmission code: 1 0 1 1 0 0 0

Degradation through the channel flips a3:

  Reception code: 1 0 0 1 0 0 0

The receiver recomputes q1 = a1 ⊕ a2 ⊕ a3, q2 = a2 ⊕ a3 ⊕ a4, q3 = a1 ⊕ a2 ⊕ a4 from the received bits and forms the syndrome s1 = q1 ⊕ p1, s2 = q2 ⊕ p2, s3 = q3 ⊕ p3 = (1, 1, 0).

  Syndrome (s1 s2 s3)   Error position
  1 0 1                 a1
  1 1 1                 a2
  1 1 0                 a3   ← detected here
  0 1 1                 a4
  1 0 0                 p1
  0 1 0                 p2
  0 0 1                 p3
  0 0 0                 none

The syndrome value points at the erroneous bit position, so a3 is corrected.

Example of a Hamming coder and decoder

[Circuit figures: the coder holds c4 c3 c2 c1 in a register and computes c7 c6 c5 with exclusive-OR gates; the decoder recomputes the parity bits from the received c4 c3 c2 c1 and compares them with the received c7 c6 c5]

Important point (CRC)

As an example of the rule: if the information polynomial is divided by some agreed signal (the generator polynomial) and the remainder is appended to the information, the result becomes divisible. This is the CRC (Cyclic Redundancy Check):
  • divisible → no error
  • indivisible → error

Example of CRC (decimal principle)

  • The information to transmit is 5,678; as a rule, the divisor 98 is agreed beforehand, and 2 redundant digits are added.
  • Shift: 5,678 → 567,800. Dividing 567,800 by 98 leaves remainder 86.
  • Add 98 − 86 = 12: 567,800 + 12 = 567,812 is divisible by 98.
  • The transmitted information is 567,812.
  • On the receiver side, divide the received data by 98: if there is no remainder, there is no error, so discard the last 2 digits to recover the true data; otherwise the data is erroneous.
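The decimal principle above can be run directly (an illustrative sketch; the function names are not from the slides):

```python
DIVISOR = 98

def protect(number):
    """Shift left by two decimal digits, then top up to a multiple of 98."""
    shifted = number * 100
    remainder = shifted % DIVISOR
    return shifted if remainder == 0 else shifted + (DIVISOR - remainder)

def check(received):
    """No remainder means no detected error."""
    return received % DIVISOR == 0
```

Note that here the appended "check digits" are 12, not the remainder 86 itself; either convention works as long as both sides agree, and binary CRC (where adding and subtracting coincide) appends the remainder directly.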

Coding and decoding by generator polynomial

Encoding: form the codeword polynomial c(x), divisible by the generator polynomial g(x):

  c(x) = x^(n-k) i(x) + p(x) = q(x) g(x)

Decoding: divide the received polynomial r(x), which contains the error polynomial e(x), by g(x):

  s(x) = r(x) mod g(x) = {c(x) + e(x)} mod g(x) = e(x) mod g(x)

From the syndrome polynomial s(x), find the error pattern, correct the received message, and recover the right codeword.

How to create a CRC

  • i(x): information polynomial for i = (i1, i2, ...., ik), k bits.
  • f(x): shift i(x) to the left by (n−k) bits to make room for the CRC: f(x) = x^(n-k) i(x).
  • Dividing f(x) by the generator polynomial g(x) gives quotient q(x) and remainder p(x), so f(x) = q(x) g(x) + p(x).
  • Add p(x) to both sides to examine the codeword polynomial c(x) = f(x) + p(x):
      left side:  f(x) + p(x) = x^(n-k) i(x) + p(x)
      right side: q(x) g(x) + p(x) + p(x) = q(x) g(x)   (since p(x) + p(x) = 0)
  • Therefore the codeword polynomial c(x) = x^(n-k) i(x) + p(x) is divisible by g(x).
  • Codeword: c = (i1, i2, ...., ik, p1, p2, ...., p(n-k)), n bits.

Example of CRC

  • Information code i = (1,0,1,0); polynomial expression i(x) = x^3 + x.
  • Shift i(x) by (n−k = 3) bits to the left: f(x) = x^3 i(x) = x^6 + x^4.
  • Divide f(x) by the generator polynomial g(x) = x^3 + x + 1 to find the remainder p(x), the CRC polynomial:

      f(x) / g(x) = (x^6 + x^4) / (x^3 + x + 1) = x^3 + 1, remainder p(x) = x + 1

  • Therefore c(x) = f(x) + p(x) = x^6 + x^4 + x + 1, i.e. the codeword is c = (1,0,1,0,0,1,1).
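GF(2) long division is XOR-and-shift; the example above can be sketched as follows (`crc_remainder` is a hypothetical name):

```python
def crc_remainder(message, generator):
    """GF(2) long division. `message` is the bit list already shifted left
    by len(generator)-1 positions; returns the remainder bits."""
    work = list(message)
    for i in range(len(work) - len(generator) + 1):
        if work[i] == 1:
            for j, gbit in enumerate(generator):
                work[i + j] ^= gbit   # subtract (XOR) the generator
    return work[-(len(generator) - 1):]
```

Dividing the finished codeword again leaves a zero remainder, which is exactly the receiver's check.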

CRC generation circuit

[Circuit figure: a division circuit built from shift registers D0, D1, ...., Dn and modulo-2 adders; the feedback taps are the coefficients g0, g1, ...., g(n-1) of the generator polynomial g(x) = g0 + g1 x + g2 x^2 + .... + g(n-1) x^(n-1) + x^n. Input: f(x); output: the quotient q(x); the remainder R(x) is left in the registers D0, D1, D2, ...., Dn]

Review of polynomials

The coefficient of the n-th power of x in the polynomial expression shows the corresponding bit position of the code sequence.

  Code sequence c = (c1, c2, c3, ....., cn):
    c(x) = c1 x^(n-1) + c2 x^(n-2) + ..... + c(n-1) x + cn

  Information sequence i = (i1, i2, i3, ....., ik):
    i(x) = i1 x^(k-1) + i2 x^(k-2) + ..... + i(k-1) x + ik

  Codeword, with generator polynomial g(x) and remainder p(x):
    c(x) = x^(n-k) i(x) + p(x) = q(x) g(x)

Hamming code

A code capable of single error correction. With information bits (ik-1 .... i1 i0) and parity bits (pm-1 .... p1 p0):

  Code length:             n = 2^r − 1 (bits)
  Information bit length:  k = n − m (bits)
  Parity bit length:       m = rt (bits)
  Minimum distance:        dmin = 2t + 1 = 3 (bits)  (t = 1)

  r: integer of three or more;  t: number of correctable errors

Hamming code (7,4)

Code length n = 7, information bit length k = 4, parity bit length m = 3.
Generator polynomial: x^3 + x + 1, expressed as the bit sequence 1011.

  Information code: i3 i2 i1 i0
  Codeword:         i3 i2 i1 i0 p2 p1 p0

The parity bits p = (p2 p1 p0) are the remainder of dividing (i3 i2 i1 i0 0 0 0) by 1011.

How to find the parity check bits

The information bit sequence (i3 i2 i1 i0) is shifted to the left by three bits to give (i3 i2 i1 i0 0 0 0); dividing it by the generator bit sequence gives the remainder. For (1 0 1 0):

  1010000 ÷ 1011 → remainder 011

  Information bits: 1 0 1 0;  parity check bits: 0 1 1

The codeword is 1010011.

Hamming codewords (7,4)

  Information   Codeword                 Information   Codeword
  i3 i2 i1 i0   i3 i2 i1 i0 p2 p1 p0    i3 i2 i1 i0   i3 i2 i1 i0 p2 p1 p0
  0 0 0 0       0 0 0 0  0 0 0          1 0 0 0       1 0 0 0  1 0 1
  0 0 0 1       0 0 0 1  0 1 1          1 0 0 1       1 0 0 1  1 1 0
  0 0 1 0       0 0 1 0  1 1 0          1 0 1 0       1 0 1 0  0 1 1
  0 0 1 1       0 0 1 1  1 0 1          1 0 1 1       1 0 1 1  0 0 0
  0 1 0 0       0 1 0 0  1 1 1          1 1 0 0       1 1 0 0  0 1 0
  0 1 0 1       0 1 0 1  1 0 0          1 1 0 1       1 1 0 1  0 0 1
  0 1 1 0       0 1 1 0  0 0 1          1 1 1 0       1 1 1 0  1 0 0
  0 1 1 1       0 1 1 1  0 1 0          1 1 1 1       1 1 1 1  1 1 1

Error example in Hamming code (7,4)

Information bit sequence (1010) → codeword sequence (1010011).

  • If an error hits the first bit: (1010011) → (0010011).
    Dividing by the generator bit sequence (1011) leaves remainder (101).
  • If an error hits the second bit: (1010011) → (1110011).
    Dividing by the generator bit sequence (1011) leaves remainder (111).
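This remainder-based decoding can be sketched alongside the table on the next slide: each error position leaves its own remainder, so a lookup locates the flipped bit. Illustrative code; the names are hypothetical.

```python
def gf2_remainder(bits, g=(1, 0, 1, 1)):
    """Remainder of the bit sequence divided by the generator 1011 (x^3+x+1)."""
    work = list(bits)
    for i in range(len(work) - len(g) + 1):
        if work[i]:
            for j, gbit in enumerate(g):
                work[i + j] ^= gbit
    return tuple(work[-3:])

# Remainder left by a single flipped bit at each position (leftmost = 0).
REMAINDER_TO_POSITION = {
    (1, 0, 1): 0, (1, 1, 1): 1, (1, 1, 0): 2, (0, 1, 1): 3,
    (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6,
}

def correct(received):
    """Locate a single bit error from the remainder and flip it back."""
    r = gf2_remainder(received)
    fixed = list(received)
    if r != (0, 0, 0):
        fixed[REMAINDER_TO_POSITION[r]] ^= 1
    return fixed
```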

Single bit errors in Hamming code (7,4)

Whatever the codeword, the remainder after dividing by g(x) depends only on the position of the single bit error:

  Error position   i3    i2    i1    i0    p2    p1    p0
  Remainder        101   111   110   011   100   010   001

Error correction by Hamming code

  • Divide the received word by the generator polynomial:
      divisible → no error; indivisible → error detected.
  • The erroneous bit position is judged from the form of the remainder.
  • The error is corrected by reversing (0→1, 1→0) the bit at the erroneous position.

A single error can be corrected; errors of two or more bits lead to miscorrection or go undetected.

Examples of Hamming codes

Single error correcting codes (dmin = 3):

  Hamming (7,4):    generator bits 1011,    generator polynomial x^3 + x + 1
  Hamming (15,11):  generator bits 10011,   generator polynomial x^4 + x + 1
  Hamming (31,26):  generator bits 100101,  generator polynomial x^5 + x^2 + 1

Eb/No vs BER (Hamming code)

[Graph: bit error rate (10^0 down to 10^-6) against Eb/No (0 to 10 dB) for no coding, H(63,57), and H(7,4); coding gains of 0.4 dB and 2.4 dB are indicated for the (63,57) and (7,4) curves]

BER before vs after error correction (Hamming code)

[Graph: bit error rate after error correction (10^0 down to 10^-6) against bit error rate before error correction (10^-3 to 10^-1) for the (63,57) and (7,4) codes]

BCH code
(Bose-Chaudhuri-Hocquenghem)

BCH code

A code capable of correcting errors of two or more bits; the degree of freedom in encoding is high. With information bits (ik-1 .... i1 i0) and parity bits (pm-1 .... p1 p0):

  Code length:             n ≤ 2^r − 1 (bits)
  Information bit length:  k = n − m (bits)
  Parity bit length:       m ≤ rt (bits)
  Minimum distance:        dmin ≥ 2t + 1 (bits)

  r: integer of three or more;  t: number of correctable errors

BCH code (15,7)

BCH (15,7) codeword (r = 4, t = 2): code length n = 15, information bit length k = 7, parity bit length m = 8.
Generator polynomial: g(x) = g1(x) g3(x) = x^8 + x^7 + x^6 + x^4 + 1.

  Information bits: i6 i5 i4 i3 i2 i1 i0
  Codeword:         i6 i5 i4 i3 i2 i1 i0 p7 p6 p5 p4 p3 p2 p1 p0

The parity bits R = (p7 p6 p5 p4 p3 p2 p1 p0) are the remainder of dividing (i6 i5 i4 i3 i2 i1 i0 00000000) by the generator bit sequence 111010001.

BCH code (15,7)

Bit expression of the generator polynomial: 111010001.

  g(x) = x^8 + x^7 + x^6 + x^4 + 1 = (x^4 + x + 1)(x^4 + x^3 + x^2 + x + 1)

Let g1(x) = x^4 + x + 1 and g3(x) = x^4 + x^3 + x^2 + x + 1, so that g(x) = g1(x) g3(x). "The codeword polynomial is divisible by the generator polynomial g(x)" means it is also divisible by both g1(x) and g3(x).
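The factorization can be verified with carry-less (GF(2)) multiplication, where polynomials are bit masks and addition is XOR. Illustrative sketch; the function name is hypothetical.

```python
def gf2_multiply(a, b):
    """Carry-less multiplication of polynomials given as bit masks."""
    result = 0
    while b:
        if b & 1:
            result ^= a   # add (XOR) the shifted copy of a
        a <<= 1
        b >>= 1
    return result

g1 = 0b10011   # x^4 + x + 1
g3 = 0b11111   # x^4 + x^3 + x^2 + x + 1
```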

Single bit errors in BCH code (15,7)

g1(x) = x^4 + x + 1 (10011), g3(x) = x^4 + x^3 + x^2 + x + 1 (11111). The pair of remainders after dividing by g1(x) and g3(x) identifies the position of a single bit error:

  Position   Remainder of g1(x)   Remainder of g3(x)
  i6         1001                 1111
  i5         1101                 1000
  i4         1111                 0100
  i3         1110                 0010
  i2         0111                 0001
  i1         1010                 1111
  i0         0101                 1000
  p7         1011                 0100
  p6         1100                 0010
  p5         0110                 0001
  p4         0011                 1111
  p3         1000                 1000
  p2         0100                 0100
  p1         0010                 0010
  p0         0001                 0001

2-bit errors in BCH code (15,7)

For a 2-bit error, the remainder left by g1(x) is the sum (XOR) of the two single-bit remainders, and likewise for g3(x). The full table lists the remainder pair (R of g1(x), R of g3(x)) for every pair of error positions among i6 .... p0.

[Table: the 105 error-position pairs with their remainder pairs, e.g. (i6, i5) → R of g1(x) = 0100, R of g3(x) = 0111]

Remainder and bit error position in BCH (15,7)

(Lookup table: a 16 × 16 array indexed by the remainder of g1(x) — rows 0000 through 1111 — and the remainder of g3(x) — columns 0000 through 1111; each cell gives the single- or two-bit error position(s) producing that remainder pair.)

Ex. single bit error: i3 ←→ remainder of g1(x): 1110, remainder of g3(x): 0010
Ex. 2 bits error: i5, i3 ←→ remainder of g1(x): 0011, remainder of g3(x): 1010

Example of BCH code

t-bit error correction capability (dmin ≥ 2t+1)

BCH (7,4): t=1, dmin=3
Generator bits: "1011", g(x) = x^3+x+1

BCH (15,7): t=2, dmin=5
Generator bits: "111010001", g(x) = x^8+x^7+x^6+x^4+1

BCH (31,16): t=3, dmin=7
Generator bits: "1000111110101111", g(x) = x^15+x^11+x^10+x^9+x^8+x^7+x^5+x^3+x^2+x+1
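These generators can be checked directly: a cyclic (n,k) code's generator polynomial must divide x^n + 1, and the BCH(15,7) generator must equal g1(x)·g3(x). A minimal sketch in Python (integers as GF(2) bit-polynomials):

```python
def gf2_mod(p, g):
    """Remainder of p modulo g over GF(2)."""
    dg = g.bit_length() - 1
    while p.bit_length() - 1 >= dg:
        p ^= g << (p.bit_length() - 1 - dg)
    return p

def gf2_mul(a, b):
    """Carry-less (GF(2)) polynomial product."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

g7_4  = 0b1011            # x^3 + x + 1
g15_7 = 0b111010001       # x^8 + x^7 + x^6 + x^4 + 1
g1, g3 = 0b10011, 0b11111 # x^4+x+1 and x^4+x^3+x^2+x+1

assert gf2_mod((1 << 7) ^ 1, g7_4) == 0    # g(x) divides x^7 + 1
assert gf2_mod((1 << 15) ^ 1, g15_7) == 0  # g(x) divides x^15 + 1
assert gf2_mul(g1, g3) == g15_7            # g(x) = g1(x) g3(x)
```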

Reed-Solomon code

Reed-Solomon code

Symbol error correction code (the symbols are in general bytes). The degree of freedom in encoding is high.

Codeword: i(k-1) … i1 i0 p(m-1) … p1 p0

Code length: n ≤ 2^r − 1 (symbols)
Information symbol length: k = n − m (symbols)
Parity symbol length: m = 2t (symbols)
Minimum distance: dmin ≥ 2t + 1 (symbols)

r: number of bits comprising a symbol
t: number of symbols capable of error correction

Feature of roots of generator polynomial

Let α be a primitive element of GF(2^3), constructed from the primitive polynomial g(x) = x^3+x+1. α satisfies α^3+α+1 = 0, i.e. α^3 = 1+α.

Field of 8 elements generated by x^3+x+1:

Power form | Polynomial form | 3-tuple
0          | 0               | 000
1          | 1               | 100
α          | α               | 010
α^2        | α^2             | 001
α^3        | 1+α             | 110
α^4        | α+α^2           | 011
α^5        | 1+α+α^2         | 111
α^6        | 1+α^2           | 101
α^7        | 1               | 100 (= α^0)

Derivations: α^4 = α(1+α) = α+α^2; α^5 = α(α+α^2) = α^2+α^3 = 1+α+α^2; α^6 = α(1+α+α^2) = α+α^2+α^3 = 1+α^2; α^7 = α(1+α^2) = α+α^3 = α+(1+α) = 1.
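The whole table follows mechanically from α^3 = 1+α. A minimal sketch in Python that builds the power/tuple correspondence (elements as integers whose bit i is the coefficient of α^i; the slide's 3-tuples list the constant term first):

```python
PRIM = 0b1011  # x^3 + x + 1

# exp[i] = alpha^i as a bit-polynomial in alpha (bit 0 = constant term)
exp = [1]
for _ in range(7):
    v = exp[-1] << 1          # multiply by alpha
    if v & 0b1000:            # reduce using alpha^3 = alpha + 1
        v ^= PRIM
    exp.append(v)

def tuple3(v):
    """Render as the slide's 3-tuple: constant term first."""
    return f"{v & 1}{(v >> 1) & 1}{(v >> 2) & 1}"

for i in range(8):
    print(f"alpha^{i} = {tuple3(exp[i])}")
# alpha^0=100, alpha^1=010, alpha^2=001, alpha^3=110,
# alpha^4=011, alpha^5=111, alpha^6=101, alpha^7=100 (= 1 again)
```

Addition of 3-tuples is bitwise exclusive OR, and multiplication adds exponents modulo 7, which reproduces the addition and multiplication tables on the next two slides.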

Addition of roots for generator polynomial

In 3-tuple (vector) form, addition is bitwise exclusive OR, e.g. 110 + 100 = 010 (α^3 + 1 = α). In power form:

+    | 0    1    α    α^2  α^3  α^4  α^5  α^6
0    | 0    1    α    α^2  α^3  α^4  α^5  α^6
1    | 1    0    α^3  α^6  α    α^5  α^4  α^2
α    | α    α^3  0    α^4  1    α^2  α^6  α^5
α^2  | α^2  α^6  α^4  0    α^5  α    α^3  1
α^3  | α^3  α    1    α^5  0    α^6  α^2  α^4
α^4  | α^4  α^5  α^2  α    α^6  0    1    α^3
α^5  | α^5  α^4  α^6  α^3  α^2  1    0    α
α^6  | α^6  α^2  α^5  1    α^4  α^3  α    0

Multiplication of roots for generator polynomial

In power form, multiplication adds exponents modulo 7 (since α^7 = 1), and 0 times any element is 0:

×    | 0  1    α    α^2  α^3  α^4  α^5  α^6
0    | 0  0    0    0    0    0    0    0
1    | 0  1    α    α^2  α^3  α^4  α^5  α^6
α    | 0  α    α^2  α^3  α^4  α^5  α^6  1
α^2  | 0  α^2  α^3  α^4  α^5  α^6  1    α
α^3  | 0  α^3  α^4  α^5  α^6  1    α    α^2
α^4  | 0  α^4  α^5  α^6  1    α    α^2  α^3
α^5  | 0  α^5  α^6  1    α    α^2  α^3  α^4
α^6  | 0  α^6  1    α    α^2  α^3  α^4  α^5

RS code (5,3)

RS code (5,3) over GF(2^3) (r=3, 1 symbol = 3 bits): code length n=5, information symbol length k=3, parity symbol length m=2.

Information symbols: A B C; codeword: A B C P Q
α: a root of g(x) = x^3+x+1 = 0

Parity check equations:
A + B + C + P + Q = 0
αA + α^2B + α^3C + α^4P + α^5Q = 0

From the above equations, P and Q are as follows:
P = α^6A + α^3B + α^2C
Q = α^2A + αB + α^6C

Characteristics of error in RS code (5,3)

Define the syndromes for the received codeword A B C P Q:
S1 = A + B + C + P + Q
S2 = αA + α^2B + α^3C + α^4P + α^5Q

No error: S1 = 0 and S2 = 0.

One symbol error:
If symbol A is in error and becomes A+A':
S1 = (A+A') + B + C + P + Q = A'
S2 = α(A+A') + α^2B + α^3C + α^4P + α^5Q = αA'   ∴ S2 = αS1
If symbol B is in error and becomes B+B':
S1 = A + (B+B') + C + P + Q = B'
S2 = αA + α^2(B+B') + α^3C + α^4P + α^5Q = α^2B'   ∴ S2 = α^2S1

Characteristics of error in RS code (5,3)

One symbol error (continued):
If symbol C is in error and becomes C+C':
S1 = A + B + (C+C') + P + Q = C'
S2 = αA + α^2B + α^3(C+C') + α^4P + α^5Q = α^3C'   ∴ S2 = α^3S1

Symbol A → S2 = αS1
Symbol B → S2 = α^2S1
Symbol C → S2 = α^3S1
(likewise, symbol P → S2 = α^4S1 and symbol Q → S2 = α^5S1)

Detection of error location: from the equations above; S1 is the error pattern.
Error correction: add S1 to the symbol detected as erroneous.

Possible to correct one symbol error.

Example of RS code (5,3) encoding

Let A, B, C be as follows (1 symbol is 3 bits):
A = 100, B = 010, C = 001, i.e. A = 1, B = α, C = α^2

P = α^6A + α^3B + α^2C = α^6·1 + α^3·α + α^2·α^2 = α^6 + α^4 + α^4 = α^6 = 101
Q = α^2A + αB + α^6C = α^2·1 + α·α + α^6·α^2 = α^2 + α^2 + α^8 = α^8 = α = 010

Codeword: A B C P Q = 100 010 001 101 010
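The encoding above can be reproduced with the GF(2^3) tables built from x^3+x+1. A minimal sketch in Python (elements as integers, bit i = coefficient of α^i; the slide's 3-tuples print the constant term first):

```python
PRIM = 0b1011
EXP = [1]
for _ in range(7):
    v = EXP[-1] << 1
    if v & 0b1000:
        v ^= PRIM
    EXP.append(v)                 # EXP[i] = alpha^i, and EXP[7] == 1
LOG = {EXP[i]: i for i in range(7)}

def gmul(a, b):
    """Multiply two GF(8) elements (0 if either operand is 0)."""
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 7]

def rs53_encode(A, B, C):
    """Parity symbols: P = a^6 A + a^3 B + a^2 C, Q = a^2 A + a B + a^6 C."""
    P = gmul(EXP[6], A) ^ gmul(EXP[3], B) ^ gmul(EXP[2], C)
    Q = gmul(EXP[2], A) ^ gmul(EXP[1], B) ^ gmul(EXP[6], C)
    return P, Q

def tuple3(v):
    return f"{v & 1}{(v >> 1) & 1}{(v >> 2) & 1}"

A, B, C = 1, EXP[1], EXP[2]       # 100, 010, 001
P, Q = rs53_encode(A, B, C)
print(tuple3(P), tuple3(Q))       # 101 010, as on the slide
```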

Error correction example of RS code (5,3)

If symbol A is received in error: A = 100 → 011 (3 bits in error; 011 = α^4)

S1 = A + B + C + P + Q = α^4 + α + α^2 + α^6 + α = α^4 + α^2 + α^6 = α + α^6 = α^5 = 111
S2 = αA + α^2B + α^3C + α^4P + α^5Q = α·α^4 + α^2·α + α^3·α^2 + α^4·α^6 + α^5·α
   = α^5 + α^3 + α^5 + α^10 + α^6 = α^3 + α^3 + α^6 = α^6 = αS1

S2 = αS1, so the error location is symbol A:
A = 011 + S1 = 011 + 111 = 100
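The whole single-symbol correction procedure is easy to check in code. A minimal self-contained sketch in Python (same integer representation as before: bit i = coefficient of α^i):

```python
PRIM = 0b1011
EXP = [1]
for _ in range(7):
    v = EXP[-1] << 1
    if v & 0b1000:
        v ^= PRIM
    EXP.append(v)
LOG = {EXP[i]: i for i in range(7)}

def gmul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 7]

def rs53_correct(word):
    """Correct up to one symbol error in an RS(5,3) codeword [A, B, C, P, Q]."""
    S1, S2 = 0, 0
    for j, sym in enumerate(word):       # position j carries weight alpha^(j+1) in S2
        S1 ^= sym
        S2 ^= gmul(EXP[j + 1], sym)
    if S1 == 0 and S2 == 0:
        return word                      # no error
    loc = (LOG[S2] - LOG[S1]) % 7        # S2 = alpha^(j+1) S1, so loc = j + 1
    word = list(word)
    word[loc - 1] ^= S1                  # add the error pattern S1 back
    return word

# Slide example: codeword 100 010 001 101 010 with symbol A corrupted to 011
codeword = [1, 2, 4, 5, 2]               # A=1, B=alpha, C=alpha^2, P=alpha^6, Q=alpha
received = [6, 2, 4, 5, 2]               # A = 011 as an integer is 6
print(rs53_correct(received) == codeword)
```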

Examples of RS code

t-symbol error correction code (dmin = 2t+1); suitable for burst error correction.

RS(32,30) code, etc.: t=1
RS(64,60) code, etc.: t=2
RS(128,120) code, etc.: t=4
RS(204,188) code, etc.: t=8
RS(207,187) code, etc.: t=10

Convolutional code

Trellis diagram of convolutional code

Encoder: the input bit a1 feeds two 1-bit delay registers (a2, a3); mod-2 adders form the outputs c1 = a1+a2+a3 and c2 = a1+a3. Input example: 11010. The state is (a3 a2): state A:(0 0), B:(0 1), C:(1 0), D:(1 1), and each trellis branch is labeled c1c2/a1.

State (a3 a2) | input a1 = 0    | input a1 = 1
A (0 0)       | 00/0 → A (0 0)  | 11/1 → B (0 1)
B (0 1)       | 10/0 → C (1 0)  | 01/1 → D (1 1)
C (1 0)       | 11/0 → A (0 0)  | 00/1 → B (0 1)
D (1 1)       | 01/0 → C (1 0)  | 10/1 → D (1 1)
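The output equations can be turned directly into an encoder. A minimal sketch in Python (state = (a3, a2), assuming the registers start at zero):

```python
def conv_encode(bits):
    """Rate-1/2 encoder: c1 = a1+a2+a3, c2 = a1+a3 (mod 2)."""
    a2 = a3 = 0
    out = []
    for a1 in bits:
        out.append((a1 ^ a2 ^ a3, a1 ^ a3))   # (c1, c2)
        a2, a3 = a1, a2                       # shift the delay line
    return out

# Information message from the Viterbi example: 0 1 0 1 1 0 0 0
pairs = conv_encode([0, 1, 0, 1, 1, 0, 0, 0])
print(["%d%d" % p for p in pairs])
# ['00', '11', '10', '00', '01', '01', '11', '00'] -- the transmitted message
```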

Principle of Viterbi decoding

Whenever two railroad routes join, compare them and keep the one whose accumulated fare is cheaper. By examining the cheaper route at every junction, one finally finds the cheapest route overall.

(Diagram: a railway map with stations Shinjuku, Harajuku, Shibuya, Omotesando, Kasumigaseki, Shimokitazawa, Uehara and Seijo, annotated with per-leg fares and accumulated fares, e.g. 130/440 at Kasumigaseki and 180/620 at Seijo.)

Error correction by Viterbi decoding

Information message: 0 1 0 1 1 0 0 0
Transmitted message: 00 11 10 00 01 01 11 00
Received message:    10 11 11 00 01 01 11 00  (one bit error in the 1st pair and one in the 3rd)

For each received pair, the branch metric is the Hamming distance between the pair and the branch label c1c2; the path metric (shown in parentheses on the slide) accumulates the branch metrics along each path through the states A:(0 0), B:(0 1), C:(1 0), D:(1 1). Where two paths merge at a state, only the path with the smaller metric survives.
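The survivor selection above can be sketched as a hard-decision Viterbi decoder in Python (state s encodes (a3, a2) as 2·a3 + a2; branch metric = Hamming distance; decoding starts in state A):

```python
def step(state, bit):
    """One encoder transition: returns (output pair, next state)."""
    a3, a2 = state >> 1, state & 1
    c1, c2 = bit ^ a2 ^ a3, bit ^ a3
    return (c1, c2), ((a2 << 1) | bit)

def viterbi_decode(received):
    """received: list of (c1, c2) pairs; returns the maximum-likelihood info bits."""
    INF = float("inf")
    metric = [0, INF, INF, INF]           # start in state A = (0 0)
    paths = [[], [], [], []]
    for r in received:
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            for bit in (0, 1):
                out, ns = step(s, bit)
                m = metric[s] + (out[0] != r[0]) + (out[1] != r[1])
                if m < new_metric[ns]:    # keep the cheaper merging path
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [bit]
        metric, paths = new_metric, new_paths
    return paths[metric.index(min(metric))]

received = [(1, 0), (1, 1), (1, 1), (0, 0), (0, 1), (0, 1), (1, 1), (0, 0)]
print(viterbi_decode(received))   # [0, 1, 0, 1, 1, 0, 0, 0] -- both bit errors corrected
```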

Key point

Block codes:
Hamming code: single-bit error correction
BCH code: t-bit error correction
Reed-Solomon code: t-symbol error correction
Convolutional coding + Viterbi decoding

Application in broadcasting field

Applications of error correcting code

Usage                     | Device                                       | Type of error  | Feature                              | Example of code
Transmission: digital broadcasting | Receiver                             | Random & burst | Low error rate                       | Convolutional + RS
Computer: data recording  | Semiconductor memory                         | Random         | High speed & low error rate          | Hamming
Computer: data recording  | Magnetic/optical memory (HDD, tape, CD-ROM)  | (Random) burst | Low error rate                       | CRC, RS product
Recording: video/audio    | Magnetic/optical memory (VTR, CD, DVD)       | (Random) burst | Relatively higher error rate than data | RS product

RS: Reed-Solomon

Applications for broadcasting VTR

VTR            | Video system | Signal format | Tape width (inch) | Inner code  | Outer code
D1             | SD           | Component     | 3/4               | RS (64,60)  | RS (32,30)
D2             | SD           | Composite     | 3/4               | RS (93,85)  | RS (68,64)
D3             | SD           | Composite     | 1/2               | RS (95,87)  | RS (136,128)
D5             | SD, HD       | Component     | 1/2               | RS (95,87)  | RS (128,120)
βCAM (Betacam) | SD           | Component     | 1/2               | RS (178,164)| RS (106,96)
D6             | HD           | Component     | 3/4               | RS (227,211)| RS (254,240)
HD1            | HD           | Component     | 1                 | RS (110,104)| RS (65,60)

Applications for broadcasting

Broadcasting                                        | Inner code                                            | Outer code
Digital terrestrial television (DVB-T, ISDB-T)      | Convolutional code (r = 1/2, 2/3, 3/4, 5/6, 7/8)      | RS (204,188)
ATSC                                                | Trellis code (r = 2/3)                                | RS (207,187)
Digital satellite television (ISDB-S)               | Trellis code (r = 2/3); convolutional code (r = 1/2, 2/3, 3/4, 5/6, 7/8) | RS (204,188)
Digital communication-satellite television broadcasting | Convolutional code (r = 1/2, 2/3, 3/4, 5/6, 7/8)   | RS (204,188)

Application of error correction technique

Magnetic tape (RS code)
Magnetic disk (RS code)
Compact disc (RS code)
Digital audio tape (RS code)
Mobile phone (RS & convolutional coding)
Satellite communication (RS & convolutional coding)
Satellite broadcasting (RS & convolutional coding)
Teletext broadcasting (cyclic code)
Digital broadcasting (RS & convolutional coding)
Intercontinental optical communications (RS code)

Finally

Error correction is indispensable to digital techniques.

The importance of error correction technology keeps increasing because of the high compression used in source coding.

Informative reference

Reed-Solomon generator

(Diagram: systematic Reed-Solomon encoder for the ATSC RS(207,187) code — a 20-stage feedback shift register of byte registers X1…X20. A gate connects the input for the first k bytes and opens for the last N−k bytes; the feedback byte passes through fixed GF(256) multipliers with the generator coefficients 174, 165, 121, 121, …, 152 and is combined with the register contents through GF(256) adders. Each adder adds two field elements (bytes); each multiplier multiplies a field element by a fixed element.)

N = 207 encoded bytes, k = 187 data bytes, N − k = 20 parity bytes.

Primitive field generator polynomial (Galois): P(X) = X^8+X^4+X^3+X^2+1

Code generator polynomial:
G(X) = X^20+152X^19+185X^18+240X^17+5X^16+111X^15+99X^14+6X^13+220X^12+112X^11+150X^10+69X^9+36X^8+187X^7+22X^6+228X^5+198X^4+121X^3+121X^2+165X+174

Punctured code

Puncturing improves the coding rate: a rate-1/2 convolutional coder produces two output streams X and Y, and selected coded bits are deleted (punctured) before transmission, leaving the streams P1 and P0.

Mother code (r = 1/2): generators 171 octal (output X) and 133 octal (output Y); D: 1-bit delay, +: exclusive OR.

r = 4/6 = 2/3: from X: X1,X2,X3,X4 and Y: Y1,Y2,Y3,Y4, transmit P1 = X1,Y2,Y3 and P0 = Y1,X3,Y4 (6 of 8 coded bits sent)
r = 6/8 = 3/4: from X: X1,X2,X3 and Y: Y1,Y2,Y3, transmit P1 = X1,Y2 and P0 = Y1,X3 (4 of 6 coded bits sent)
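Puncturing itself is just periodic deletion of coded bits. A minimal sketch in Python of the rate-3/4 pattern above (the labels X1…, Y1… stand for the mother coder's two output streams; the split into P1/P0 follows the slide):

```python
def puncture_3_4(x, y):
    """Rate 3/4: of each (X1..X3, Y1..Y3) keep X1, Y1, Y2, X3 -- 4 of 6 coded bits."""
    p1, p0 = [], []
    for i in range(0, len(x), 3):
        p1 += [x[i], y[i + 1]]     # P1 = X1, Y2
        p0 += [y[i], x[i + 2]]     # P0 = Y1, X3
    return p1, p0

x = ["X1", "X2", "X3"]
y = ["Y1", "Y2", "Y3"]
print(puncture_3_4(x, y))   # (['X1', 'Y2'], ['Y1', 'X3'])
```

The receiver reinserts dummy values (erasures) at the punctured positions before Viterbi decoding, so a single mother decoder serves all the rates.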

Linear code and cyclic code

Within the block codes there are linear and non-linear codes; the cyclic codes are a subclass of the linear codes (non-cyclic linear codes also exist).

Linear code: if C1 and C2 are two arbitrary codewords, their sum C1+C2 is also a codeword.

Cyclic code: if a message (n bits in length) is expressed as a polynomial, a cyclic code consists of all codewords (code polynomials) that are divisible by the generator polynomial G(X). Equivalently, if Cn = (x1, x2, x3, …, xN) is a codeword, the cyclically shifted word Cm = (x2, x3, …, xN, x1) is also a codeword.

Example

Linear code: Hamming code (7,4) (information bits followed by parity bits):

0000 000   1000 101
0001 011   1001 110
0010 110   1010 011
0011 101   1011 000
0100 111   1100 010
0101 100   1101 001
0110 001   1110 100
0111 010   1111 111

Linear: 0001 011 (C1) + 0010 110 (C2) = 0011 101 (C1+C2 = C3)
Cyclic: 0111 010 (C7) shifted 1 bit to the left gives 1110 100 (C14)

Coding rate

Coding rate = number of bits for information (k) / number of bits for codeword (n), i.e. r = k/n for a block code (n,k).

Example: a (6,4) code with information bits 1011 and parity bits 01 has coding rate 4/6 = 2/3.

(Diagram: two codes compared — the one with the lower coding rate has the higher error correction capability.)

Appendix: Concatenated code

Example of concatenated code

DVB-T functional block diagram: MPEG-2 source coding & multiplexing (encoders 1…n) → mux adaptation / energy dispersal → outer coder → outer interleaver → inner coder → inner interleaver → frame mapper → OFDM frame adaptation (pilots & TPS signals added) → guard interval insertion → D/A and LPF → front end → to antenna. The chain from mux adaptation to the front end forms the terrestrial channel adapter.

Concept of interleaving

Interleaving

Interleaving is a way of rearranging data in order to increase performance against burst errors.

(Conceptual diagram of interleaving: data are written into an array in one direction and read out in the other, so a burst of channel errors is dispersed after deinterleaving.)
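The write-one-way/read-the-other idea is a block interleaver. A minimal sketch in Python (rows and columns here are arbitrary illustration values, not from the slide):

```python
def interleave(data, rows, cols):
    """Write row-wise into a rows x cols array, read column-wise."""
    assert len(data) == rows * cols
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(data, rows, cols):
    """Inverse: write column-wise, read row-wise."""
    assert len(data) == rows * cols
    out = [None] * len(data)
    for i, (c, r) in enumerate((c, r) for c in range(cols) for r in range(rows)):
        out[r * cols + c] = data[i]
    return out

msg = list(range(12))
tx = interleave(msg, rows=3, cols=4)
rx = tx[:]
for i in (4, 5, 6):          # a burst of 3 consecutive channel errors...
    rx[i] = "E"
print(deinterleave(rx, rows=3, cols=4))
# ...is spread out after deinterleaving: at most one error per row of 4
```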

Example of concatenated code (Turbo code)

Turbo encoder: the information sequence is encoded by one RSC directly and by a second RSC after passing through an interleaver (RSC: recursive systematic convolutional encoder). The channel is modeled as an additive white Gaussian noise (AWGN) channel. Turbo decoder: two soft-in/soft-out decoders (DEC1, DEC2) exchange extrinsic information through the interleaver and deinterleaver, iterating until the decision is made.

Example of concatenated code (Product code)

Random & burst errors are corrected effectively. The k1 × k2 information block is arranged as an array: each row (length n1) is a codeword of C1:(n1,k1) with its parity of C1, and each column (length n2) is a codeword of C2:(n2,k2) with its parity of C2.

Example of product code by RS code

(Array: information symbols Bij with row parities Pik and column parities Qnj.)

Suppose row 1 contains four error symbols B11, B12, B13, B14 and row 2 contains one error symbol B21:
(1) Row 1 cannot be corrected by the row code alone (too many errors).
(2) B21 is corrected by the row code of row 2.
(3)–(6) Column decoding then corrects B11, B12, B13 and B14 in turn, one column at a time.

Appendix: Cyclic coder

Cyclic code by shift register

(Slides (1)–(8): a dividing circuit for g(x) = x^3+x+1 built from three 1-bit shift registers with exclusive-OR feedback at the +1 and +x taps; each slide shows the register contents after one more input bit has been shifted in.)

Input: f(x) = x^6+x^5+x^3+x ("1101010"). The bits shifted out form the quotient q(x); after the last input bit, the registers hold the remainder R(x).

Binary long division:
1101010 ÷ 1011 = 1111, remainder 011

Polynomial long division of x^6+x^5+x^3+x by x^3+x+1:
x^6+x^5+x^3+x − x^3(x^3+x+1) = x^5+x^4+x
x^5+x^4+x − x^2(x^3+x+1) = x^4+x^3+x^2+x
x^4+x^3+x^2+x − x(x^3+x+1) = x^3
x^3 − 1·(x^3+x+1) = x+1
so q(x) = x^3+x^2+x+1 and R(x) = x+1.

Cyclic codeword: f(x) + R(x) = (x^6+x^5+x^3+x) + (x+1) = x^6+x^5+x^3+1, which is divisible by g(x).
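The register stepping in slides (1)–(8) can be simulated directly. A minimal sketch in Python of the dividing circuit for g(x) = x^3+x+1 (the register holds the running remainder; the bit leaving the register at each shift is a quotient bit):

```python
def lfsr_divide(bits, g=0b1011):
    """Shift-register division of a bit sequence (MSB first) by g(x) over GF(2).

    Returns (quotient bits, remainder register). g = 0b1011 is x^3 + x + 1."""
    deg = g.bit_length() - 1
    low = g & ((1 << deg) - 1)       # feedback taps below x^deg (here: x + 1)
    reg = 0
    q = []
    for b in bits:
        fb = (reg >> (deg - 1)) & 1  # bit about to leave the register
        reg = ((reg << 1) | b) & ((1 << deg) - 1)
        if fb:
            reg ^= low               # subtract g(x) (XOR) whenever x^deg appears
        q.append(fb)
    return q, reg

q, r = lfsr_divide([1, 1, 0, 1, 0, 1, 0])   # f(x) = x^6 + x^5 + x^3 + x
print(q, format(r, "03b"))                   # quotient ...1111, remainder 011
```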

Convolutional coder & decoder

Trellis diagram

Encoder (as before): input a1, 1-bit delay registers a2 and a3; outputs c1 = a1+a2+a3 and c2 = a1+a3; state (a3 a2) with A:(0 0), B:(0 1), C:(1 0), D:(1 1); branches labeled c1c2/a1.

Slides (1)–(11) step the input bits 0, 1, 0, 1, 1 (the register diagram shows the remaining input "11010") through the encoder, one bit per pair of slides:

Step | Input a1 | State before | Output c1c2 | State after
1    | 0        | A (0 0)      | 00          | A (0 0)
2    | 1        | A (0 0)      | 11          | B (0 1)
3    | 0        | B (0 1)      | 10          | C (1 0)
4    | 1        | C (1 0)      | 00          | B (0 1)
5    | 1        | B (0 1)      | 01          | D (1 1)

The output sequence 00 11 10 00 01 is the beginning of the transmitted message used in the Viterbi decoding example.

Error correction by Viterbi decoding

Information message: 0 1 0 1 1 0 0 0
Transmitted message: 00 11 10 00 01 01 11 00
Received message:    10 11 11 00 01 01 11 00

(Slides: the trellis is expanded stage by stage. At each stage the branch metric — the Hamming distance between the received pair and the branch label c1c2 — is added to each path metric, shown in parentheses; where two paths merge at a state, only the path with the smaller metric survives.)

After the last stage the surviving path has metric 2, corresponding to the two channel bit errors. Tracing it back gives the corrected message 00 11 10 00 01 01 11 00, i.e. the information message 0 1 0 1 1 0 0 0.

Appendix: Concept of error correction

Relationship between error correction, detection and minimum Hamming distance

For codewords Cn = (x1, x2, …, xn), if dmin ≥ 2t* + s* + 1:
- errors of up to t* bits can be corrected
- errors of up to (t* + s*) bits can be detected

(Diagram: codewords C1, C2, C3 at mutual distance at least dmin; the sphere of radius t* around each codeword is its corrected area, the surrounding shell out to radius t* + s* is the detected area; points in between are invalid codewords.)
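The bound can be checked numerically. For instance, the Hamming (7,4) code from the earlier example has dmin = 3, so with t* = 1 and s* = 0 it corrects single-bit errors. A minimal sketch in Python:

```python
from itertools import combinations

# Hamming (7,4) codewords: information bits followed by parity
# (basis rows read off the earlier codeword table)
ROWS = [0b1000101, 0b0100111, 0b0010110, 0b0001011]
code = []
for m in range(16):
    c = 0
    for i, row in enumerate(ROWS):
        if m >> (3 - i) & 1:
            c ^= row
    code.append(c)

# Minimum Hamming distance over all codeword pairs
dmin = min(bin(a ^ b).count("1") for a, b in combinations(code, 2))
print(dmin)   # 3, so dmin >= 2t* + s* + 1 holds with t* = 1, s* = 0
```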

Example of error correction and detection (dmin = 5)

t* = 2, s* = 0: errors of up to 2 bits can be corrected (and thereby detected).
t* = 1, s* = 2: 1-bit errors can be corrected and errors of up to 3 bits detected.
t* = 0, s* = 4: no correction is possible; errors of up to 4 bits can be detected.

(Diagram: the error-corrected and error-detected areas around codewords C1 and C2 for each choice of t* and s*.)
