
[Block diagram: Digital source → Source encoder → Channel encoder → Modulator; the source encoder maps the sequence of source symbols into a binary sequence.]

Source coding reduces redundancy and hence reduces the bandwidth requirement.

Source coding requires knowledge of the statistics of the source. If some source symbols are more probable than others, shorter codewords should be assigned to these symbols.

D.1

Variable-length code (e.g. Morse code): used when symbols occur with unequal probability (e.g. the alphabet in the English language). Code according to the probability of occurrence of the source symbols:
1) Frequent symbols: assign a short codeword.
2) Rare symbols: assign a long codeword.

Fixed block-length code: used when symbols are equally probable and/or statistically independent, e.g. 4 symbols A, B, C, D:
A → 00
B → 01
C → 10
D → 11

D.2

ENTROPY CODING

Shannon’s source coding theorem

Given a discrete memoryless source of entropy H(X), the average codeword length Lav for any distortionless source coding scheme is bounded as

Lav ≥ H(X)

The entropy H(X) represents a fundamental limit on the average number of bits per source symbol necessary to represent a discrete memoryless source, in that Lav can be made as small as, but no smaller than, the source entropy.

D.3
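The bound is easy to check numerically. Below is a short sketch (not part of the original notes) using a four-symbol source with probabilities 1/2, 1/4, 1/8, 1/8 (the source of Example 1 later in these notes) and a matched prefix code A→0, B→10, C→110, D→111 — an assumed code chosen so that every codeword length equals −log2 of its probability:

```python
import math

# Discrete memoryless source: the four-symbol source of Example 1.
probs = {"A": 1/2, "B": 1/4, "C": 1/8, "D": 1/8}

# Entropy H(X) = -sum(p * log2 p), in bits per symbol.
H = -sum(p * math.log2(p) for p in probs.values())

# Codeword lengths of a matched prefix code (A->0, B->10, C->110, D->111).
lengths = {"A": 1, "B": 2, "C": 3, "D": 3}
Lav = sum(probs[s] * lengths[s] for s in probs)

print(H)    # 1.75
print(Lav)  # 1.75 -> Lav >= H(X) holds with equality for this dyadic source
```

For dyadic probabilities (powers of 1/2) the bound is met exactly; for other distributions Lav is strictly larger than H(X).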

Source Efficiency

η = H(X)/Lav ≤ 1

At the receiver,

[Block diagram: Channel decoder → Binary decoder → source symbol sequence; the mapping from binary sequence back to symbols is one to one.]

D.4

Example 1

A discrete memoryless source (DMS) emits symbols A, B, C, D with

P(A) = 1/2
P(B) = 1/4
P(C) = 1/8
P(D) = 1/8

The symbols are coded in 3 different ways using variable-length codes as below.

D.5

Example 1

Symbol  Probability of occurrence  Code I  Code II  Code III
A       1/2                        1       0        0
B       1/4                        00      10       01
C       1/8                        01      110      011
D       1/8                        10      111      0111

D.6

Code I

There is a problem in this decoding process. Consider the received sequence 001001…: is this B,A,B,A,… or B,D,C,…? The ambiguity could be removed by inserting additional bits, but then the decoding is not instantaneous, which is not desirable in practice.

D.7

Code II

Uniquely decodable and instantaneous.

[Tree diagram for Code II: from the initial state, each '0' branch terminates a codeword and each '1' branch continues: A = 0, B = 10, C = 110, D = 111.]

D.8

Code III

Uniquely decodable but not instantaneously decodable.

[Tree diagram for Code III: A = 0, B = 01, C = 011, D = 0111; each codeword is a prefix of the longer ones, so the decoder cannot identify a codeword until it sees the start of the next one.]

D.9

Instantaneously decodable

Question: How can we have codes that are instantaneously decodable?

Answer: No codeword in the code is a prefix of another codeword.

A codeword C_l = [b_1 b_2 b_3 ⋯ b_l] is a prefix of a codeword C_k, l < k, if the first l elements of C_k are [b_1 b_2 b_3 ⋯ b_l]. The code is instantaneously decodable if there is no codeword C_k sharing its first l elements with another codeword C_l = [b_1 b_2 b_3 ⋯ b_l].

D.10
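The prefix condition is simple to test in code. A minimal sketch, assuming the three codeword sets of Example 1 (codewords listed in the order A, B, C, D):

```python
# Test whether a set of codewords forms a prefix code,
# i.e. no codeword is a prefix of another.
def is_prefix_code(codewords):
    for c in codewords:
        for d in codewords:
            if c != d and d.startswith(c):
                return False
    return True

code_I   = ["1", "00", "01", "10"]
code_II  = ["0", "10", "110", "111"]
code_III = ["0", "01", "011", "0111"]

print(is_prefix_code(code_I))    # False: "1" is a prefix of "10"
print(is_prefix_code(code_II))   # True  -> instantaneously decodable
print(is_prefix_code(code_III))  # False -> not instantaneously decodable
```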

Kraft inequality

A prefix code always satisfies the Kraft inequality

∑_{k=1}^{L} 2^(−n_k) ≤ 1

where n_k is the length of the k-th codeword and L is the number of codewords.

The converse does not hold: satisfying the Kraft inequality does not guarantee that a given code is a prefix code. Rather, it is merely a condition on the codeword lengths of the code and not on the codewords themselves.

D.11

Example

Code I violates the Kraft inequality (1/2 + 3 × 1/4 = 5/4 > 1); it cannot be a prefix code. Code II satisfies the inequality with equality and is a prefix code. Code III also satisfies the inequality (1/2 + 1/4 + 1/8 + 1/16 = 15/16 ≤ 1) yet is not a prefix code, illustrating that the inequality constrains only the codeword lengths.

D.12
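The Kraft sums for the three codes are easy to compute. A minimal sketch, assuming codeword lengths 1,2,2,2 (Code I), 1,2,3,3 (Code II), and 1,2,3,4 (Code III):

```python
# Kraft sum: sum of 2^(-n_k) over the codeword lengths n_k.
def kraft_sum(lengths):
    return sum(2 ** -n for n in lengths)

print(kraft_sum([1, 2, 2, 2]))  # Code I:   1.25   -> violates, cannot be prefix
print(kraft_sum([1, 2, 3, 3]))  # Code II:  1.0    -> satisfies with equality
print(kraft_sum([1, 2, 3, 4]))  # Code III: 0.9375 -> satisfies, yet not prefix
```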

Average codeword length

For a discrete memoryless source, a prefix code can always be found with

H(X) ≤ Lav < H(X) + 1

Applying this to the n-th extension of the source (blocks of n symbols),

H(X^n) ≤ L^n_av < H(X^n) + 1
⇒ nH(X) ≤ L^n_av < nH(X) + 1
⇒ H(X) ≤ (1/n) L^n_av < H(X) + 1/n

Hence

lim_{n→∞} (1/n) L^n_av = H(X)

i.e. by encoding longer and longer blocks, the average codeword length per source symbol of a prefix code can be made as small as the entropy of the source. However, the price we have to pay for decreasing the average codeword length is increased decoding complexity.

D.13

Shannon-Fano Encoding

Shannon-Fano encoding is a variable-length encoding scheme proposed by Shannon and Fano; the algorithm is simply stated below.

For a source with M possible messages m_i of probability P(m_i) and N symbols per message, a unique binary codeword C_i of length l_i for message i can be generated such that the average number of bits per symbol L^N_av is close to G_N, where

G_N = −(1/N) ∑_i P(m_i) log2 P(m_i)

as before.

D.14

In other words,

Lav = (1/N) ∑_{i=1}^{M} l_i P_i → G_N

[Encoding diagram: each of the M possible messages of N symbols is mapped to its codeword:
message 1: X X X…X → C_1 (length l_1)
message 2: X X X…X → C_2 (length l_2)
⋮
message M: X X X…X → C_M (length l_M)]

Each message has a unique codeword and can be decoded uniquely.

D.15

The average codeword length is bounded by

G_N ≤ Lav < G_N + 1/N

and the code efficiency is

η = H / Lav,   H: source entropy

D.16

Coding: Step 1

Arrange the M messages in order of decreasing probability and let

F_i = ∑_{k=1}^{i−1} P_k, with F_1 = 0

i.e.

P_1 → F_1
P_2 → F_2
⋮
P_M → F_M

D.17

Step 2

Calculate the number of bits l_i required for message i by the following equation:

−log2 P_i ≤ l_i < 1 − log2 P_i

For example, for P_i = 1/5, −log2(1/5) ≈ 2.32, ∴ l_i = 3.

D.18

Step 3-4

Convert F_i into a binary fraction. The binary fraction satisfies the following equality:

0.b_1 b_2 b_3 ⋯ b_k = b_1/2 + b_2/2^2 + … + b_k/2^k

For example,

27/128 = 0/2 + 0/4 + 1/8 + 1/16 + 0/32 + 1/64 + 1/128
       = 0/2 + 0/2^2 + 1/2^3 + 1/2^4 + 0/2^5 + 1/2^6 + 1/2^7
       → 0011011

Step 4

The codeword for message i is the truncated version of the binary fraction F_i and has length l_i.

D.19
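Steps 2-4 can be sketched in a few lines of Python. The probability and cumulative fraction below are taken from the 27/128 example (the second message of Example 2, whose F_i equals the probability 27/128 of the message before it):

```python
import math

# Expand a fraction f in [0, 1) into its first `bits` binary digits.
def to_binary_fraction(f, bits):
    out = []
    for _ in range(bits):
        f *= 2
        out.append(str(int(f)))
        f -= int(f)
    return "".join(out)

P_i = 27/128  # message probability
F_i = 27/128  # cumulative probability of the messages before it
l_i = math.ceil(-math.log2(P_i))  # Step 2: -log2(27/128) ≈ 2.245 -> l_i = 3

print(to_binary_fraction(F_i, 7))    # 0011011 (full 7-bit expansion, Step 3)
print(to_binary_fraction(F_i, l_i))  # 001     (truncated codeword, Step 4)
```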

Example 2

i  Message  P_i     F_i     l_i  Binary F_i  Codeword
1  AAA      27/128  0       3    0000000     000
2  BBB      27/128  27/128  3    0011011     001
3  CAA      9/128   54/128  4    0110110     0110

D.20

Example 3

Consider a Markov source encoder with N = 2 symbols per message. The encoding operation is

i  Symbols  P_i   l_i  F_i    Binary F_i  Codeword
1  AA       9/32  2    0      .00000      00
2  BB       9/32  2    9/32   .01001      01
3  AC       3/32  4    18/32  .10010      1001
4  CB       3/32  4    21/32  .10101      1010
5  BC       3/32  4    24/32  .11000      1100
6  CA       3/32  4    27/32  .11011      1101
7  CC       2/32  4    30/32  .11110      1111

D.21

Example 3

H_N = 1.44 bits/symbol

The message sequence CA,AA,BB,BC is encoded by the source encoder into 1101,00,01,1100.

The decoder works by matching the incoming data 1101 00 01 1100 with the codewords having the least number of bits until there is no match. The incoming data is then matched with the codewords having the second smallest number of bits, and so on; e.g. 00 immediately matches AA, while 11 produces no 2-bit match and 1101 is then matched as CA.

D.22
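A sketch of this decoding procedure, assuming the Example 3 codebook (00, 01, 1001, 1010, 1100, 1101, 1111). Since the code is a prefix code, scanning bit by bit and emitting a message on the first match is sufficient:

```python
# Shannon-Fano codebook of Example 3 (codeword -> message).
codebook = {"00": "AA", "01": "BB", "1001": "AC", "1010": "CB",
            "1100": "BC", "1101": "CA", "1111": "CC"}

def decode(bits):
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in codebook:  # shortest match first, as described in the text
            out.append(codebook[buf])
            buf = ""
    return out

print(decode("110100011100"))  # ['CA', 'AA', 'BB', 'BC']
```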

Huffman encoding

Huffman encoding is a variable-length scheme for the encoding of symbols:

- Arrange the source symbols in descending order of probability.
- Create a new source with one less symbol by combining (adding) the two symbols having the lowest probabilities.
- Repeat steps 1 and 2 until a single-symbol source is achieved.
- Associate a '1' and a '0' with each pair of probabilities so combined.
- Encode each original source symbol into the binary sequence generated by the various combinations, with the first combination as the least significant digit in the sequence.

D.23

Example 4

Source symbols A, B, C, D, E with probabilities P(A) = 0.4, P(B) = 0.2, P(C) = 0.2, P(D) = 0.1, P(E) = 0.1.

[Combination diagram: D (0.1) and E (0.1) combine into 0.2; that 0.2 combines with C (0.2) into 0.4; that 0.4 combines with B (0.2) into 0.6; finally 0.6 and A (0.4) combine into 1.0, with a '0' and a '1' assigned at each combination.]

C encodes into 000, D encodes into 0010, E encodes into 0011 (and, from the same tree, A into 1 and B into 01).

D.24

Example 4

The code is uniquely decodable; i.e. a stream of encoded symbols can be decoded without the need for synchronization or framing pulses between the binary sequences representing each symbol. The tree diagram confirms this property, since no encoded sequence is a prefix of any other valid sequence.

Example: the encoded sequence for symbol C is not a prefix of any other valid sequence.

D.25

Example 4

[Tree diagram: from the initial state, branch '1' gives A; '0' then '1' gives B; '0','0','0' gives C; '0','0','1','0' gives D; '0','0','1','1' gives E.]

Source entropy:

H(X) = −∑_{i=1}^{m} P_i log2(P_i)
     = −0.4 log2 0.4 − 2[0.2 log2 0.2] − 2[0.1 log2 0.1]
     = 2.12 bits/symbol

D.26

Example 4

If P(A) = P(B) = P(C) = P(D) = P(E), H(X) = H(X)_max = log2 5 = 2.322 bits/symbol.

Source efficiency before encoding = H(X)/H(X)_max = 2.12/2.322 = 0.913

D.27

Example 4

The average number Lav of digits used to encode each source symbol after encoding is

Lav = 0.4(1) + 0.2(2) + 0.2(3) + 0.1(4) + 0.1(4) = 2.2 bits/symbol

Source efficiency after encoding is η = H(X)/Lav = 2.12/2.2 = 0.964.

Let M be the number of binary digits required to represent the source without any source coding; then Lav/M is the compression ratio CR. In our example, M = 3 and CR = 2.2/3 = 0.73.

CR indicates the proportion by which the bandwidth required to transmit the source over a binary channel can be reduced.

D.28
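The construction above can be sketched with a small heap-based routine. Ties in the combining step mean the codeword assignment is not unique, so the sketch checks the average length (which is 2.2 bits/symbol for every optimal assignment) rather than the exact codewords:

```python
import heapq
import itertools

# Huffman construction: repeatedly combine the two lowest-probability
# entries, prepending '0'/'1' to the codewords in each combined group.
def huffman(probs):
    counter = itertools.count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two lowest probabilities
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

probs = {"A": 0.4, "B": 0.2, "C": 0.2, "D": 0.1, "E": 0.1}
code = huffman(probs)
Lav = sum(probs[s] * len(code[s]) for s in probs)
print(round(Lav, 2))  # 2.2 bits/symbol, matching the text
```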

Run-length Coding

The run-length coding technique is particularly suitable for binary sources where one of the symbols (the '1', say) occurs very much less often than the other, so that there are long runs of successive '0's (e.g. a scanned and digitized line drawing). In this case, it is more efficient to encode by counting the number of consecutive '0's between '1's.

D.29

Example 5

The source encoder maps runs to 3-bit codewords:

Run       Codeword
1         000
01        001
001       010
0001      011
00001     100
000001    101
0000001   110
0000000   111

The source sequence 001,0000000,1,1,0000001 is encoded into the 15-digit sequence 010,111,000,000,110.

D.30

Example 5

The compression in this particular case is 15/20 = 0.75.

There are schemes where both the input block length and the output block length are variable (e.g. FAX).

D.31
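The encoder of Example 5 can be sketched as below: each run of up to six '0's terminated by a '1' gets a 3-bit count codeword, and a run of seven '0's with no terminating '1' gets the reserved codeword 111 (runs shorter than seven '0's at the very end of the stream are left unhandled in this sketch):

```python
# Run-length encoder using the 3-bit codeword table of Example 5.
def rle_encode(bits):
    out, zeros = [], 0
    for b in bits:
        if b == "0":
            zeros += 1
            if zeros == 7:                    # 0000000 -> 111, run continues
                out.append("111")
                zeros = 0
        else:                                 # a '1' terminates the run
            out.append(format(zeros, "03b"))  # zero count as 3 binary digits
            zeros = 0
    return "".join(out)

src = "001" + "0000000" + "1" + "1" + "0000001"
print(rle_encode(src))  # 010111000000110 (the 15-digit sequence)
```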

Lempel-Ziv Coding

In practice, source statistics are not always known a priori. In such cases a universal scheme such as the Lempel-Ziv algorithm can be used. It encodes by parsing the source data stream into segments that are the shortest subsequences not encountered previously.

D.32

Example 6

Consider 000101 110010 100101…

It is assumed that the binary symbols 0 and 1 are already stored, in that order, in the code book. We can write

Subsequences stored: 0, 1
Data to be parsed: 000101 110010 100101…

With 0 and 1 already stored, the shortest subsequence of the data stream encountered for the first time and not seen before is 00, so we write

Subsequences stored: 0, 1, 00
Data to be parsed: 0101 110010 100101…

D.33

Example 6

Similarly, we have

Subsequences stored: 0, 1, 00, 01
Data to be parsed: 01 110010 100101…

Subsequences stored: 0, 1, 00, 01, 011
Data to be parsed: 10010 100101…

Continuing in this way until the stream has been completely parsed, we get the code book as follows:

D.34

Example 6

Numerical position  Subsequences  Numerical representation  Binary encoded blocks
1                   0
2                   1
3                   00            11                        0010
4                   01            12                        0011
5                   011           42                        1001
6                   10            21                        0100
7                   010           41                        1000
8                   100           61                        1100
9                   101           62                        1101

D.35

Lempel-Ziv coding

The Lempel-Ziv algorithm uses fixed-length codes to represent a variable number of source symbols; this feature makes the Lempel-Ziv code suitable for synchronous transmission. In practice, a fixed block length of 12 bits is typically used, which implies a code book of 4096 entries.

D.36
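The parsing rule of Example 6 can be sketched directly; '0' and '1' are pre-stored in the code book, as in the text:

```python
# Lempel-Ziv parsing: scan the stream, emit the shortest subsequence not
# seen before, and add it to the code book.
def lz_parse(stream):
    book = ["0", "1"]  # pre-stored single symbols
    buf = ""
    for b in stream:
        buf += b
        if buf not in book:   # shortest new subsequence found
            book.append(buf)
            buf = ""
    return book

print(lz_parse("000101110010100101"))
# ['0', '1', '00', '01', '011', '10', '010', '100', '101']
```

The result reproduces the subsequence column of the Example 6 code book.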
