
EE302 Lesson 22

Sources: (1) Course materials developed by CDR Hewitt Hymas, USN
(2) Frenzel, Principles of Electronic Communication Systems, 3rd ed., McGraw-Hill, 2008


Objectives:

a. Be familiar with binary digital codes such as the ASCII code and state how they are used to represent alphanumeric symbols.
b. Calculate the data rate for a serial transmission.
c. State the difference between a bit and a symbol.
d. Explain the difference between bit rate and baud rate.
e. Explain the difference between asynchronous and synchronous data transmission.
f. State the relationship between communication channel bandwidth and data rate on noise-free channels (Hartley's Law) and on channels with noise (the Shannon-Hartley Theorem).
g. Calculate channel capacity using Hartley's Law and the Shannon-Hartley Theorem.


Digital Codes We know binary codes can be used to represent numbers. Now we will consider the representation
of alphanumeric symbols. Alphanumeric symbols consist of both letters and numbers and often other symbols
(such as punctuation marks and mathematical symbols).

The earliest digital code was the Morse Code, which used dots and dashes to represent 1s and 0s.


Binary representation of alphanumeric symbols (letters, numbers, punctuation, etc.) is given by the American Standard Code for Information Interchange (ASCII). Each ASCII code word is 7 bits long, allowing 2^7 = 128 possible alphanumeric characters to be represented. ASCII has remained the international standard in data communications.
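As a quick check, a minimal Python sketch (the built-in ord() returns a character's ASCII value):

```python
# Print the 7-bit ASCII code word for a few characters.
for ch in "M5?":
    code = ord(ch)                         # integer ASCII value
    print(ch, code, format(code, "07b"))   # e.g., M 77 1001101
```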

Binary Coding An n-bit binary code can represent 2^n possible symbols. So, for instance, to represent the four items { red, green, blue, white } we would need 2 bits.
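In general, the number of bits needed for M symbols is the smallest n with 2^n >= M. A short Python sketch of that rule (the helper name bits_needed is ours):

```python
import math

def bits_needed(num_symbols):
    """Smallest n such that 2**n >= num_symbols."""
    return math.ceil(math.log2(num_symbols))

print(bits_needed(4))    # 2 bits for {red, green, blue, white}
print(bits_needed(128))  # 7 bits, the ASCII code word length
```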

Serial Transmission To transmit the ASCII letter M ( 1001101 ), we send it one bit at a time, starting with the
LSB and ending with the MSB.






If the time required to transmit 1 bit is 10 ms, then it would require 70 ms to transmit all 7 bits.

The speed of data transfer is specified in bits per second (bps). The time required to transmit 1 bit (the bit time, t) is the inverse of the bit rate. For example, if the bit rate is 9600 bps, the time required to transmit one bit is:

t = 1 / 9600 bps = 104.17 µs per bit
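The same arithmetic in a brief Python sketch, using the numbers from the two examples above:

```python
bit_rate = 9600                          # bits per second
bit_time = 1 / bit_rate                  # seconds per bit
print(f"{bit_time * 1e6:.2f} us/bit")    # 104.17 us/bit

# The 7-bit character above, at 10 ms per bit:
print(f"{7 * 10e-3 * 1e3:.0f} ms")       # 70 ms
```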


BAUD Rate Baud rate specifies the number of symbols per second transmitted. A symbol is a state of the transmitter and can represent multiple bits of information. Consider the 4-level signal below: the transmitter can be in one of four states: 0 V, 1 V, 2 V, or 3 V.



Baud rate = 1 / symbol period = 1 / 1 ms = 1000 symbols/sec

The bit rate in this case does not equal the baud rate! In the above example, each symbol represents 4 possible states, thus:

bits per symbol = log2(S) = log2(4) = 2 bits/symbol

where S = number of possible states per symbol.

The baud rate in the example above is 1000 symbols/sec, thus the bit rate is:

bit rate = baud rate × bits per symbol = 1000 × log2(4) = 2000 bps
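A short Python sketch tying the two relationships together (symbol period and level count taken from the example above):

```python
import math

symbol_period = 1e-3                      # 1 ms per symbol (from the example)
levels = 4                                # S = 4 states per symbol

baud_rate = 1 / symbol_period             # 1000 symbols/sec
bits_per_symbol = math.log2(levels)       # 2 bits/symbol
bit_rate = baud_rate * bits_per_symbol    # 2000 bps
print(baud_rate, bits_per_symbol, bit_rate)   # 1000.0 2.0 2000.0
```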

Asynchronous Transmission In asynchronous transmission each data word is accompanied by start and stop
bits that indicate the beginning and ending of the word.

When no information is being transmitted, the communication line is usually high, or binary 1 (a binary 1 is also called a mark).

The start bit, a binary 0 (called a space), signals the beginning of a word.





Most low-speed digital transmission (the 1200- to 56,000-bps range) is asynchronous. Asynchronous transmissions are extremely reliable. However, the overhead (extra bits) associated with the start and stop bits represents 2 of every 9 bits transmitted (22%), which reduces efficiency.
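The overhead figure follows directly from the frame layout: 7 data bits plus 1 start and 1 stop bit. A quick check:

```python
data_bits = 7                     # one ASCII character
framing_bits = 2                  # 1 start bit + 1 stop bit

total_bits = data_bits + framing_bits
print(f"overhead:   {framing_bits / total_bits:.1%}")   # 22.2%
print(f"efficiency: {data_bits / total_bits:.1%}")      # 77.8%
```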

Synchronous Transmission In synchronous transmission each data word is sent one after another without
start/stop bits. Synchronization is maintained between transmitter and receiver using a group of synchronization
bits placed at the beginning/end of the block. Each block of data can represent hundreds or even thousands of
1-byte characters.
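Because the synchronization bits are shared by an entire block, the per-bit overhead is much lower than with start/stop framing. A sketch with hypothetical sizes (the 512-byte block and 6 sync/framing bytes are illustrative, not from the text):

```python
# Hypothetical synchronous frame: a 512-byte data block carried with
# 6 bytes of synchronization/framing.
data_bytes = 512
sync_bytes = 6

efficiency = data_bytes / (data_bytes + sync_bytes)
print(f"{efficiency:.1%}")        # ~98.8%, versus ~77.8% asynchronous
```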






Transmission Efficiency Transmission efficiency is the accuracy and speed with which information, whether it
is voice or video, analog or digital, is sent and received over communication media.

Information and Bandwidth The amount of information that can be transmitted on a channel is a function of the
bandwidth of the channel. The greater the bandwidth of a channel, the greater the amount of information that can
be transmitted in a given time. Some example bandwidths required for transmission:

3 kHz for voice transmissions
15-20 kHz for high-fidelity music
6 MHz for analog television

Is it possible to transmit the same amount of information over a channel with a narrower bandwidth?

Yes, but it must be done over a longer period of time.

The higher the bit rate, the wider the bandwidth needed to pass the signal with minimum distortion. Think of bandwidth like a water pipe: the wider the pipe, the more water can be moved.

Channel capacity (C) refers to the information capacity of a channel measured in bits per second.
Transmission Media and Bandwidth The two most common types of media used in data communication are wire cable and radio. The two types of wire cable used are:

Coaxial cable: usable bandwidth 200 MHz - 3 GHz, depending on the size.
Twisted-pair cable: usable bandwidth 2 kHz - 100 MHz.

All wire cables act as low-pass filters, attenuating the high frequency components of the rectangular signal.


Hartley's Law The channel capacity C expressed in bits per second using binary signaling is twice the channel bandwidth B in Hz:

C = 2B

This capacity assumes noise-free transmission. B is usually the upper cutoff frequency (the -3 dB point) of the channel. For example, the maximum capacity of a 10-kHz channel is:

C = 2B = 2(10,000) = 20,000 bps
Channel capacity can be increased by using multiple-level encoding schemes to raise the number of bits per symbol. The channel capacity for N levels per symbol is given by:

C = 2B log2(N)

Thus, if B = 10 kHz and we are using a 4-level code, the maximum channel capacity is:

C = 2B log2(4) = 2(10,000)(2) = 40,000 bps
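Both noise-free cases in a short Python sketch (the helper name hartley_capacity is ours):

```python
import math

def hartley_capacity(bandwidth_hz, levels=2):
    """Noise-free channel capacity: C = 2 * B * log2(N)."""
    return 2 * bandwidth_hz * math.log2(levels)

print(hartley_capacity(10_000))      # 20000.0 bps (binary signaling)
print(hartley_capacity(10_000, 4))   # 40000.0 bps (4-level code)
```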

Hartley's Law above presumes the channel has no noise. What if the channel has noise?

Shannon-Hartley Theorem The Shannon-Hartley Theorem states that when noise is considered, the maximum channel capacity is given by:

C = B log2(1 + S/N)

where C is the channel capacity in bps as before, B is the bandwidth in Hz, and S/N is the ratio of signal power to noise power expressed as a plain ratio, i.e., NOT in dB.
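A short Python sketch that also performs the usual dB-to-ratio conversion (the helper name shannon_capacity is ours; S/N in dB equals 10 log10 of the power ratio):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), S/N a power ratio."""
    snr_ratio = 10 ** (snr_db / 10)      # convert dB to a plain ratio
    return bandwidth_hz * math.log2(1 + snr_ratio)

print(round(shannon_capacity(3100, 30)))   # 30898 bps
```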


Example Problem #1. How many binary digits (bits) are required to represent the 26-character alphabet?
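A worked sketch: we need the smallest n with 2^n ≥ 26. Since 2^4 = 16 falls short and 2^5 = 32 ≥ 26, five bits are required.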







Example Problem #2 Consider an 8-level signal shown below.
a. What is the baud rate?
b. How many bits are represented by each symbol?
c. What is the resulting bit rate?
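A worked sketch, assuming the figure shows a 1-ms symbol period as in the earlier 4-level example (the actual period must be read from the figure):

a. baud rate = 1 / symbol period = 1 / 1 ms = 1000 symbols/sec
b. bits per symbol = log2(8) = 3 bits/symbol
c. bit rate = baud rate × bits per symbol = 1000 × 3 = 3000 bps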






Example Problem #3 A block of 256 sequential 12-bit data words is transmitted serially in 0.016 sec. Calculate:

(a) the time duration of 1 word
(b) the time duration of one bit
(c) the speed of transmission in bps.
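A worked sketch:

(a) time per word = 0.016 s / 256 words = 62.5 µs
(b) time per bit = 62.5 µs / 12 bits ≈ 5.21 µs
(c) bit rate = (256 × 12 bits) / 0.016 s = 192,000 bps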







Example Problem #4 Calculate the channel capacity for a voice-grade telephone line with a bandwidth of 3100 Hz
if binary signaling is used.
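A worked sketch using Hartley's Law with binary signaling:

C = 2B = 2(3100) = 6200 bps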






Example Problem #5 Calculate the maximum channel capacity for a voice-grade telephone line with a bandwidth
of 3100 Hz and a S/N ratio of 30 dB.
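A worked sketch: first convert the S/N from dB to a power ratio, S/N = 10^(30/10) = 1000. Then:

C = B log2(1 + S/N) = 3100 × log2(1001) ≈ 3100 × 9.97 ≈ 30,900 bps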

S-ar putea să vă placă și