Claude Shannon's 1948 paper "A Mathematical Theory of Communication" gave birth to the twin disciplines of information theory and coding theory. Shannon's colleague Richard Hamming had been laboring on error correction for early computers. Coding theory attempts to realize the promise of information theory's bounds by models constructed mainly through algebraic means.
Copyright: Attribution Non-Commercial (BY-NC)
Electronics Project Report
By Abhishek Shukla, 06MS06, Group A

This is the era of information technology in which we are living. We share a great deal of information with one another by many means, which are called information channels.

Claude Shannon's 1948 paper "A Mathematical Theory of Communication" gave birth to the twin disciplines of information theory and coding theory. The basic goal is efficient and reliable communication in an uncooperative (and possibly hostile) environment. To be efficient, the transfer of information must not require a prohibitive amount of time and effort. To be reliable, the received data stream must resemble the transmitted stream to within narrow tolerances. These two desires will always be at odds, and our fundamental problem is to reconcile them as best we can. At an early stage the mathematical study of such questions broke into two broad areas. Information theory is the study of achievable bounds for communication and is largely probabilistic and analytic in nature. Coding theory then attempts to realize the promise of these bounds by models constructed mainly through algebraic means. Shannon was primarily interested in information theory. Shannon's colleague Richard Hamming had been laboring on error correction for early computers even before Shannon's 1948 paper, and he made some of the first breakthroughs of coding theory.

Information passes from a source to a sink via a conduit or channel. In our view of communication we are allowed to choose exactly the way information is structured at the source and the way it is handled at the sink, but the behaviour of the channel is not in general under our control. The unreliable channel may take many forms. We may communicate through space, such as talking across a noisy room, or through time, such as writing a book to be read many years later.
The uncertainties of the channel, whatever it is, allow the possibility that the information will be damaged or distorted in passage. My conversation may be drowned out, or my manuscript may weather. Shannon, Hamming, and many of the other originators of mathematical communication theory worked for Bell Telephone Laboratories. They were specifically interested in dealing with errors that occur as messages pass across long telephone lines and are corrupted by such things as lightning and crosstalk. The transmission and reception capabilities of many modems are increased by error-handling capability embedded in their hardware.
Richard Hamming, a colleague of Shannon's at Bell Laboratories, found a need for error correction in his work on computers. Parity checking was already being used to detect errors in the calculations of the relay-based computers of the day, and Hamming realized that a more sophisticated pattern of parity checking allowed the correction of single errors along with the detection of double errors. The codes that Hamming devised, the single-error-correcting binary Hamming codes and their single-error-correcting, double-error-detecting extended versions, marked the beginning of coding theory. These codes remain important to this day, for theoretical and practical reasons as well as historical ones.

If we use messages of length 2 and codewords of length 5, then of the 2^5 = 32 possible sequences of length 5, only 2^2 = 4 of them are codewords. This means that if an error is introduced into a codeword and one bit is changed, the resulting sequence is likely not a codeword at all. In the 1940s Bell used a slightly more sophisticated m-of-n code known as the two-out-of-five code. This code ensured that every block of five bits (known as a 5-block) had exactly two 1s. The computer could tell there was an error if the input did not have exactly two 1s in each block. Two-out-of-five was still only able to detect single-bit errors: if one bit flipped to a 1 and another to a 0 in the same block, the two-out-of-five rule remained satisfied and the error would go undiscovered.
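The two-out-of-five rule is simple enough to state as code. The sketch below is a minimal Python illustration (the function name is my own, not from the report); it checks a 5-block for validity and demonstrates both the detected single-bit error and the undetected compensating double error described above.

```python
# Sketch: the two-out-of-five code -- a 5-bit block is valid only if it
# contains exactly two 1s.

def is_valid_block(block):
    """Return True if a 5-bit block obeys the two-out-of-five rule."""
    return len(block) == 5 and block.count(1) == 2

# A single flipped bit changes the number of 1s, so it is always detected:
assert is_valid_block([0, 1, 1, 0, 0])
assert not is_valid_block([0, 1, 1, 0, 1])  # one 0 flipped to 1

# But a compensating pair of flips (one 0->1 and one 1->0) keeps exactly
# two 1s, so the error goes undiscovered:
assert is_valid_block([1, 0, 1, 0, 0])      # two bits flipped, still "valid"
```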
MY CIRCUIT

But my circuit is a bit different from the above. I have transformed symbols of 2 bits to 5 bits, but in a different manner, which is as below:

a → 1 0 0 1 1
b → 0 1 0 1 0
c → 0 0 1 0 1
d →
e →
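As a rough check of this mapping's error-handling power, the sketch below (Python; it assumes the three legible rows of the table, here labeled a, b, c, are codewords, since the remaining rows were not recoverable) computes their pairwise Hamming distances. A code of minimum distance d can detect up to d - 1 bit errors and correct up to (d - 1) // 2.

```python
# Sketch: pairwise Hamming distances of the 5-bit words transcribed above.
# Assumption: the three legible rows (labeled a, b, c) are codewords of the
# 2-bit -> 5-bit mapping described in the report.
from itertools import combinations

codewords = {
    "a": (1, 0, 0, 1, 1),
    "b": (0, 1, 0, 1, 0),
    "c": (0, 0, 1, 0, 1),
}

def hamming_distance(u, v):
    """Count the positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(u, v))

dmin = min(hamming_distance(u, v)
           for u, v in combinations(codewords.values(), 2))
print(dmin)  # prints 3: detects up to 2 bit errors, corrects 1
```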