Error-Correcting Codes

noiseproof codes, codes that detect and correct errors; that is, codes in which redundancy in the code combinations makes it possible to detect and correct errors that result in incorrect, or forbidden, combinations. Such codes are employed for the transmission and processing of information in computer technology, telegraphy, remote control, and communication technology, where a signal may become distorted owing to various types of interference.

The code words of an error-correcting code contain information digits and check digits (symbols). In coding information for transmission, additional symbols, the check digits, are formed from the information digits according to the rules specified by the error-correcting code. In decoding, the check digits are formed anew from the information digits of the received code words in accordance with the same rules and are compared with the received check digits; if they do not match, an error has occurred during transmission. There are codes that merely detect that the message has been distorted, and there are codes that correct errors, that is, that make it possible to reestablish the original information.
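
This general scheme can be illustrated with a minimal sketch in Python (not part of the original article), assuming the simplest possible rule: a single check digit equal to the sum of the information digits modulo 2. Such a code detects a single error but, unlike the Hamming code discussed below, cannot locate it.

# Minimal sketch (hypothetical example): one check digit appended to the
# information digits. A mismatch on reception detects a single error
# but cannot say which digit is wrong.

def encode(info_bits):
    check = sum(info_bits) % 2          # check digit formed from the information digits
    return info_bits + [check]

def has_error(received):
    *info, check = received
    return (sum(info) % 2) != check     # recompute the check digit and compare

word = encode([1, 0, 1, 0])             # -> [1, 0, 1, 0, 0]
corrupted = word.copy()
corrupted[2] ^= 1                       # distort one digit in transmission
print(has_error(word))                  # False: check digits match
print(has_error(corrupted))             # True: mismatch detects the error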

By way of example we will discuss the Hamming code. Suppose it is necessary to transmit the word 1010. When coded it will be represented by 1011010, where the first, second, and fourth symbols are the check digits (1, 0, 1 from left to right) and the others are the information digits. If an error occurred during the transmission, say a 0 was received instead of a 1 in the third digit, then during decoding the comparison of the recomputed and received check digits gives the following values: the first (lowest) is a 1, the second is a 1, and the fourth is a 0. This mismatch not only signals the presence of an error but also indicates the location of the distorted digit: read from the highest check digit down, the result is 011, that is, 3 in binary notation, the number of the distorted digit.
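
The following Python sketch reproduces this example. It assumes the standard (7,4) Hamming construction with check digits in positions 1, 2, and 4, each computed as the modulo-2 sum of the digits it covers; the names hamming_encode and hamming_syndrome are illustrative.

# Sketch of the Hamming (7,4) example from the text: check digits occupy
# positions 1, 2, and 4; the syndrome of recomputed-vs-received checks
# gives the position of a single distorted digit in binary.

def hamming_encode(d):                      # d = four information digits
    w = [0, 0, d[0], 0, d[1], d[2], d[3]]   # positions 1..7 (index 0 = position 1)
    w[0] = w[2] ^ w[4] ^ w[6]               # check digit at position 1
    w[1] = w[2] ^ w[5] ^ w[6]               # check digit at position 2
    w[3] = w[4] ^ w[5] ^ w[6]               # check digit at position 4
    return w

def hamming_syndrome(w):                    # 0 = no error, else position of the error
    s1 = w[0] ^ w[2] ^ w[4] ^ w[6]
    s2 = w[1] ^ w[2] ^ w[5] ^ w[6]
    s4 = w[3] ^ w[4] ^ w[5] ^ w[6]
    return s4 * 4 + s2 * 2 + s1

word = hamming_encode([1, 0, 1, 0])         # -> 1011010, as in the text
received = word.copy()
received[2] ^= 1                            # distort the third digit
pos = hamming_syndrome(received)            # -> 3 (binary 011), locating the error
received[pos - 1] ^= 1                      # correct the distorted digit
print(word, pos, received == word)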

The correcting and detecting capability of a code depends on the code distance d between words, which is numerically equal to the minimum number of digit changes (errors) that can transform one word into another. For example, let there be the code combinations 0111100, 0100101, and 0010110. The first word differs from the second in three of its digits, the second from the third in four, and the first from the third in three. The minimum distance d between these words is therefore three. If three errors occurred in the first word, it could be transformed into the second or the third word, and during decoding such an error would not be detected; the maximum number of errors that can be detected in this case is two. If an error occurred in the second digit of the first word, the word obtained would differ from the second word in four of its digits, from the third in two, and from the first in one. In accordance with the maximum-likelihood method, it would be concluded in decoding that the word transmitted was most probably the first. Decoding is correct as long as the number of errors is small enough that the received word still differs from the transmitted word in fewer digits than it does from any other code word. To correct t errors in all combinations, it is necessary and sufficient that d ≥ 2t + 1.
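
This distance argument can be checked with a short Python sketch (an illustration with hypothetical helper names) that computes the pairwise distances between the three words, takes the minimum distance d, and applies the relations for detection and correction.

# Sketch verifying the distance argument in the text: pairwise distances
# between the three code words, the minimum distance d, and the resulting
# detection and correction capability from d >= 2t + 1.

words = ["0111100", "0100101", "0010110"]

def distance(a, b):
    return sum(x != y for x, y in zip(a, b))   # number of differing digits

pairs = [(a, b, distance(a, b)) for i, a in enumerate(words)
         for b in words[i + 1:]]
print(pairs)                 # distances 3, 3, 4
d = min(dist for _, _, dist in pairs)
print(d)                     # minimum distance d = 3
print(d - 1)                 # detects up to d - 1 = 2 errors
print((d - 1) // 2)          # corrects t errors where d >= 2t + 1, so t = 1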

Errors can occur in transmitted words either as a result of independent distortions of individual digits (in this case Hamming codes are employed) or as a result of distortion of a group of successive digits (for such cases codes have been developed that correct a single burst of errors; other codes correct more than one burst). To detect errors in calculations on a digital computer, arithmetic codes have been developed.

G. N. ONYKII