information theory
informa′tion the`ory
n.
Noun 1. information theory - (computer science) a statistical theory dealing with the limits and efficiency of information processing
information theory or communication theory, mathematical theory formulated principally by the American scientist Claude E. Shannon to explain aspects and problems of information and communication. While the theory is not specific in all respects, it proves the existence of optimum coding schemes without showing how to find them. For example, it succeeds remarkably in outlining the engineering requirements of communication systems and the limitations of such systems.

In information theory, the term information is used in a special sense; it is a measure of the freedom of choice with which a message is selected from the set of all possible messages. Information is thus distinct from meaning, since it is entirely possible for a string of nonsense words and a meaningful sentence to be equivalent with respect to information content.

Measurement of Information Content

Numerically, information is measured in bits (short for binary digit; see binary system). Interestingly, the mathematical expression for information content closely resembles the expression for entropy (see entropy).

Analysis of the Transfer of Messages through Channels

A message proceeds along a channel from the source to the receiver; information theory defines for any given channel a limiting capacity or rate at which it can carry information, expressed in bits per second. In general, it is necessary to process, or encode, information from a source before transmitting it through a given channel. For example, a human voice must be encoded before it can be transmitted by telephone. An important theorem of information theory states that if a source with a given entropy feeds information to a channel with a given capacity, and if the source entropy is less than the channel capacity, a code exists for which the frequency of errors may be reduced as low as desired. If the channel capacity is less than the source entropy, no such code exists. The theory further shows that noise (see noise) …

Bibliography

See C. E. Shannon and W. Weaver, The Mathematical Theory of Communication (1949); M. Mansuripur, Introduction to Information Theory (1987); J. Gleick, The Information: A History, a Theory, a Flood (2011).

Information Theory

the mathematical discipline that studies the processes of storage, transformation, and transmission of information. Information theory is an essential part of cybernetics. At the basis of information theory lies a definite method for measuring the quantity of information contained in given data (“messages”). Information theory proceeds from the idea that the messages designated for retention in a storage device or for transmission over a communication channel are not known in advance with complete certainty. Only the set from which these messages may be selected is known in advance and, at best, how frequently certain of these messages are selected (that is, the probability of the messages). In information theory it is shown that the “uncertainty” encountered in such circumstances admits of a quantitative expression and that precisely this expression (and not the specific nature of the messages themselves) determines the possibility of their storage and transmission.
As such a “measure of uncertainty” in information theory one uses the number of binary digits (bits) necessary to record an arbitrary message from a given source. More precisely, one looks at all possible methods for representing the messages by sequences of the symbols 0 and 1 (binary codes) that satisfy two conditions: (a) different sequences correspond to different messages and (b) upon the transcription of a certain sequence of messages into coded form this sequence must be unambiguously recoverable. Then as a measure of the uncertainty one takes the average length of the coded sequence that corresponds to the most economical method of encoding; one binary digit serves as the unit of measurement.

For example, let certain messages x1, x2, and x3 appear with probabilities of ½, ⅜, and ⅛, respectively. A code that is too short, such as x1 = 0, x2 = 1, x3 = 01, is unsuitable since it violates condition (b): the sequence 01 can denote either x1x2 or x3. The code x1 = 0, x2 = 10, x3 = 11 satisfies conditions (a) and (b). To it corresponds an average length of a coded sequence equal to

(½ × 1) + (⅜ × 2) + (⅛ × 2) = 1.5

It is not hard to see that no other code can give a smaller value; that is, the code indicated is the most economical. In accordance with our choice of a measure for uncertainty, the uncertainty of the given information source should be taken equal to 1.5 binary units.

Here it is appropriate to note that “message,” “communication channel,” and other terms are understood very broadly in information theory. Thus, from the viewpoint of information theory, an information source is described by enumerating the set x1, x2, … of possible messages (which can be the words of some language, results of measurements, or television pictures) and their respective probabilities p1, p2, ….

There is no simple formula expressing the exact minimum H′ of the average number of bits necessary for encoding the messages x1, x2, …, xn through the probabilities p1, p2, …, pn of these messages. However, the specified minimum is not less than the value

H = p1 log2 (1/p1) + p2 log2 (1/p2) + … + pn log2 (1/pn)

(where log2 a denotes the logarithm of the quantity a to base 2) and may not exceed it by more than one unit. The quantity H (the entropy of the set of messages) possesses simple formal properties, and for all conclusions of information theory that are of an asymptotic character, corresponding to the case H′ → ∞, the difference between H and H′ is absolutely immaterial. Accordingly, the entropy is taken as the measure of the uncertainty of the messages from a given source. In the example above, the entropy is equal to

H = ½ log2 2 + ⅜ log2 (8/3) + ⅛ log2 8 ≈ 1.41

From the viewpoint stated, the entropy of an infinite aggregate, as a rule, turns out to be infinite. Therefore, when applied to an infinite collection it is treated differently: a certain precision level is assigned, and the concept of ε-entropy is introduced as the entropy of the information recorded with a precision of ε, if the message is a continuous quantity or function (for example, of time).

Just as with the concept of entropy, the concept of the amount of information contained in a certain random object (random quantity, random vector, or random function) relative to another is introduced at first for objects with a finite number of possible values. Then the general case is studied with the help of a limiting process. In contrast to entropy, the amount of information, for example, in a certain continuously distributed random variable relative to another continuously distributed variable, very often turns out to be finite.
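The arithmetic in this example is easy to verify. The short Python sketch below is an illustrative addition, not part of the encyclopedia text: it recomputes the average length of the code x1 = 0, x2 = 10, x3 = 11 and the entropy H of a source with probabilities ½, ⅜, ⅛, reproducing the 1.5 binary units and the value H ≈ 1.41 quoted above.

```python
import math

# Probabilities of the messages x1, x2, x3 from the example above.
probs = [1/2, 3/8, 1/8]
# Lengths of the code words 0, 10, 11 assigned to x1, x2, x3.
code_lengths = [1, 2, 2]

# Average length of the coded sequence, in binary digits per message.
avg_length = sum(p * n for p, n in zip(probs, code_lengths))

# Entropy H = sum of p * log2(1/p), the lower bound on the average length.
entropy = sum(p * math.log2(1 / p) for p in probs)

print(avg_length)          # 1.5
print(round(entropy, 2))   # 1.41
# The bound H <= H' <= H + 1 holds here: 1.41 <= 1.5 <= 2.41
```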
The concept of a communication channel is of an extremely general nature in information theory. In essence, a communication channel is given by specifying a set of “admissible messages” at the “channel input,” a set of “output messages,” and a collection of conditional probabilities for receiving one or another message at the output for a given input message. These conditional probabilities describe the effect of “noise” distorting the transmitted information. “Connecting” any information source to the channel, one may calculate the amount of information contained in the messages at the output relative to that at the input. The upper limit of these amounts of information, taken over all admissible sources, is termed the capacity of the channel. The capacity of a channel is its fundamental information characteristic. Regardless of the effect (possibly strong) of noise in the channel, provided the entropy of the incoming information is less than the channel capacity, almost error-free transmission is possible with the correct coding.

Information theory searches for methods of transmitting information that are optimal with respect to speed and reliability, having established theoretical limits to the quality attainable. Clearly, information theory is of an essentially statistical character; therefore, a significant portion of its mathematical methods is derived from probability theory.

The foundations of information theory were laid in 1948–49 by the American scientist C. Shannon. The Soviet scientists A. N. Kolmogorov and A. Ia. Khinchin contributed to its theoretical branches, and V. A. Kotel'nikov, A. A. Kharkevich, and others to the branches concerning applications.

REFERENCES
Iaglom, A. M., and I. M. Iaglom. Veroiatnost’ i informatsiia, 2nd ed. Moscow, 1960.
Shannon, C. “Statisticheskaia teoriia peredachi elektricheskikh signalov.” In Teoriia peredachi elektricheskikh signalov pri nalichii pomekh: Sb. perevodov. Moscow, 1953.
Goldman, S. Teoriia informatsii. Moscow, 1957. (Translated from English.)
Teoriia informatsii i ee prilozheniia: Sb. perevodov. Moscow, 1959.
Khinchin, A. Ia. “Poniatie entropii v teorii veroiatnostei.” Uspekhi matematicheskikh nauk, 1953, vol. 8, issue 3.
Kolmogorov, A. N. Teoriia peredachi informatsii. Moscow, 1956. (Academy of Sciences of the USSR. Session on the scientific problems of the automation of production. Plenary session.)
Peterson, W. W. Kody, ispravliaiushchie oshibki. Moscow, 1964. (Translated from English.)
IU. V. PROKHOROV

information theory [‚in·fər′mā·shən ‚thē·ə·rē]

Information theory

A branch of communication theory devoted to problems in coding. A unique feature of information theory is its use of a numerical measure of the amount of information gained when the contents of a message are learned. Information theory relies heavily on the mathematical science of probability. For this reason the term information theory is often applied loosely to other probabilistic studies in communication theory, such as signal detection, random noise, and prediction. See Electrical communications

In designing a one-way communication system from the standpoint of information theory, three parts are considered beyond the control of the system designer: (1) the source, which generates messages at the transmitting end of the system, (2) the destination, which ultimately receives the messages, and (3) the channel, consisting of a transmission medium or device for conveying signals from the source to the destination.
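The channel capacity discussed above (the upper limit, over all admissible sources, of the information at the output relative to the input) can be illustrated with a small numerical sketch. The binary symmetric channel and the crossover probability 0.1 used below are assumptions chosen for the example, not something specified in the entries; the sketch approximates the capacity by searching over input distributions and checks the result against the known closed form, one minus the entropy of the noise.

```python
import math

def h(p):
    """Entropy contribution -p * log2(p), with the convention 0 * log2(0) = 0."""
    return 0.0 if p == 0 else -p * math.log2(p)

def mutual_information(p1, eps):
    """Information at the output about the input, in bits, for a binary symmetric
    channel with crossover probability eps, when the input is 1 with probability p1."""
    q1 = p1 * (1 - eps) + (1 - p1) * eps      # probability that the output is 1
    output_entropy = h(q1) + h(1 - q1)        # entropy of the output
    noise_entropy = h(eps) + h(1 - eps)       # uncertainty added by the noise
    return output_entropy - noise_entropy

eps = 0.1  # assumed crossover (noise) probability
# Capacity: the upper limit of the mutual information over all input sources,
# approximated here by a search over a grid of input distributions.
capacity = max(mutual_information(k / 1000, eps) for k in range(1001))

print(round(capacity, 3))                     # 0.531 bits per channel use
print(round(1 - (h(eps) + h(1 - eps)), 3))    # closed form gives the same value
```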
The source does not usually produce messages in a form acceptable as input by the channel. The transmitting end of the system contains another device, called an encoder, which prepares the source's messages for input to the channel. Similarly, the receiving end of the system will contain a decoder to convert the output of the channel into a form that is recognizable by the destination. The encoder and the decoder are the parts to be designed. In radio systems this design is essentially the choice of a modulator and a detector.

A source is called discrete if its messages are sequences of elements (letters) taken from an enumerable set of possibilities (alphabet). Thus sources producing integer data or written English are discrete. Sources which are not discrete are called continuous, for example, speech and music sources. The treatment of continuous cases is sometimes simplified by noting that a signal of finite bandwidth can be encoded into a discrete sequence of numbers.

The output of a channel need not agree with its input. For example, a channel might, for secrecy purposes, contain a cryptographic device to scramble the message. Still, if the output of the channel can be computed knowing just the input message, then the channel is called noiseless. If, however, random agents make the output unpredictable even when the input is known, then the channel is called noisy. See Communications scrambling, Cryptography

Many encoders first break the message into a sequence of elementary blocks; next they substitute for each block a representative code, or signal, suitable for input to the channel. Such encoders are called block encoders. For example, telegraph and teletype systems both use block encoders in which the blocks are individual letters. Entire words form the blocks of some commercial cablegram systems.

It is generally impossible for a decoder to reconstruct with certainty a message received via a noisy channel. Suitable encoding, however, may make the noise tolerable. Even when the channel is noiseless, a variety of encoding schemes exists and there is a problem of picking a good one. Of all encodings of English letters into dots and dashes, the Continental Morse encoding is nearly the fastest possible one. It achieves its speed by associating short codes with the most common letters.

A noiseless binary channel (capable of transmitting two kinds of pulse, 0 and 1, of the same duration) provides the following example. Suppose one had to encode English text for this channel. A simple encoding might just use 27 different five-digit codes to represent word space (denoted by #), A, B, . . . , Z; say # 00000, A 00001, B 00010, C 00011, . . . , Z 11010. The word #CAB would then be encoded into 00000000110000100010. A similar encoding is used in teletype transmission; however, it places a third kind of pulse at the beginning of each code to help the decoder stay in synchronism with the encoder.

information theory

The study of encoding and transmitting information. From Claude Shannon's 1948 paper, "A Mathematical Theory of Communication," which proposed the use of binary digits for coding information. Shannon said that all information has a "source rate" that can be measured in bits per second and requires a transmission channel with a capacity equal to or greater than the source rate.
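Returning to the five-digit block code in the example above, the following Python sketch is an illustrative addition, not taken from the encyclopedia text: it builds the 27-entry code table for the word space # and the letters A through Z, encodes the word #CAB, and decodes the result again, reproducing the string given above.

```python
import string

# 27-symbol alphabet: word space '#' followed by A..Z, numbered 0..26.
alphabet = "#" + string.ascii_uppercase
encode_table = {ch: format(i, "05b") for i, ch in enumerate(alphabet)}
decode_table = {code: ch for ch, code in encode_table.items()}

def encode(text):
    """Block encoder: concatenate the five-digit code of each symbol."""
    return "".join(encode_table[ch] for ch in text)

def decode(bits):
    """Block decoder: split into five-digit blocks and look each one up."""
    return "".join(decode_table[bits[i:i + 5]] for i in range(0, len(bits), 5))

message = "#CAB"
encoded = encode(message)
print(encoded)           # 00000000110000100010, as in the example above
print(decode(encoded))   # #CAB
```

With this numbering Z receives the code 11010, and every symbol costs exactly five binary digits regardless of how common it is, in contrast to the Morse principle of giving shorter codes to the more frequent letters.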
in·for·ma·tion the·o·ry (in'fŏr-mā'shŭn thē'ŏr-ē)

information theory
the study of the measurement and properties of codes and messages.