Probability Logic

a system of logic in which, in addition to truth and falsity, “intermediate” truth values (so-called probabilities of the truth of expressions) are assigned to statements (opinions, assertions, propositions) according to the degree of their plausibility, corroboration, and so forth.

Insofar as the concept of probability correlates naturally with events, and the occurrence or nonoccurrence of an event is a fact that admits, at least in principle, of empirical verification (in a broad sense this includes the so-called thought experiment, as well as a conclusion drawn from knowledge of the occurrence or nonoccurrence of other events), probability logic strives to make inductive logic more precise. The reciprocal transitions from the language of statements to the language of events and back are accomplished so naturally that they are regarded as almost trivial: to each event corresponds the statement that it has occurred, and to each statement the event that it turns out to be true. What is specific to probability logic (even in its fully formalized, mathematical-logical versions) is the fundamental irremovability of the incomplete certainty (“relative truth”) of premises and conclusions, which is inherent in all inductive knowledge.

The problems of probability logic were developed in antiquity (for example, by Aristotle) and in modern times by G. W. Leibniz, G. Boole, W. S. Jevons, and J. Venn.

As a logical system, probability logic is a variety of multivalued logic: true expressions (certain events) are assigned the truth value (probability) 1, and false expressions (impossible events) the value 0. Hypothetical expressions may take as their value any real number between 0 and 1. The probability of a hypothesis is a function both of its content (formulation) and of the knowledge we already possess (“experience”). The logical operations of conjunction (corresponding to the multiplication of events in probability theory) and disjunction (corresponding to the addition of events) are defined in terms of the truth values (probabilities) of hypotheses. The measure (value) of the negation of a hypothesis is the probability of the event consisting in its noncorroboration. The values of hypotheses form a so-called normed (normalized) Boolean algebra, a comparatively simple and well-developed apparatus that permits an easy axiomatization of probability theory and constitutes the simplest variant of probability logic.
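
The following sketch, which is not part of the original article, illustrates this formalization on a toy example: a uniform sample space of two coin tosses, with hypotheses represented as events (subsets) whose probabilities serve as truth values. The sample space, weights, and event names are illustrative assumptions.

```python
from math import isclose

# Illustrative sketch only: probabilities as "truth values" over the Boolean
# algebra of events on a small finite sample space (two coin tosses).
# The sample space, weights, and event names are assumptions, not from the article.

sample_space = {"HH", "HT", "TH", "TT"}
weight = {outcome: 0.25 for outcome in sample_space}   # uniform probability mass

def prob(event):
    """Truth value of a hypothesis = probability of the corresponding event."""
    return sum(weight[outcome] for outcome in event)

first_heads = {"HH", "HT"}            # hypothesis: "the first toss is heads"
second_heads = {"HH", "TH"}           # hypothesis: "the second toss is heads"

conjunction = first_heads & second_heads    # multiplication of events
disjunction = first_heads | second_heads    # addition of events
negation = sample_space - first_heads       # event of noncorroboration

print(prob(sample_space))    # certain event (true expression)      -> 1.0
print(prob(set()))           # impossible event (false expression)  -> 0.0
print(prob(negation))        # 1 - prob(first_heads)                -> 0.5
# Additivity in the normed Boolean algebra of values:
assert isclose(prob(disjunction),
               prob(first_heads) + prob(second_heads) - prob(conjunction))
```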

Another treatment of the concept of probability, connected with the so-called frequency conception (definition) of probability (H. Poincaré, M. von Smoluchowski, R. von Mises), gave rise to ideas in probability logic according to which the principal object of consideration is not the probabilities of specific events but random processes. In the simplest case these are realized as random binary sequences, that is, sequences of zeros and ones (corresponding to single instances of the nonoccurrence or occurrence of a given event under repeated trials).
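
As a hedged illustration of this frequency viewpoint (not from the original article), the sketch below generates a random binary sequence of trial outcomes and shows the relative frequency of ones stabilizing near an assumed underlying probability; the value of p, the seed, and the trial counts are arbitrary.

```python
import random

# Illustrative sketch: a random binary sequence (1 = occurrence, 0 = nonoccurrence
# of the event in a single trial) whose relative frequency of ones stabilizes
# near the underlying probability p as the number of trials grows.
# The probability p, the seed, and the trial counts are assumptions.

random.seed(0)
p = 0.3

def binary_sequence(n):
    """n independent trials encoded as a sequence of zeros and ones."""
    return [1 if random.random() < p else 0 for _ in range(n)]

for n in (100, 10_000, 1_000_000):
    sequence = binary_sequence(n)
    print(n, sum(sequence) / n)   # relative frequency approaches p
```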

There has also been intensive development of the problems of probability logic that arise when the two approaches described above are compared (R. Carnap, B. Russell, and others), as well as of problems connected with the relation between probability-theoretic concepts and the ideas of information theory and logical semantics. All these lines of research involve both the refinement of the strictly mathematical apparatus of probability logic and the epistemological interpretation of the systems developed. (It is precisely in the latter area that the principal difficulties of probability logic are concentrated.)
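
One concrete point of contact between the probabilistic and information-theoretic viewpoints, in the spirit of the semantic information measures studied by Carnap and Bar-Hillel, is that the information conveyed by a hypothesis grows as its probability decreases. The sketch below, not taken from the article, measures this as log2(1/p) bits under an arbitrary assumed probability assignment.

```python
import math

# Illustrative sketch: the less probable a hypothesis, the more information its
# confirmation carries, measured here as log2(1/p) bits (in the spirit of the
# semantic information measures studied by Carnap and Bar-Hillel).
# The hypotheses and their probabilities are arbitrary assumptions.

hypotheses = {
    "it will rain tomorrow": 0.5,
    "it will rain tomorrow and the wind will be northerly": 0.125,
    "it will rain tomorrow or it will not": 1.0,   # logical truth: no information
}

for hypothesis, p in hypotheses.items():
    info = math.log2(1 / p)       # information content in bits
    print(f"{hypothesis!r}: probability {p}, information {info} bits")
```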

IU. A. GASTEV