Definition
information entropy
n. In information theory, a mathematical measure of the degree of randomness in a set of data, with greater randomness implying higher entropy and greater predictability implying lower entropy. Also called Shannon entropy.
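As a supplementary note (not part of the original entry): the standard Shannon formula for the entropy of a discrete random variable X with outcomes x_1, ..., x_n and probabilities p(x_i) is

H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)

measured in bits when the logarithm is taken base 2. A uniform distribution (maximum randomness) maximizes H(X); a distribution concentrated on a single outcome (maximum predictability) gives H(X) = 0.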