Definitions:
Noun
information
selective information
entropy
Definition: (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information"
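In the quantitative sense used in communication theory, this measure corresponds to Shannon entropy; a minimal sketch in standard notation (the symbols X and p(x_i) are conventional, not part of the entry above):

H(X) = -\sum_{i} p(x_i) \log_2 p(x_i)

With the logarithm taken base 2 the result is expressed in bits; a fair coin toss, for example, carries H = 1 bit of information.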