
Information entropy (uncommon)

Definition, synonyms and related words

Definitions
Noun
1

A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually expressed in units such as bits); the amount of information (measured in, say, bits) contained on average per character in a stream of characters.
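The "bits per character" reading of the definition can be illustrated numerically. Assuming the standard Shannon formula H = -Σ pᵢ log₂ pᵢ over the empirical character frequencies of a stream, a minimal Python sketch (the function name and example strings are illustrative, not part of the entry):

```python
from collections import Counter
from math import log2

def shannon_entropy(stream: str) -> float:
    """Average information content, in bits per character,
    of the empirical character distribution of `stream`."""
    counts = Counter(stream)
    total = len(stream)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Two equally likely symbols carry 1 bit per character:
print(shannon_entropy("abababab"))  # → 1.0
# A stream of one repeated symbol carries no information:
print(shannon_entropy("aaaa"))      # → 0.0
```

A uniform distribution over n symbols gives the maximum possible value, log₂ n bits per character; any skew in the frequencies lowers the entropy.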
