mutual information

From Wiktionary, the free dictionary

English


Noun

mutual information (usually uncountable, plural mutual informations)

  1. (information theory) A measure of the entropic (informational) correlation between two random variables.
    Mutual information between two random variables X and Y is what is left over when their mutual conditional entropies H(X|Y) and H(Y|X) are subtracted from their joint entropy H(X,Y). It can be given by the formula I(X;Y) = H(X,Y) − H(X|Y) − H(Y|X).
    • 2018, Clarence Green, James Lambert, “Position vectors, homologous chromosomes and gamma rays: Promoting disciplinary literacy through Secondary Phrase Lists”, in English for Specific Purposes, →DOI, page 6:
      From these lists, all combinations of the four major parts of speech were extracted and each phrase was checked for the frequency, dispersion and range criteria respectively. Those that passed the criteria thresholds then had their mutual information scores computed using the tool Collocate (Barlow, 2004) and those failing to meet the threshold were removed.
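The identity in the definition can be checked numerically. A minimal Python sketch, using an arbitrary made-up joint distribution over two binary variables (the distribution is purely illustrative, not from the source):

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only to illustrate the formula; any valid distribution works.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y)
p_x = {x: sum(p for (a, _), p in p_xy.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in p_xy.items() if b == y) for y in (0, 1)}

# Joint entropy H(X,Y)
H_xy = -sum(p * math.log2(p) for p in p_xy.values())

# Conditional entropies H(X|Y) and H(Y|X)
H_x_given_y = -sum(p * math.log2(p / p_y[y]) for (x, y), p in p_xy.items())
H_y_given_x = -sum(p * math.log2(p / p_x[x]) for (x, y), p in p_xy.items())

# Mutual information via the formula in the definition:
# I(X;Y) = H(X,Y) - H(X|Y) - H(Y|X)
I = H_xy - H_x_given_y - H_y_given_x

# Cross-check against the direct expression
# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
I_direct = sum(
    p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items()
)
```

Both computations agree (up to floating-point error), and the result is positive here because the two variables are correlated.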
