Talk:Multivariate mutual information
This redirect does not require a rating on Wikipedia's content assessment scale.
The contents of the Multivariate mutual information page were merged into Interaction information on 16 April 2021 and it now redirects there. For the contribution history and old versions of the merged article, please see its history.
More ideas
I finally decided to create this article since so many of the other articles I have edited alluded to this concept. I hope other editors will see fit to add more ideas to this article rather than hastily delete it. Deepmath (talk) 20:44, 29 July 2009 (UTC)
- It looks good to me. --Apoc2400 (talk) 23:12, 27 August 2009 (UTC)
I would exploit the relationship between Multivariate MI and pointwise multivariate MI. The latter is used extensively in NLP and increases the real-world relevance of this topic -- motivate by example. SEE: reference — Preceding unsigned comment added by 76.21.11.140 (talk) 04:00, 5 June 2013 (UTC)
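As a sketch of the pointwise quantity the comment refers to (assuming the co-information sign convention), expectations are replaced by log-ratios at a single outcome:

\mathrm{pmi}(x;y) = \log\frac{p(x,y)}{p(x)\,p(y)}, \qquad \mathrm{pmi}(x;y;z) = \log\frac{p(x,y)\,p(x,z)\,p(y,z)}{p(x)\,p(y)\,p(z)\,p(x,y,z)},

so the multivariate MI is recovered as the expectation of the pointwise value under p(x,y,z).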
I find this page very misleading. Multivariate mutual information is sometimes used to refer to interaction information, the interpretation described here, and sometimes to refer to total correlation. Both should be mentioned, along with references to usage. I'm new to editing, but will try to collect some examples (e.g. Kraskov et al., "Estimating Mutual Information" assumes the latter). Multivariate mutual information is a general enough term that contributions from Williams and Beer, Griffith and Koch, and more, should be included here. --Bobrodsky (talk) 23:29, 21 July 2014 (UTC)
I agree with the above. This page is very confusing because "Multivariate mutual information" is not a term used in this sense at all in the literature outside of this Wikipedia article. Terms that have been used include the original "interaction information" (McGill 1954), multiple mutual information (opposite sign) (Han 1980), co-information (opposite sign) (Bell 2003), or separately as redundancy or synergy (Gawne and Richmond 1993; Schneidman et al. 2003). Even the references currently listed (e.g. Timme) don't use the term "Multivariate mutual information" - this appears to be a Wikipedia invention and is confusing for many reasons (e.g. the ambiguity above). I think this article should be removed and merged into "interaction information" - that is the earliest and, in my opinion, the clearest term for the quantity. 130.209.89.69 (talk) 11:18, 13 January 2016 (UTC)
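A minimal sketch of the two quantities being conflated (standard definitions; the sign of the first varies by author):

I(X;Y;Z) = I(X;Y) - I(X;Y\mid Z) \qquad \text{(interaction information / co-information)}

C(X_1,\dots,X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1,\dots,X_n) \qquad \text{(total correlation)}

The first can be negative, while total correlation is always nonnegative, so the two are not interchangeable.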
I actually like this page a lot. It generalizes mutual information with a sign convention that is consistent with the Venn diagram picture, which is elaborated in Information theory and measure theory, where this page and multivariate mutual information are also mentioned. Kjslag (talk) 00:55, 26 January 2018 (UTC)
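For reference, the Venn-diagram sign convention alluded to above is an inclusion-exclusion expansion over entropies; under the co-information convention it reads

I(X;Y;Z) = H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z).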
Confusion
The second sentence, "These attempts have met with a great deal of confusion and a realization that interactions among many random variables are poorly understood", REALLY needs a citation.
Srinivasa, Sunil (2005). "A Review on Multivariate Mutual Information" (PDF), p. 6,
suggests that actually MMI is fairly straightforward and usable. Unfortunately I don't feel like I have my head around the subject well enough to make a reasonable assessment of which is more true, and I can't find many other reviews. --naught101 (talk) 02:37, 7 June 2018 (UTC)
Thanks
Thanks for this article. I have struggled for years with these ideas and your article helps. Here is what I had written years ago on the topic. https://www.eecs.yorku.ca/~jeff/courses/6111/syllabus/6111-22-entropy-short.pdf https://www.eecs.yorku.ca/~jeff/courses/6111/syllabus/6111-Entropy.pptx All the best Jeff JeffAEdmonds (talk) 14:48, 18 May 2020 (UTC)
It's not a measure
I deleted references to measure theory for the same reasons given at Talk:Information_theory_and_measure_theory#Reasons_for_deletion, namely:
- There is no sigma-algebra.
- There is no proof that MMI is countably additive.
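A standard example of why the analogy is only formal (under the co-information convention): let X and Y be independent fair bits and Z = X \oplus Y. Then

I(X;Y) = 0, \qquad I(X;Y\mid Z) = 1\ \text{bit}, \qquad I(X;Y;Z) = I(X;Y) - I(X;Y\mid Z) = -1\ \text{bit},

so the set-function suggested by the Venn diagram takes negative values; at best it behaves like a signed measure, and neither a sigma-algebra nor countable additivity comes for free.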