Talk:Principle of maximum entropy

2003 discussion

I find this discussion very doctrinaire and probably incomprehensible to most mathematicians for lack of context; maybe I'll do some more substantive editing later. Michael Hardy 17:46 Mar 30, 2003

Doctrinaire - a person inflexibly attached to a practice or theory without regard to its practicality. Online dictionary definition. Hey, I just tried to describe what it is - whether or not it's valid is an issue that requires its own subsection. Since most of what I've read on the subject of the validity of PME was written by its proponents, I have the information to give only one side of the story (from as N a POV as I can manage). Cyan 07:37 Apr 1, 2003 (UTC)

I don't claim to be a mathematician, and yet with a few terms of calculus and discrete math under my belt I find this presentation to be very accessible. I don't see how it can be made any more accessible without sacrificing content. I learned a few new things from this page (i.e. proving that the ME solution is also the ML solution) that I haven't come across when browsing papers on maxent.

I have some knowledge on how the algorithms that approximate maximum entropy solution work (the GIS and the IIS), if there's demand for it, perhaps I should post some info? yaroslavvb Jun 3, 2003 (PST)

Absolutely. But the PME page is already rather long. I suggest you create a new page (or pages) for these algorithms, and provide links to and from the PME page. Cyan 04:35 5 Jun 2003 (UTC)
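Since GIS and IIS come up above: below is a minimal sketch of Generalized Iterative Scaling, assuming (as GIS requires) that the feature values sum to the same constant C for every outcome. The function names and the fixed iteration count are my own choices for illustration, not anything from the article or from a particular library.

```python
import math

def gis(outcomes, features, targets, iters=200):
    """Fit multipliers lam so that the model p(x) ~ exp(sum_k lam_k f_k(x))
    matches the target expectations E[f_k] = targets[k]."""
    # GIS assumes every outcome has the same total feature mass C.
    C = sum(f(outcomes[0]) for f in features)
    lam = [0.0] * len(features)
    for _ in range(iters):
        # Current model distribution.
        w = [math.exp(sum(l * f(x) for l, f in zip(lam, features)))
             for x in outcomes]
        Z = sum(w)
        p = [wi / Z for wi in w]
        # Model expectation of each feature.
        ex = [sum(pi * f(x) for pi, x in zip(p, outcomes)) for f in features]
        # Multiplicative GIS update: lam_k += (1/C) log(target_k / model_k).
        lam = [l + math.log(t / e) / C for l, t, e in zip(lam, targets, ex)]
    return lam
```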

<Mild chagrin> See also the second rule 25 on Wikipedia Anti-Rules. Cyan 21:53 Apr 3, 2003 (UTC)

What I meant by "doctrinaire" is that it imitates closely the language of Edwin Jaynes and may be incomprehensible to those unfamiliar with Jaynes' writings. One of these days I'll edit this article, but for the Time Being I have other obligations. Michael Hardy 01:36 Apr 4, 2003 (UTC)

equation minus sign

I think the minus sign on the equations to find lambda values is wrong. I'll remove it. --163.117.155.37 18:01, 12 January 2007 (UTC)
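For later readers wondering about the sign: both conventions occur in the literature and differ only in the sign given to each multiplier. A sketch of the usual setup (notation mine):

```latex
p(x_i) = \frac{1}{Z(\lambda)} \exp\!\Big(\sum_k \lambda_k f_k(x_i)\Big),
\qquad
Z(\lambda) = \sum_i \exp\!\Big(\sum_k \lambda_k f_k(x_i)\Big),
\qquad
\frac{\partial \log Z}{\partial \lambda_k} = F_k ,
```

where the constraints are $\sum_i p(x_i) f_k(x_i) = F_k$. Writing the exponent as $-\sum_k \lambda_k f_k(x_i)$ simply replaces every $\lambda_k$ by $-\lambda_k$.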

Epistemic probability?

I've never seen that term used. It seems out of place in a mathematical context, and more appropriate to philosophy. I recommend changing it to the more standard term "Bayesian".

Who are you, that the fact that YOU have never seen it should be considered significant? I think it conveys the idea better than "Bayesian". Michael Hardy 21:12, 11 October 2005 (UTC)
I think it's a good term here, underlining that we're talking about probabilities being used to handle a lack of complete knowledge. Bayesian writers are often keen to stress that Bayesian inference and Bayesian methods are part of epistemology -- i.e. how to handle knowledge and incomplete knowledge; rather than ontology -- statements about the actual nature of the world. They are also clear on the value of always keeping the two clearly distinguished. Jheald 15:36, 20 October 2005 (UTC)

new edit (now undone by me), removing the word 'epistemic', should have consulted or given better reasons for the change

We have just seen, and I have just undone, an edit that removed the word epistemic on the grounds that this article has nothing to do with sociology or philosophy, and that the word "epistemology" has no place here. I think the editor who made this edit should have consulted this discussion page before making it. Philosophy has a place anywhere in rational discussion, and to say that it doesn't have a place here seems odd at best. No one imagines that the use of the word epistemic introduces sociology to the conversation. A discussant, who did not sign his name, writing in this section of the discussion, prefers to speak of Bayesian probability, but that is not necessarily better than epistemic probability, even if it seems more "standard". The use of the term Bayesian has disadvantages, and Jaynes says that he does not like to be called a Bayesian, because his thinking includes ideas not held by Bayes. Jheald's comment of 20 October 2005 seems to address this point in a reasonable way. The notion of probability is not simply mathematical but has a wider range of signification. It comes into scientific inference, as for example in Jaynes's Probability Theory: The Logic of Science, very relevant here. Perhaps the editor who made this edit removing the word epistemic would tell us what he thinks of Jaynes's arguments that are presented in that book. Chjoaygame (talk) 18:50, 15 April 2010 (UTC)

Personally, I do not mind the word "epistemic" at all. I think it conveys some meaning in this context. However, the following sentence from the current version of the article strikes me as funny:

In ordinary language, the principle of maximum entropy can be said to express a claim of epistemic modesty, or of maximum ignorance.

You call that ordinary language? Dratman (talk) 06:45, 10 December 2015 (UTC)

The term "epistemic probability" is only substantively discussed on the Probability interpretations page and does not occur on any other statistics page, except as a redirected link to Bayesian probability. As such the term used here should be Bayesian. If you want to use epistemic, a link (and justification) should be made to Epistemology or some primary source (perhaps Probability interpretations). But it is confusing and incongruous to replace every occurrence of "probability" with "epistemic probability" as seems to have been done in this article. The Probability page does not contain this term, and neither does Epistemology. The Bayesian probability page contains no mention, but does have a link at the end to two articles on "Bayesian Epistemology". Neither of these articles contain the phrase "epistemic probability". Both articles are written by philosophers, not mathematicians or physicists. This phrase is not in use by any mathematician or physicist I have ever heard (I am a Ph.D. theoretical physicist -- and we use a lot of Bayesian statistics in my field). When I first read this page I jumped and said "what the hell is that?!?!" The word "Epistemology" is never mentioned in scientific circles. As a philosophical discipline, one could provide a link to it...but the concept desired here is most commonly called Bayesian probability for which it seems some philosophers have chosen to create a synonym Epistemic probability. This synonym does not appear to be in sufficiently widespread use to justify its repeated use in this page, and Wikipedia itself agrees by redirecting the link to Bayesian probability.

Thus, this term should be removed. The only relevant modifier to "probability" in this context should be "Bayesian" where appropriate. But even in that case, every occurrence of "probability" should not be rewritten as "Bayesian probability" as it is unnecessary and confusing... So, please revert my edit?  ;) —Preceding unsigned comment added by 78.42.131.83 (talk) 07:26, 16 April 2010 (UTC)

Thank you for your civilized response to my undo. You put a good case, but I would like to ask your indulgence in letting me (and perhaps others) sleep on it and think carefully about it for a day or so. It is perhaps not a good thing that the word 'epistemology' is never mentioned in scientific circles; perhaps they are very tight circles that exclude the hoi polloi, members of the plebs like me; I have often thought of myself as a scientist, but it seems that I don't move in the right circles. The distinction between epistemic and ontic concepts is very relevant to the scientific meaning of probability, and failure to use it clearly has been a cause of confusion about the physical meaning of probability and a basis for attacks on the Bayes-Laplace-Jeffreys approach. You seem keen to draw careful or perhaps strict boundaries between disciplines? There are good arguments against eponymous nomenclature, such as 'Bayesian'. Bayes himself did not use the notion of probability as such. He worked with expectations. E.T. Jaynes (Probability Theory: The Logic of Science, 2003) on page 22 writes "Common language - or at least, the English language - has an almost universal tendency to disguise epistemological statements by putting them into grammatical form which suggests to the unwary an ontological statement." Would Jaynes be admitted to scientific circles? The first sentence of the first chapter of the third edition of Jeffreys' Theory of Probability reads "The fundamental problem of scientific progress, and a fundamental one of everyday life, is that of learning from experience." He goes on further down the page to write "The theory of learning in general is the branch of logic known as epistemology." Would Jeffreys be admitted to scientific circles? Nevertheless you put a good case. Again, please would you be willing to indulge me with a little time to think it over? Perhaps others may also have opinions? Please note that you can sign your comments with four tildes in a row. Chjoaygame (talk) 14:35, 16 April 2010 (UTC)

Epistemology and Ontology are never discussed in scientific disciplines, in my experience. The terms are only in use by philosophers, psychologists, sociologists, and other people that ruminate about data, but generally don't actually have any data. The ideas surrounding those terms tend to be lots of words, no proofs, and no practical calculational tools that would be used by scientists. Bayes' theorem is a precise mathematical statement that can be (and is) used to analyse data. Epistemology and Ontology are not (even if grammatically and logically the words may apply to scientific concepts). BTW your arguments apply to every single statistics page on wikipedia. If one wanted, one could insert 17 philosophical adjectives before every mathematical construct like probability. But such usage is confusing, and mixes disciplines. To put it another way, my dictionary defines "epistemic" as "of or relating to the philosophical theory of knowledge" (e.g. philosophy), and the word "probability" already encodes the "knowledge" piece. I looked through wikipedia, and I find no other articles using "epistemic probability" in the way this page does -- though there are many pages touching the topic of Bayesian statistics. Indeed, Jaynes' book also does not even contain this phrase.

I would suggest that discussion about Epistemology and the philosophical nature of probability should occur on a philosophy-oriented page (such as Probability interpretations). But the current page is very scientific in nature, so should use the nomenclature common to mathematicians and physicists. I'll let you decide if a link to Probability interpretations is relevant. That is not my expertise. 129.206.195.62 (talk) 16:06, 16 April 2010 (UTC)

Thank you. I am just now asking for a day or two to sleep on it and examine the page more critically, and to see if perhaps anyone else has an opinion. Perhaps it is within your field that you might tell us here and now what quantum mechanics people think of or practise in Bayesian methods? Chjoaygame (talk) 21:37, 16 April 2010 (UTC)
Looking over the page, with the question in mind about the use of the word epistemic, I am sad to say that, as usual, it seems that the Wikipedia article is not really well conceived and constructed. The use of the word epistemic partly reflects, I think, a feeling of the editor who used it, that, for clarity about his points, some attention needs to be drawn to the distinction between frequentist and Bayes-Laplace-Jeffreys conceptions of the word probability. On looking at the article on Bayesian probability, one is sad to note that it misquotes Jaynes when it writes 'Bayesian probability interprets the concept of probability as "a measure of a state of knowledge",[citing E.T. Jaynes. Probability Theory: The Logic of Science Cambridge University Press, (2003). ISBN 0-521-59271-2]'. Using the [Find] facility of Adobe Acrobat, I searched chapters 1-4 of that book, in which I suppose the quoted phrase is most likely to be found, and I did not find it there. Jaynes there does often write of "degree of plausibility". I am guessing that the Wikipedia editor of the Bayesian probability page felt he knew better than Jaynes what Jaynes really meant, and felt that he could improve on Jaynes' wording, and was so confident about this that he could just use quote marks to make his point. Amongst those who have superior insight and judgement, what is the slight difference between plausibility and knowledge, that someone would fuss about it? Silly old Jaynes, speaking of plausibility when we all know that he meant knowledge. Well, Jaynes uses the word knowledge many many times in those chapters. Silly old Jaynes, he couldn't know that scientists work with data and don't get into a tangle about philosophy. Dear unsigned edit-proposer who wants to expunge the deviationist unscientific philosophical word epistemic from the article, because it made you feel "what the hell is that?", I think you would better spend your efforts improving the article on Bayesian probability or on clarifying within this Principle of Maximum Entropy page the relevance of the distinction between frequentist and the Bayes-Laplace-Jeffreys approaches to conceiving the word probability. As I read it, the key to the Bayes-Laplace-Jeffreys approach is that it defines a probability as a number that expresses a logical relation between two propositions, the datum proposition and the object proposition: it conceives of the degree of logical plausibility of the object proposition given just precisely the datum proposition. This key does not seem to me to be made explicit in the page on Bayesian probability. Talk of knowledge here is not exactly to this point, though of course it is very relevant. Who would wish to speak of so unscientific a notion as plausibility? Only a silly old fool like Jaynes. Dear unsigned edit-proposer, I think your proposed expungement of the evil word epistemic is hardly likely to improve the understandability of the sections where it is used, though it might perhaps make you feel better. Perhaps I should leave it to the editor who used the word epistemic to defend his work. If you still want to impose your strict censorship on interdisciplinary thinking, you may go ahead and I will not undo it, at least not right now, while I am in my present frame of mind, though I reserve the option of changing my mind. Chjoaygame (talk) 17:44, 17 April 2010 (UTC)

MLE and Bayesianism

In the current article, we can read

maximum entropy principle is like other Bayesian methods in that

implying that MLE is a Bayesian method. But, to my knowledge, this claim is controversial (for instance R. Neal said: Maximum entropy is not consistent with Bayesian methods). Should we modify this sentence? For more information on this debate, a discussion is starting here, with good pointers. --Dangauthier 16:48, 16 February 2007 (UTC)

Looked at your blog quickly, didn't work through the example, but the result looks very fishy to me.
Given that the Principle of Maximum Entropy with respect to an invariant measure is essentially the same thing as Kullback's Minimum Discrimination Information (MDI) principle, you might like to look at Relative_entropy#Principle_of_minimum_discrimination_information.
That seems to me to show why MDI should replicate Bayes where Bayes is applicable.
Can you diagnose why your example is different? Jheald 17:49, 16 February 2007 (UTC)
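For readers following the MDI pointer: here is a sketch of the minimum discrimination information problem in my own notation. Maximum entropy relative to an invariant measure $q$ amounts to

```latex
\min_{p} \; D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}
\quad \text{subject to} \quad
\sum_i p_i f_k(x_i) = F_k, \;\; \sum_i p_i = 1,
\qquad \Rightarrow \qquad
p_i \propto q_i \exp\!\Big(\sum_k \lambda_k f_k(x_i)\Big).
```

With uniform $q$ this reduces to ordinary entropy maximization, which is why the two principles track each other.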
Even Bayesians disagree about what "Bayesian" means. I.e., is MAP Bayesian? Or must inference about models use model averaging? There's no consensus. There have been papers showing how MaxEnt can be massaged to look like a special case of the Bayesian approach, and vice versa; the whole disagreement is mostly about semantics. --yaroslavvb
There's another discussion of the example here with a useful contribution from John Baez.
Basically, there are two different scenarios that need to be distinguished: Does the constraint only apply to our prior probability assessment, so that once we have formed that prior probability, we can forget the constraint and just apply Bayes' theorem using that prior distribution? Or must the constraint also apply to the posterior distribution - in which case it has to be included as a nuisance variable in its own right in the model, and explicitly conditionalised on in a Bayesian inference. Jheald 23:47, 3 March 2007 (UTC)
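To make the first scenario concrete, here is a minimal sketch (mine, not from the thread) of the standard Brandeis dice example: the constraint that the mean pip count is 4.5 fixes a maximum-entropy distribution once, by bisection on the single multiplier, and thereafter Bayes' theorem would operate on that distribution like any other prior. The bracket and iteration count are arbitrary choices.

```python
import math

FACES = range(1, 7)

def tilted(lam):
    """Maximum-entropy distribution p(x) ~ exp(lam * x) over die faces."""
    w = [math.exp(lam * x) for x in FACES]
    Z = sum(w)
    return [wi / Z for wi in w]

def mean(p):
    return sum(pi * x for pi, x in zip(p, FACES))

# Bisection: mean(tilted(lam)) increases monotonically with lam.
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean(tilted(mid)) < 4.5:
        lo = mid
    else:
        hi = mid

p = tilted((lo + hi) / 2)
print([round(pi, 4) for pi in p], round(mean(p), 4))  # mean ~= 4.5
```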

There are basically three positions in the literature on the relationship of Maximum Entropy's "constraint rule" and Bayesian conditionalization: 1) the two conflict; 2) the two are complementary; and 3) MaxEnt generalizes Bayesian conditionalization. The somewhat scattered literature on this is surveyed and summarized nicely in:

Uffink, Jos (1996). "The Constraint Rule of the Maximum Entropy Principle" (PDF). Studies in History and Philosophy of Modern Physics. 27 (1): 47–79.

Recall that we're just here to summarize the literature, not to ourselves resolve the argument. --Delirium 01:44, 4 August 2007 (UTC)

I'm confused by the statements that Bayes' theorem and MaxEnt are in conflict. Could someone summarize that argument instead of linking to papers? I went through the Jaynes paper 'Monkeys, kangaroos and N'. According to the page Jaynes argues that MaxEnt and Bayes are in conflict, but I just went through this paper and couldn't find anything to that effect. The paper is mostly about how Skilling and Gull have been sloppy when they were trying to apply MaxEnt to image deconvolution. He is basically trying to amend their papers. Jfischoff (talk) 00:51, 31 July 2009 (UTC)

I just read http://bayes.wustl.edu/etj/articles/relationship.pdf and Jaynes is pretty clear that he thinks Bayes' theorem and MaxEnt are congruent. They are meant for different things, and are actually limiting forms of each other in certain situations. I am going to update the page to reflect this. Jfischoff (talk) 00:56, 31 July 2009 (UTC)

necessary and sufficient condition for a sufficient statistic

(what a confusing title...) I was intrigued by the comment about the n&s condition for the existence of a sufficient statistic, so I read Pitman-Koopman_theorem and it says: "...only in exponential families is there a sufficient statistic whose dimension remains bounded as sample size increases." I rephrased the comment slightly but I know very little about the subject, so if any of you knows a little about the subject please consider fixing it...

Amitushtush 11:30, 22 September 2007 (UTC)
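For context, a sketch of the exponential-family form the Pitman–Koopman result is about (notation mine): a family admits a sufficient statistic whose dimension stays bounded in the sample size exactly when it can be written as

```latex
p(x \mid \theta) = h(x)\, \exp\!\Big( \sum_{k=1}^{m} \eta_k(\theta)\, T_k(x) - A(\theta) \Big),
\qquad
T(x_1,\dots,x_n) = \Big( \sum_{i=1}^{n} T_1(x_i), \;\dots,\; \sum_{i=1}^{n} T_m(x_i) \Big),
```

so the statistic $T$ keeps dimension $m$ however large $n$ grows.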

Paradoxes

Jaynes had a lifelong controversy with authors who claimed that his maxent principle would allow paradoxes, most famously the Dawid–Stone–Zidek paradox. After his death, the paradox was acknowledged by Kevin Van Horn. I do not see any mention of paradoxes at all. -- Zz (talk) 13:30, 18 January 2008 (UTC)

Well, this is only a year and a half late, but what the heck. The DSZ paradox is an argument against improper priors, not the maxent principle. Cyan (talk) 01:48, 8 August 2009 (UTC)
The paradox is also not a paradox, as Jaynes argues in his 2003 book "Probability Theory": DSZ simply made a mistake in the derivation; probably due to careless notation they assume that two cases that are different are the same; if this mistake is resolved, then it appears that in most of their examples the paradox goes away for one improper prior which, if the parameter in question is a scaling parameter, turns out to be Jeffreys' prior. 147.86.212.162 (talk) 06:14, 28 August 2009 (UTC)
I think most people now seem to agree that Jaynes was wrong on the marginalization paradox, such as here. There is also a reply from the original authors. It is somewhat relevant to the article, as many maxent priors are improper, but perhaps it deserves an article of its own. —3mta3 (talk) 21:39, 27 September 2009 (UTC)
I just ran across this three-year-old comment, and I have to correct it, as it misrepresents what I wrote (I'm Kevin Van Horn). Yes, Jaynes made a mistake in attributing the "paradox" to two distinct states of information. However, the Marginalization Paradox is not a paradox -- it is a failure to strictly follow the rules of probability theory. The problem is that DSZ go directly from "p(zeta | y, z) does not depend on y" to "p(zeta | z) = p(zeta | y, z)". This step requires integrating out y: p(zeta | z) = (INTEGRAL y: p(y | z) * p(zeta | y,z)). The problem is that p(y | z) is also improper, so you're taking a divergent integral, making this step invalid. Ksvanhorn (talk) 01:08, 20 March 2011 (UTC)
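In display form, the step Ksvanhorn describes is (my transcription of the ASCII above):

```latex
p(\zeta \mid z) = \int p(y \mid z)\, p(\zeta \mid y, z)\, dy .
```

Concluding $p(\zeta \mid z) = p(\zeta \mid y, z)$ from the fact that $p(\zeta \mid y, z)$ does not depend on $y$ requires $\int p(y \mid z)\, dy = 1$; when $p(y \mid z)$ is improper the integral diverges and the step fails.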

least biased?

The term "least biased" in the 2nd sentence seems likely to be incorrect, at least if "bias" has its usual statistical meaning Least incorrect? Least additional information? Melcombe (talk) 10:09, 30 April 2008 (UTC) Reading again, perhaps it would be better to just have "It finds the distribution that..."Melcombe (talk) 10:14, 30 April 2008 (UTC)Reply

Relativity

That relativity bit in the lead seems pretty creative to me, and I'm not sure I buy the metaphor. Is this attributable to anyone outside of WP editor(s)? Cretog8 (talk) 03:11, 12 July 2008 (UTC)

OK, I'll cut it. CRETOG8(t/c) 07:42, 4 September 2008 (UTC)

Thermodynamics

I can't help wondering why there is no mention of maximizing the thermodynamic entropy when a search on "maximum entropy" jumps directly to this article. Even if the statistical entropy and the thermodynamic entropy have finally merged seamlessly, I would include a comment relating to this age-long dispute. After all, the maximization of the thermodynamic entropy is fundamentally the same thing as the minimization of any other fundamental (energy) function. The derivation, in either direction, is a simple Legendre transform of the corresponding fundamental equation and should not need a reference. ChemGamer (talk) 12:36, 21 July 2008 (UTC)
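As an aside, a sketch of the Legendre-transform duality ChemGamer alludes to, in my own notation: maximizing entropy at fixed mean energy and minimizing (free) energy at fixed temperature are the two faces of

```latex
S(U) = \max_{\{p_i\}} \Big\{ -k_B \sum_i p_i \ln p_i \;:\; \sum_i p_i E_i = U,\;\; \sum_i p_i = 1 \Big\},
\qquad
F(T) = \min_{U} \big[\, U - T\, S(U) \,\big],
```

with the maximizer being the canonical distribution $p_i \propto e^{-E_i / k_B T}$.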

Check the old intro [1], which an editor recently changed.
Do you think the old version was more appropriate? Jheald (talk) 14:51, 21 July 2008 (UTC)
The old version is nice in that it contains the word thermodynamics and references Gibbs. It is, however, still not referring to thermodynamic entropy as a macroscopic function. I read the related article Maximum entropy thermodynamics, which is, unfortunately, also about the statistical variant. I would not change the introduction or necessarily any other major part of this article. Perhaps adding a reference to an article about optimization of fundamental equations would be appropriate? My suggestion is this page on a nearly perfect topic: Principle of minimum energy. ChemGamer (talk) 10:53, 22 July 2008 (UTC)
Heh. Be aware, there was a big dispute earlier this year on WP; apparently, many chemists utterly dislike the way that mathematicians and info-theorists have hijacked entropy. So now we have Entropy (general concept), Entropy (information theory), Entropy (statistical thermodynamics), and many others, and the cultural divide seems to be too great to leap. Similarly, on the statistics side, there is a large body of work that sticks to "their" definition, and rather dislikes the intrusion of physics into conversations. linas (talk) 22:16, 3 September 2008 (UTC)

Conserved Quantities

The first sentence in the Overview section of the article reads:

In most practical cases, the testable information is given by a set of conserved quantities (average values of some moment functions)...

I do not understand why conserved quantities serve as testable information. Consider a distribution over energy: if I say energy is conserved, then the testable information becomes the expectation of energy. I cannot see any relation between these two. Why can't the testable information related to energy conservation be the standard deviation of energy or something else? The word "conserved" here is also confusing. The probability distribution implies that the energy is fluctuating, so how can a fluctuating quantity be considered conserved? Everett (talk) 13:12, 26 August 2009 (UTC)
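One reading of the quoted sentence, offered tentatively: "conserved" refers to the ensemble average being fixed, not to the quantity itself being non-fluctuating, so the testable information is the set of expectation constraints

```latex
\sum_i p_i \, f_k(x_i) = F_k,
\qquad \text{e.g.} \quad \sum_i p_i E_i = U \;\; \text{while the individual } E_i \text{ fluctuate.}
```

On this reading a constraint on the standard deviation would be testable information too (it is the expectation of $(E - U)^2$); moment constraints are just the most common case.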

intention to merge, into this page, the page Law of maximum entropy production, not a good plan

With all respect to the administration, I would like to suggest that it is not a good plan to try to merge the page Law of maximum entropy production into the page Principle of maximum entropy.

The latter page announces at its top: "This article is about the probability theoretic principle. For other uses, see Maximum entropy." The old page Law of maximum entropy production has nothing about the probability theoretic principle of maximum entropy. Though logically and in principle possible, it would take a very tall order of intellectual power to write anything about how the probability theoretic principle applies to the study of the thermodynamic principle of maximum entropy production. I have much doubt whether we have available to us such intellectual firepower in this area. As far as I can see, the likely outcome of an attempt by an ordinary Wikipedia editor to carry out the intended move is that it would just introduce a mess into the present reasonably satisfactory page on the Principle of maximum entropy, unless that editor was aware that nothing on the page on the Law of maximum entropy production was relevant to the present page on Principle of maximum entropy, and that the present page on the Law of maximum entropy production is seriously faulty in content, in which case the editor would move nothing; perhaps that is the intention of the administration.

The problem with the old Law of maximum entropy production page was that it was faulty in content, not that its content was located in an unsuitable place. I am not clued up on how to characterize faulty in content in Wikipedia style, but a stab at it is that the article as it stood was an advertisement, verging on being a hoax; that it was written solely to push a very shaky point of view that is far from the mainstream of study of the thermodynamics of entropy production, while it purported to be the key to that mainstream; and that it was partly original research that seriously misrepresented secondary sources to make it seem as if they provided reliable support for Swensonism. The old page was written to advertise and push Swensonism, which is pseudo-science; it had no other purpose. To push this stuff into a reasonably carefully worked page on the probability theoretic principle of maximum entropy seems to me to be a bad plan, indeed to play into the hands of an advertiser. Chjoaygame (talk) 06:37, 18 October 2009 (UTC)

I agree with this post. Feel free to list the page on Wikipedia:Deletion review and propose a redirect to Extremal principles in non-equilibrium thermodynamics instead. --Classicalecon (talk) 14:54, 18 October 2009 (UTC)
Agree as well. This page is clearly about the concept of entropy in probability, and although it has many critics, it has been the focus of a lot of work over the years. The other page appears to be a pile of incomprehensible gibberish, and I can't see how it could be relevant to this article. —3mta3 (talk) 15:32, 18 October 2009 (UTC)
Ignoring the content for a minute, other than the term entropy and some common names like Jaynes and Shannon, I'm not sure the concepts are similar enough to merge. Certainly there is literature on the "static" (macroscopic property averages don't change) case that ignores the dynamical approach to equilibrium. Production refers to the time derivative and what path a system should take to equilibrate. Certainly out-of-equilibrium in the general case opens up definitional issues. I was content with a page something like "entropy production principles" but that was expanded to general "extreme value" things. I just wanted to save the LMEP talk page in my user space and thought a passing mention of Swenson would be unavoidable in some article and it would be less work to keep the links and Swenson bibliography we now have. Nerdseeksblonde (talk) 19:25, 18 October 2009 (UTC)
Ditto per the above. There's actually technically no reason to take it to Wikipedia:Deletion review. Technically, all the AfD decided was not to delete outright the page and its history. Technically, the merge is no more than a suggestion. What actually happens next is up to the editors of the pages, so long as it broadly takes into account the issues raised at the AfD; so long as that reflects an updated consensus of editors.
Compare this earlier discussion from User:Aervanath about a proposed merger of the article Judeo-Christian which was subsequently re-thought: User_talk:Aervanath/Archive_3#Judeo-Christian
A redirect to Extremal principles in non-equilibrium thermodynamics, with as much (or as little) of the content currently in the page merged there as deemed useful would seem to me an entirely satisfactory outcome, per the views expressed at the AfD, and the decision taken. Jheald (talk) 19:54, 18 October 2009 (UTC)

sorry. deleted section posted in error

Are you sure you posted this to the right talk page?
This article doesn't even mention Swenson, whoever he is. It's mostly about E.T. Jaynes and his followers; and is recognised as important and mainstream in probability theory and in philosophical discussions of the foundations of statistical thermodynamics. Jheald (talk) 00:43, 8 January 2010 (UTC)

Thank you for your kind advice. Yes, I am very sorry, I posted this to the wrong page. I meant to post it to the page on the law of maximum entropy production. I have deleted it from this page to prevent muddles. Yes, I am a keen fan of E.T. Jaynes. Chjoaygame (talk) 10:47, 8 January 2010 (UTC)

unresolved reference

The article mentions a reference: "Refer to Cover and Thomas for excellent explanation of the ideas". But I did not find anything about that in the references. --84.179.23.247 (talk) 10:30, 30 August 2010 (UTC)

Because it was never there: http://en.wikipedia.org/w/index.php?title=Principle_of_maximum_entropy&diff=next&oldid=80931656 --Gwern (contribs) 19:11 24 September 2010 (GMT)

Good edit!

Good edit! Chjoaygame (talk) 19:38, 27 January 2013 (UTC)

Maximum entropy estimation

Maximum likelihood estimation has its own page. I'm not sure whether MEE should be covered here (this is already a large page), or on its own page. MEE is not the most common term in academic paper titles, but it's not a ghost town, either. — MaxEnt 17:47, 7 May 2017 (UTC)

I should add that the problem for me is that many papers in machine learning embed the phrase "maximum entropy" in the title, and it implicitly references either a form of maximum entropy estimation, or a class of objective functions which take you to much the same place. I'm just going to lump all these papers under MEE in my own notes. I'm certainly not using Principle of maximum entropy as the topknot for this collection of papers. If the current carving of the ME cake doesn't work for me locally, does it serve the readers here any better? Definitely scratching my own itch, but it raises a broader question. — MaxEnt 17:57, 7 May 2017 (UTC)
External links modified

Hello fellow Wikipedians,

I have just modified one external link on Principle of maximum entropy. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 01:41, 30 December 2017 (UTC)

make page: Bayesian entropy as the mechanism of time

Why do we have an arrow of time? Wave functions in nature act as noisy grains (not single movie frames) and update their potentiality, but in a quantized manner. If we hit these regions with particles, or if they reach a maximum level of informational entropic potentiality, they're forced to collapse their potentiality into a new state that continues to explore, in a Bayesian way, the most probable interactions. Even huge systems can have the same informational entropy if their spins are aligned so that they cause an informational simplification. The Big Bang was the time when all of the then universe was simple enough to be described by a single quantum of time - a single entropic package. You might claim that all this is bullshit. I don't disagree with you, but we have to mention all theories. Erroneous theories have things to offer. Some notions of erroneous theories are absolutely correct if separated. The internet isn't North Korean. We must make new pages and reveal what people think. If something is silly, comment that few people support it and no lab data support it, but create and reveal even ideas you don't like.

Dark matter occurs in regions where we have few atoms and molecules, which don't reach, or reach at a slower pace, the conclusion of wave-functional spread-out of maximum entropy (a natural process of calculation). Potential particle paths (potential paths of particles) have weight and mass compared to the void. If the pace of reaching the maximum wave-functional entropy is (paused or) slower, then dark matter becomes heavier.

Not only humans calculate. The Universe wave-functionally calculates itself (something like a primitive tactile calculation; it seems silly to some, but it's accurate). That's a fact.

Lack of what must be maximized in the discrete case

I think in the "Discrete case" paragraph, it should first be taken into consideration the MEP, exhibiting the ME expression, ie what it has to be maximized, just as it has been done in the continuous case. Filipsnew (talk) 19:49, 19 December 2018 (UTC)Reply

interesting topic but missing some content

If we talk purely about decimals, what numbers contain maximum entropy in the distribution of their digits (i.e. what would be the worst number to compress by an ideal compression formula)? Of the possible numbers, how do the physical constants and mathematical constants compare to it? I've not studied this area, but it's easy to imagine that it has some impact on various current theories about our universe, if indeed our universe is purely information based. Also closely related is how much information a black hole surface can contain, or any theoretical surface size in our universe. — Preceding unsigned comment added by 2001:1C04:3806:7E00:2500:37FD:FDF7:46BB (talk) 13:29, 10 August 2022 (UTC)
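As a small illustration of the question (my sketch; the digits of pi here are just a short illustrative sample): the per-digit Shannon entropy of a decimal string is maximal, log2(10) ≈ 3.32 bits, exactly when all ten digits occur equally often, which is what makes such a string incompressible by an ideal digit-level coder.

```python
from collections import Counter
import math

def digit_entropy(s):
    """Empirical Shannon entropy (bits per digit) of a digit string."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

pi_digits = "14159265358979323846264338327950288419716939937510"
print(round(digit_entropy(pi_digits), 3), round(math.log2(10), 3))
```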

Entropy maximization merge

Support the 2019 proposal to merge Entropy maximization to here, specifically in the Discrete section. The contents are duplicated, so the only thing really to move is the 'reference'. Klbrain (talk) 09:08, 3 August 2020 (UTC)

Intuition and examples section

I find the section (Principle of maximum entropy#Intuition and examples) is far too tutorial in tone, containing such language as "for example", "it is intuitive that", "which we did not have in the dice example", "should map to", "we can define", "if we knew anything of the geometry", etc. I also note that no source has been provided for this. It's been the subject of recent revert-revert-revert. Any interested eyes are welcome. signed, Willondon (talk) 19:53, 8 December 2023 (UTC)

I thank the anonymous editor for trying to improve Wikipedia. I agree with them that the "Overview" starting with conserved quantities is a bit abstract and difficult.
I only partially agree with Willondon's comments. "It is intuitive that" is not encyclopedic tone, but "for example" is. Regardless of that, a fruitful approach is to get material into Wikipedia (with citations, unless WP:CALC applies) and then to clean up the wording and tone.
Unfortunately, I do not think that the intuitive overview offered by the anonymous editor is better than what happens in the section "Testable information". The intuitive overview makes a bunch of statements that are indistinguishable from a tutorial on elementary probability theory.
Perhaps a useful approach would be: Present a non-trivial, concrete example that explicitly acknowledges how it uses all of the key ingredients. Then do the general definition and theory.
(I say all of this as someone who is not an expert in probability.) Mgnbar (talk) 23:07, 9 December 2023 (UTC)
I concur that the text in question is not suitable as written, although having something in that spot to provide a more gentle introduction than jumping into "average values of some moment functions" and "statistical thermodynamics" would be nice. So, good idea, not yet encyclopedic in the execution.
Simplicity and elegance are in the eye of the beholder, and intuitiveness is a subjective thing. When writing encyclopedia material, it's generally best to avoid appeals to them, or at least make clear whose intuition we are reporting. ("E. T. Jaynes argued that this formalizes the intuition that...") XOR'easter (talk) 18:46, 10 December 2023 (UTC)

Conditional Limit Theorem

I am not 100% sure, but it seems to me that the Conditional Limit Theorem is a formalization of the Wallis derivation, at least in the discrete case. Uhbif19 (talk) 14:50, 22 August 2024 (UTC)
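For reference, a hedged statement of the conditional limit theorem in the form given by Cover and Thomas (notation mine): conditioned on the empirical distribution of an i.i.d. sample from $Q$ lying in a closed convex constraint set $E$, each coordinate behaves asymptotically like a draw from the information projection, which does look like a limit-theorem counterpart of the combinatorial argument in the Wallis derivation.

```latex
P\big(X_1 = a \,\big|\, \hat{P}_{X^n} \in E\big) \;\longrightarrow\; P^*(a),
\qquad
P^* = \arg\min_{P \in E} D(P \,\|\, Q).
```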