Talk:List of cognitive biases/Archive 1
This is an archive of past discussions about List of cognitive biases. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2
Name/Organization of article
Can we call this something more specific than "list of biases"? There has to be a better, less confusing name for this category. - Texture 18:19, 7 Mar 2004 (UTC)
- Perhaps List of psychological biases? Isomorphic 18:23, 7 Mar 2004 (UTC)
- Much more descriptive. Any objection to moving this article to the new name? - Texture 18:28, 7 Mar 2004 (UTC)
- If you wanna move it, fine with me. My only thought is that all statistical biases could be considered ultimately cognitive ones. 67.86.96.95 18:31, 7 Mar 2004 (UTC)
- Would List of cognitive biases be better than List of psychological biases? - Texture 18:33, 7 Mar 2004 (UTC)
- Better ring to it, covers what I had in mind, yes. 67.86.96.95 18:38, 7 Mar 2004 (UTC)
- I'll move this article to List of cognitive biases - Texture 18:59, 7 Mar 2004 (UTC)
Maybe the list should be hierarchical. Categories include attributional biases, statistical biases, self-serving biases, group-serving biases, and reporting biases (though there might be some overlap between some of them). 67.86.96.95 08:45, 7 Mar 2004 (UTC)
The list is not accessible
I think the list is not accessible. Each bias should mention the following:
- whether this is an informal term only used by lay people or a term used by psychologists
- whether the bias has been empirically verified
- whether the bias is undisputed by scientists.
- what other biases are closely related, overlap or are a part of it.
I can't do it myself because I don't know enough about the subject. Thanks for helping. Andries 15:56, 16 May 2004 (UTC)
- I agree with Andries. On another note, this is a really great list subject!! If only we could get Bill O'Reilly and Michael Moore to read it... --64.121.197.36 18:49, 21 Aug 2004 (UTC)
- I agree. I'll probably be working on this over the next few days, any help is appreciated. J.S. Nelson 07:57, 25 Apr 2005 (UTC)
- I've done my best to improve this list (organizing it into obvious categories and adding short descriptions). Please continue to update and improve this list as y'all see fit. Headlouse 01:33, 16 November 2005 (UTC)
Victim fallacy
People tend to assume that their problems are unique to them or a group to which they belong, when these problems are very often widespread and affect many people.
Is there a name for this bias or fallacy? I cannot find anything in the list which mentions it. I would tend to call it the "victim fallacy".
LordK 19:13, 20 Nov 2004 (UTC)
Coercion bias?
This is just an unconfirmed rumor I read somewhere. If a person holds an opinion, people tend to agree with them more in person than when not in person. So this can lead to bias in that individual. But I am not even sure if this belongs here. Samohyl Jan 23:35, 21 Dec 2004 (UTC)
Why is there a separate "Other cognitive biases:" list?
Why is there a separate "Other cognitive biases:" list? There is no indication to the reader of any reason that these are separated out. I'd suggest they either be integrated in, or an explanation given to the reader why they are separate.
- I am not even convinced that things like optical illusions are cognitive biases in the same sense as the other things on the list. It might be better to remove them entirely, or put them under 'see also.' Tom harrison 00:21, 21 October 2005 (UTC)
Tunnel Vision
Just wondering if this is correctly linked? AndyJones 18:19, 9 November 2005 (UTC)
Support for merging this page with Cognitive Bias
Like previous users, I see no reason for not merging this with the cognitive bias page. I would need help to do this; any offers, please? --Rodders147 11:26, 19 March 2006 (UTC)
- I don't know about the merging. It is a pretty long list, after all. --maru (talk) contribs 18:14, 19 March 2006 (UTC)
- I agree with Marudubshinki. This list is too long to include on the cognitive bias page. Plus there are several "List of _________" pages on Wikipedia, so this is not abnormal. Headlouse 22:59, 30 March 2006 (UTC)
Lake Wobegon effect
Where did this list come from? A lot of these, such as Lake Wobegon effect, must have some other name in psychology. It seems like some serious merging would help make these topics more informative. --65.25.217.79 10:12, 21 April 2006 (UTC)
Lake Wobegon effect, egocentric bias and actor-observer bias are closely related. Distinguishing (or merging) them requires expert advice. Peace01234 (talk) 03:52, 25 November 2007 (UTC)
confusing causation with correlation
It seems as though the common phenomenon of assuming causation when all that exists is correlation or association should be on this list. Maybe it is and I could not identify it. Not my area of expertise; I hope someone will address this. BTW: to those who maintain this page, it's a wonderful resource!
- Correlation implies causation is mentioned in the Logical fallacy article. -- Sundar \talk \contribs 16:22, 26 June 2006 (UTC)
Contradiction?
"Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view." It seems to me that this line is self-contradictory. Not that it should be removed, but if a psychologist discovered this bias how would this psychologist know whether or not he/she was subject to this bias (the bias of a psychologist) during its discovery? Anyway, I just thought that was interesting. --Merond e 14:04, 18 March 2007 (UTC)
Reductive Bias
"A common thread running through the deficiencies in learning is oversimplification. We call this tendency the reductive bias, and we have observed its occurrence in many forms. Examples include the additivity bias, in which parts of complex entities that have been studied in isolation are assumed to retain their characteristics when the parts are reintegrated into the whole from which they were drawn; the discreteness bias, in which continuously dimensioned attributes (like length) are bifurcated to their poles and continuous processes are instead segmented into discrete steps; and the compartmentalization bias, in which conceptual elements that are in reality highly interdependent are instead treated in isolation, missing important aspects of their interaction "
Cognitive Flexibility, Constructivism, and Hypertext, R.Spiro et al. http://phoenix.sce.fct.unl.pt/simposio/Rand_Spiro.htm —The preceding unsigned comment was added by Difficult to pick a user name (talk • contribs) 18:59, 29 March 2007 (UTC).
Loss aversion, endowment effect, and status-quo bias
"Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias" Daniel Kahneman; Jack L. Knetsch; Richard H. Thaler The Journal of Economic Perspectives, Vol. 5, No. 1. (Winter, 1991), pp. 193-206.
The first two paragraphs of this article:
- A wine-loving economist we know purchased some nice Bordeaux wines years ago at low prices. The wines have greatly appreciated in value, so that a bottle that cost only $10 when purchased would now fetch $200 at auction. This economist now drinks some of this wine occasionally, but would neither be willing to sell the wine at the auction price nor buy an additional bottle at that price.
- Thaler (1980) called this pattern - the fact that people often demand much more to give up an object than they would be willing to pay to acquire it - the endowment effect. The example also illustrates what Samuelson and Zeckhauser (1988) call a status quo bias, a preference for the current state that biases the economist against both buying and selling his wine. These anomalies are a manifestation of an asymmetry of value that Kahneman and Tversky (1984) call loss aversion - the disutility of giving up an object is greater than the utility associated with acquiring it.
I think it's very clear that my edit (suggesting that the three are related) is consistent with the conventional usage in economics. Anthon.Eff 22:24, 11 April 2007 (UTC)
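A minimal numeric sketch of the asymmetry the quoted paragraphs describe, using the Kahneman-Tversky value function. The exponent 0.88 and loss-aversion coefficient 2.25 are the commonly cited 1992 estimates, and the dollar figure simply follows the wine anecdote; this is an illustration under those assumptions, not material from the cited paper:
<syntaxhighlight lang="python">
# Sketch of loss aversion with the Tversky-Kahneman value function:
# v(x) = x**alpha for gains, -lam * (-x)**alpha for losses.
ALPHA = 0.88  # diminishing sensitivity (assumed, standard estimate)
LAM = 2.25    # loss-aversion coefficient: losses loom roughly 2.25x larger than gains

def value(x):
    """Subjective value of a gain (x > 0) or a loss (x < 0) relative to the status quo."""
    return x ** ALPHA if x >= 0 else -LAM * (-x) ** ALPHA

price = 200  # current auction price of one bottle, as in the anecdote

# Selling: lose the bottle (valued at the price) and gain the cash.
net_sell = value(-price) + value(price)
# Buying: lose the cash and gain the bottle.
net_buy = value(price) + value(-price)

print(f"net value of selling at ${price}: {net_sell:.1f}")  # negative, so the owner keeps the wine
print(f"net value of buying at ${price}:  {net_buy:.1f}")   # also negative, so he buys no more
# Because the loss side is weighted by LAM, both transactions look like net losses from the
# current reference point, matching the "neither sell nor buy" behaviour in the anecdote.
</syntaxhighlight>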
- Look, armies of academics have worked on those "asymmetric" effects. Your citations do not support your point, as they show clearly that the three phenomena are different manifestations of something more general, but more vague, an "asymmetry". How can, for example, an investor understand his/her biases if he/she sees a confusing explanation? Please, treat each bias for itself, show the analogies if you like, but also the differences. Show why they are not called by the same name; this is the way to make things clear. In asset management the differences are as follows:
- Loss aversion is clearly the reluctance to sell when the price is lower.
- Endowment effect is clearly the idea that what somebody owns has more value than the market offers, even if its market price has already been multiplied by two, ten or a hundred times.
- Status quo bias does not concern only selling but also buying. It is the reluctance to change one's price estimate (whether upwards or downwards), as well as to make arbitrages between assets.
- The purpose of a list is to list things, not to merge them. I plan to eliminate again your misleading add-on, to avoid confusion in the minds of readers (yes, framing, which is the way things are presented and perceived, is another cognitive bias). But I'm sure it will not be needed, as I trust you will dig deeper into those phenomena and will refine your wording (the word "related" is seriously misleading as it hides the - crucial - differences) to avoid those confusions. --Pgreenfinch 07:10, 12 April 2007 (UTC)
- I think you need to give a citation for your definitions. I provided a citation, written by three well-known economists, including Thaler (who invented the term endowment effect), and Kahneman (who with Tversky invented the term loss aversion). It would help, when you provide your citation, to also do as I did: actually extract a few paragraphs providing the definitions that support your point. Until then, I think you should avoid reverting, since you have provided no evidence that you are correct. Anthon.Eff 11:54, 12 April 2007 (UTC)
- If I understand correctly, your only knowledge of the subject is a few old citations and you do not really want to explore the topic by yourself. Sorry, but you took the responsibility to make an add-on (which btw does not clearly match those citations) and to try to play the professor on something you do not really grasp. So you have to take that responsibility until the end and explain what your word "related" really means. Maybe you can look at Martin Sewell's treasure chest on the topic, a site I usually recommend; there you will learn a lot. You know, I respect the fact that you are an expert in philosophy, but those topics are very practical and precise ones and should be approached practically and precisely, avoiding reductionism. A philosophical concept, this word, seems to me, although here I'm not an expert and will not meddle in the article ;-). --Pgreenfinch 13:10, 12 April 2007 (UTC)
- Actually, I'm an academic economist, though behavioral economics is not my primary area of expertise. But to be honest, I think you are the one trying to "play the professor", as you put it, by presenting your own definitions as if these are somehow authoritative. That's a nice approach when writing one's own papers or playing on a blog, but that's not what is expected on Wikipedia (see WP:Attribution or WP:No original research). --Anthon.Eff 13:49, 12 April 2007 (UTC)
- I didn't give my page as a reference for those concepts (although, by the way, various academics and various professional institutions link to that page and "play" with that forum). I just summed up here the usual definitions - which I didn't invent - of those effects. You are free to find better ones in the literature, which is why, to help you clarify (what does "related" mean?) your (original) add-on, I suggested you get more expertise via a recognised academic portal on those topics. If you do not give clear definitions of those phenomena (I don't specially ask you to follow those I gave, if you find more explicit ones) that show how close / how far their relations are, how can you write that they are related, a very vague word? --Pgreenfinch 17:43, 12 April 2007 (UTC)
- I guess this is getting pretty unproductive. I did not use the word "related" in the article. I used the words "see also." And I have already provided a citation--from two of the men who coined the terms "loss aversion" and "endowment effect." Since we're having such trouble over this issue, I decided to simply quote these two men in the article for the definition of these terms. I have, of course, provided the source for these quotes. I trust that letting the authorities speak for themselves now resolves any differences we may have. --Anthon.Eff 18:47, 12 April 2007 (UTC)
- I see your point: an author talked about different things which are about 10% related, and this gives an authority argument to put "see also" between them. Status quo bias has very little to do with loss aversion, which is why your citations are not really explicit about that. I find that a poor way to use a Nobel prize author. Now that you have opened Pandora's box, we could add (we just need to find an author that talked about those different things) "see also" between scores of cognitive, emotional, individual, collective biases that have a few common points between them, and make a complete mess of the article. Just an example: status quo bias could be linked, by stretching the rubber band, to cognitive dissonance, cognitive overload, plain laziness, rationalization, overconfidence and many other things, nearly the whole list of biases. I have nothing against the "see also" tool, but on condition that it explains the similarities as well as the differences. With a "see also" signpost and no clear explanations, it is up to the reader to scratch their head wondering "Boy, what is this supposed to tell me? Do those concepts complement or oppose each other, and why? Do those roads converge or diverge, and where?". I think we now have a serious dent in an article which had the potential to be an example of encyclopedic quality. --Pgreenfinch 07:09, 13 April 2007 (UTC)
- The article contains 10 cases of "see also" in parentheses--all but one were there before I made my edit. Not one of the nine preexisting cases has the level of editorial detail you think is mandatory. By the way, note how this "serious dent in the article" only became salient after a dispute with another editor. What kind of bias do you think we are witnessing? My guess would be that this most closely matches Confirmation bias, in that we have seen one thing after another brought up in order to validate the initial negative reaction to the other editor's edit (first wrong definitions, then a few ad hominems, then the use of the word "related", then use of the words "see also"). But it could also be seen as an example of the Focusing effect, since you appear to think one minor edit has the potential to wreck the entire article. But then again, your commitment to the current state of the article may be an example of Status quo bias. Of course, as Pronin, Ross and Gilovich (2004: 781) tell us: "people readily detect or infer a wide variety of biases in others while denying such biases in themselves." An unnamed bias of bias attribution (someone should name this and put it on the list), which has almost certainly caused me to overlook my own biases. Any lurkers out there? What kinds of biases do you think sustain such a long and unproductive discussion over a minor edit? --Anthon.Eff 13:38, 13 April 2007 (UTC)
- Pronin, Emily, Lee Ross, and Thomas Gilovich. (2004). "Objectivity in the Eye of the Beholder: Divergent Perceptions of Bias in Self Versus Others." Psychological Review. 111(3): 781–799.
- Anthon, you are becoming a real shrink; see all those biases you discovered in me, I'm impressed :-). Btw, the other "see also"s do not refer to a specific bias in the list. More generally, "see also" links are understandable in an article on a precise topic, as the relations can be seen clearly from the text. But in a list there is no real room to elaborate: either it becomes a mess by trying to explain every relation, however strong or weak, or, if those explanations are skipped, the "reductionist" risk is high. An alternative is to make smaller groupings, but here overlaps would be frequent and it would be subjective to find clear category criteria in a field that deals with human behavior. Thus we would face the same risks. Now I expect, like you, that lurkers will describe fully our respective biases. Certainly, from what I just said, they will find me overly "risk averse" ;-). --Pgreenfinch 15:14, 13 April 2007 (UTC)
Great List - Inaccurate quality/importance tag?
This article is one of the best pages on the whole of the Internet. It strikes me as bordering on presumptuous for someone to come along and lightly tag it as low-quality and unimportant. --New Thought 16:36, 12 January 2007 (UTC)
- I agree. This is a fantastic list. For my research, it was a big help to me. What would make it even more powerful is if the article citations for each bias (if available) were noted in the footnotes on this page, rather than just on the individual pages of the biases. Zfranco 00:25, 8 February 2007 (UTC)
- Don't repeat yourself. --Gwern (contribs) 03:15 8 February 2007 (GMT)
- This list is just great. It is not perfect as it is, but nowhere else in my life have I seen a list of cognitive biases which affect people's decisions so much. Unfortunately most people don't think too much about them. The world would certainly be a better place if everyone was aware of cognitive biases and thought more about the rationality in their decisions. I don't know how to express in Wikipedia that an article is important to me. I guess I am going to write it on my user page. Congratulations to all editors who made this list possible. A.Z. 00:16, 12 March 2007 (UTC)
- Indeed, this is a highly useful list and is more comprehensive than anything I have been able to find on the internet on the subject. Moreover, it is a great use of the Wiki approach of synthesizing information from so many different sources. As the article currently stands, the writing is sound, the topic important, and I suggest the tags be removed or at least be revised to reflect consensus. In that respect, I have moved the various comments on the quality of the page to this new section, for convenience of others in expressing their views. McIlwrath 20:50, 12 April 2007 (UTC)
Observer-expectancy effect
Do you have a source saying that the Observer-expectancy effect is a cognitive bias? I have added a {{citation needed}} in the Observer-expectancy effect article. Thank you. Akkeron 10:44, 15 April 2007 (UTC)
Déformation professionnelle
My observation is that this bias is detected by folks outside the discipline/profession, not within it. 204.87.68.252 20:50, 24 April 2007 (UTC)
Dunning-Kruger effect
I found this, which looks to me like a cognitive bias, but I'm not sure which category to put it in.
Dunning-Kruger effect Saraid 02:10, 22 July 2007 (UTC)
- This was just removed, in this edit, but I don't understand why. "original research" how? CRETOG8(t/c) 23:28, 17 September 2009 (UTC)
- The term "Dunning-Kruger effect" is original research by Wikipedia: it doesn't come from published sources. Hence putting it on a list alongside things like confirmation bias and overconfidence, which are well-established in literature, is misleading. The article itself needs to be renamed or merged. In the meantime, there's no reason to have the term on this list. The Dunning-Kruger experiments are mentioned in Illusory superiority. The academic literature doesn't treat their research as a separate effect, but mentions it in connection with illusory superiory/better-than-average effect/self-enhancement bias/whatever you want to call it. MartinPoulter (talk) 08:36, 18 September 2009 (UTC)
- The term shows up in several places in a standard Google search and a couple (not clearly published) in a Google Scholar search. So it's a term in use, if not in heavy use, and thus not OR. I don't know how to rule out circularity here - quite possibly some of the search results use the term because they learned it from WP - but anyway... I'm also not sure it deserves its own article (I lean slightly toward yes), but that's a different question.
- Aside from this disagreement, I appreciate the cleanup you did on this list. CRETOG8(t/c) 13:57, 18 September 2009 (UTC)
- Actually, following a few of those search links, it really does look like they're getting the term from WP. It's awkward, but I think you're right. A priority then should be getting rid of the article itself and merging it into illusory superiority. CRETOG8(t/c) 14:05, 18 September 2009 (UTC)
- Glad you understand, and thanks for the compliment. I don't think it justifies having its own article yet, but I'm very open to debate on that issue. MartinPoulter (talk) 11:34, 19 September 2009 (UTC)
Perceptual vs. Conceptual
Perhaps the page should say "distortion in the way humans conceive reality" instead of "distortion in the way humans perceive reality" since the perception is the same, it's the concept that's distorted.
More general type of Post-purchase rationalization bias?
[moved from Talk:Post-purchase_rationalization ] Hi, I've been searching List_of_cognitive_biases for a type of bias, and I think Post-purchase rationalization is closest to it, but the bias I have in mind is more general than purchasing: a decision has been made, then vacuous justifications are made up afterwards to support the decision, and justifications for alternative outcomes are deemphasized. Is this a separate type of bias that has its own name? 86.14.228.16 12:48, 24 July 2007 (UTC)
Illusion of control
The 'Illusion of Control' entry in the list contains a negative bias. I propose 'clearly cannot' be changed to 'scientifically have no control over'. - Shax —Preceding unsigned comment added by 82.17.69.223 (talk) 01:13, 2 September 2007 (UTC)
Argument from incredulity
Argument_from_ignorance (or incredulity) is listed as a "logical fallacy" but seems like it belongs in this list. Whether one makes the logical argument explicitly or not doesn't really matter; it still is a cognitive bias.
Is this list (and the article on Cognitive Bias) confusing Cognitive Bias with simple Irrational Thinking?
On reading the article and the discussions here, I wonder where the "authority" comes from to classify this list as comprising genuine cognitive biases. At first I wondered, as others have, about grouping and classifying the biases, but the more I read, the more the term Cognitive Bias seemed to be becoming used as a catch-all for any illogical thought process. There seems to be a sort of populist psychology creeping in here. I am not sure that this is correct, but IMHO a cognitive bias is the sort of phenomenon noted in Prospect theory. Whilst it is true that, using mathematical formulae, the human response is inconsistent and illogical, at the same time it is also clear, from a human perspective, why the bias exists and indeed its utility in the survival process. On the other hand, simple illogical thinking can be termed "bad" thinking, e.g.: "I can jump that gap because he could and I am as good a person as he is." There is a confusion in the example between being good (at jumping) and being good (as a person). Prospect theory, on the other hand, has more in common with valuing a "bird in the hand more than two in the bush". Simple illogical thinking is clearly not useful for survival and can even at times be described as a symptom of mental illness. However, cognitive bias can IMHO be seen as a "real" response to a probabilistic reality. If a coin is tossed and it has come up heads four times, it is illogical to prefer to choose tails on the next toss; however, IMHO it is eminently sensible to do so. It all depends on HOW we frame the maths. Could we do some "disambiguation" here into "true" cognitive biases and examples of simple "bad"/illogical thinking? Also, I have been unable to find (quickly) any references to "lists" of cognitive biases, but only to particular studies showing an example of a cognitive bias, such as that described by Prospect Theory, Priming, Framing etc.
LookingGlass (talk) 16:55, 18 February 2008 (UTC)
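As an illustrative aside, a minimal simulation of the fair-coin case mentioned above (a sketch, assuming independent tosses and an arbitrary trial count) confirms the formal point the comment itself grants, namely that the fifth toss is still 50/50:
<syntaxhighlight lang="python">
# Sketch: for a fair coin with independent tosses, the chance of heads on the
# fifth toss is still about 0.5 even after four heads in a row.
import random

random.seed(0)
fifth_toss_results = []
for _ in range(200_000):
    tosses = [random.random() < 0.5 for _ in range(5)]  # True = heads
    if all(tosses[:4]):                                  # keep only runs that start with four heads
        fifth_toss_results.append(tosses[4])

estimate = sum(fifth_toss_results) / len(fifth_toss_results)
print(f"P(heads on toss 5 | four heads so far) ~ {estimate:.3f}")  # prints a value near 0.500
</syntaxhighlight>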
- I agree that this list is leaking beyond cognitive biases into other kinds of bias. I think the way around this is to focus on books which are unambiguously about cognitive bias (there are examples in the article), and count something as a cognitive bias if it's described in those. MartinPoulter (talk) 18:30, 4 August 2009 (UTC)
Tags
Did somebody discover the secret of time travel? Whoever marked "December 2008", please correct this. Otherwise, it will have to be removed. Montblanc2000 (talk) 18:02, 29 August 2008 (UTC)
- Removed tag: This article or section has multiple issues. The page is, while perhaps not perfect, fine. Tagging it is unreasonable. Power.corrupts (talk) 07:31, 14 September 2008 (UTC)
Precision bias
I have just discovered Precision bias. Where in this list should it be? -- Wavelength (talk) 17:30, 22 July 2009 (UTC)
- Good catch. I'd put it in List_of_cognitive_biases#Biases_in_probability_and_belief. CRETOG8(t/c) 17:36, 22 July 2009 (UTC)
- I don't think this is a cognitive bias. Unless we can find cognitive psychology literature describing this bias, it shouldn't be in this article. MartinPoulter (talk) 18:32, 4 August 2009 (UTC)
Pro-innovation bias
I have just discovered Pro-innovation bias. Where should it be listed? -- Wavelength (talk) 20:53, 22 July 2009 (UTC)
- Same objection as for "Precision bias" above. It is mentioned in academic literature, but I don't see an argument that it's a cognitive bias rather than just a fallacy. MartinPoulter (talk) 18:33, 4 August 2009 (UTC)
inequality aversion
It's difficult to make a clear line between "biases" and "preferences", but I'd say inequality aversion falls pretty clearly into the preference category. Some people don't like inequality, that isn't a bias in any standard sense--it wouldn't be classified as "irrational" by an economist or decision theorist. CRETOG8(t/c) 19:36, 23 July 2009 (UTC)
- Actually, sacrificing personal gain to prevent an "unfair" advantage by others is a classic example of irrational behavior in economic study. For example, suppose you and a coworker both earn $500 a day, and you have the option to accept a $100 raise on the condition that your coworker also receives a $300 raise. According to Inequality Aversion, many people will refuse, even though they would themselves earn more money than before. Rational economic actors disregard economic inequality and only seek to improve their own individual condition. 74.103.137.193 (talk) 13:37, 20 October 2009 (UTC)
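A minimal sketch of that raise scenario using the Fehr-Schmidt (1999) inequity-aversion utility; the alpha and beta values below are assumed purely for illustration, not taken from the comment or from any particular study:
<syntaxhighlight lang="python">
# Fehr-Schmidt inequity-aversion utility for a two-person case:
#   U_i = x_i - alpha * max(x_j - x_i, 0) - beta * max(x_i - x_j, 0)
# alpha = aversion to earning less than the other person, beta = aversion to earning more.

def utility(own, other, alpha, beta):
    return own - alpha * max(other - own, 0) - beta * max(own - other, 0)

alpha, beta = 0.8, 0.2  # assumed values; any alpha > 0.5 produces the refusal here

refuse = utility(500, 500, alpha, beta)  # status quo: both keep earning $500, no inequality
accept = utility(600, 800, alpha, beta)  # raise: you get $600, the coworker gets $800

print(f"utility of refusing the raise: {refuse}")   # 500
print(f"utility of accepting the raise: {accept}")  # 600 - 0.8 * 200 = 440.0, so the model predicts refusal
</syntaxhighlight>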
Cleanup for unsourced/redlinks and article organization
The page needs some cleanup with regard to redlinked and related terms. Redlinks need either a source showing the acceptance of the term within the psychology community or an article that is similarly sourced. If there is no reason to believe that a particular bias is anything more than a pet project, it should be removed. Additionally, it would behoove us to group related biases (such as confirmation bias, experimenter's bias, hindsight bias, et cetera) rather than use the broad categories we currently employ. 128.61.23.121 (talk) 19:03, 25 August 2009 (UTC)
Narrative
This page links "black swan", but why does it not list the narrative fallacy? Cesiumfrog (talk) 04:15, 8 October 2009 (UTC)
Esperanto Version
How long should I wait for someone else to publish the Esperanto version before I try to do it myself? Please advise. -Joshua Clement Broyles —Preceding unsigned comment added by 190.24.75.47 (talk) 02:21, 6 November 2009 (UTC)
What about the effect of power on moral hypocrisy?
This recent research Why Powerful People -- Many of Whom Take a Moral High Ground -- Don't Practice What They Preach (also discussed in this Economist article Absolutely: Power corrupts, but it corrupts only those who think they deserve it) demonstrates that "power and influence can cause a severe disconnect between public judgment and private behavior, and as a result, the powerful are stricter in their judgment of others while being more lenient toward their own actions."
None of the biases in the list seems to cover this one. The ones that came closest were some of the social biases, such as illusory bias. Before I added it to the list, I wanted to check to see if it was already covered by one of the listed biases. --Nick (talk) 09:25, 11 February 2010 (UTC)
Post-decision bias after irreversible elective surgery?
Hi, looking for literature regarding this effect and especially the likely bias it introduces into patient satisfaction surveys and such. I found quite a few somewhat related topics but nothing that would match the situation very well - is there anything? Richiez (talk) 23:52, 29 April 2010 (UTC)
"Early information is better" bias?
Hi,
Can't find it in the list and did not do the research myself, but I'm convinced (by introspection: why do I value this political argument more than the other?) there is another bias: whatever you hear first is "more true" than what you hear later, even when the sources of information have the same credibility. The experiment would be something like: mum tells a kid that people like blue more than any other color. A month later dad says red is the most favorite color. My guess is that the kid will insist that it's blue, even when the two sources of information are just as credible and there is no obvious reason why either color should be "better". (Of course, swapping colors and parents should be part of the experiment.) The reason for this phenomenon could be that people create a point of view based on facts they heard earlier, and it's just more work to change that view than to deny new facts that don't fit into it.
Joepnl (talk) 17:29, 1 August 2009 (UTC)
- This would come under Confirmation bias. MartinPoulter (talk) 18:42, 1 August 2009 (UTC)
- Thanks! Joepnl (talk) 00:03, 2 August 2009 (UTC)
How about this also (from the article):
- Primacy effect — the tendency to weigh initial events more than subsequent events.
- In my learning of psychology (I did half a degree in it), primacy effect refers to a stronger memory for early events than subsequent events, and the linked article on serial position effects seems to be about this. Joepnl's question was about opinions based on early evidence being defended against subsequent opposing evidence. That's why I recommended confirmation bias rather than primacy effect. MartinPoulter (talk) 18:25, 4 August 2009 (UTC)
- Thank you both, psychology really is interesting Joepnl (talk) 01:22, 2 May 2010 (UTC)
Christian McClellan spam
This guy "Christian McClellan" is a real villain. He has spammed the entry for "Bias blind spot" several times. I suggest a permaban. 70.88.114.161 (talk) 21:12, 16 June 2010 (UTC)
Realism Theory?
I believe that "realism theory" has another name. Does anyone know what it is? 9:08, 21 Nov 2005
Don't know if helpful, but I am looking strongly into developing the subject of "bias". 10-25-10 —Preceding unsigned comment added by 68.184.170.80 (talk) 21:22, 25 October 2010 (UTC)
Bystander effect
This doesn't seem to me to be a cognitive bias. It is discussed in a lot of the same sources that discuss cognitive biases, but that doesn't make it one. An effect is not the same as a bias that causes an effect. MartinPoulter (talk) 14:02, 29 June 2010 (UTC)
- You are quite right! (I did not think of it when I corrected the edit.) Lova Falk talk 17:41, 29 June 2010 (UTC)
- A cognitive bias that can lead to the bystander effect is called pluralistic ignorance. In a helping situation, it is the bias to assume that no one thinks an event is an emergency because no one is acting like it is. The trouble is that people tend to freeze when something unexpected happens. This can cause everyone to look around, see that others aren't doing anything, and therefore choose not to act themselves. It would be a type of attribution bias because people are assuming that others' behavior can be explained by their internal states (their beliefs) when in reality it is aspects of the situation that are impacting behavior. Osubuckeyeguy (talk) 18:09, 29 June 2010 (UTC)
Social Biases are not a type of Cognitive Bias
Cognitive Biases are one thing. Social Biases are another. Social Biases should be split to make a new article, "List of Social Biases."
- I disagree. Social biases stem from the way that people think (cogitate) about social interactions. At the very least, make some sort of argument for your assertion.
--NcLean 8th of October 2006
Valence effects
Optimism bias and Valence effects have separate articles, but what's the difference? Also, are there known systematic links between optimism effects (rosy retrospection, valence effect, planning fallacy, overconfidence effect, false-consensus...)? --NcLean 8th of October 2006
Comfort and Implications effects
Where does one's desire for comfort at the expense of something he knows to be more beneficial fit into this list of biases? In other words, a major obstacle to clarity is the human predisposition to adopt and maintain beliefs which are comfortable for us as opposed to true. Perhaps related, or not, is the skewing of one's perception due to the implications of his/her choices. For instance, if I decide that the right thing to do is to help my wife deal with a sick child, then I will have to stop surfing the web, which gives me more immediate pleasure. [Helping my wife and child is a deeper pleasure, but not immediately gratifying]
- Directly related to 'comfort bias' is a bias which isn't listed yet strikes me as one of the most significant of all: believing what you want to believe. Atheists argue that people believe in God and heaven because people very much want to believe in the afterlife. This bias is tremendous in the effect it has had on mankind. Where is it listed? Simon and Garfunkel wrote a song that had a line about this bias.
Bandwagon effect
I don't see a good psychology citation for this one either. Or lots of the other ones. That's why it's B-grade in my view. There are other pages with better citations - and good books (cited in the article). But this article is not a bad starting point. —The preceding unsigned comment was added by 74.95.10.169 (talk) July 16, 2007
New topic
There is a recent (or recently recreated) article over at Semmelweis reflex. Do y'all think that is a neologism for one of the biases already on this list (perhaps a version of confirmation/disconfirmation bias), or does it deserve a place of its own? - KSchutte (talk)
Does this article need renaming?
Improvement of this article is hampered by the fact that there doesn't seem to be any scholarly source which gives a definitive list of cognitive biases: a situation very different from, say, logical fallacies. If anyone knows of such a source, I'd love to know about it. What we do have is a set of books and papers that can be identified as cognitive psychology or social psychology, and which between them describe biases in judgment and decision making.
Not only is there not a canonical list, the term "cognitive bias" seems misleading. It implies a bias in judgement or decision-making that is due to heuristics or passive reasoning errors, as opposed to motivational effects such as cognitive dissonance. A few decades ago when the "heuristics and biases" and "hot cognition" areas of psychology were more separate, there was a clearer distinction between cognitive biases and some of the other biases on this list. Since the early 1990s, it has been recognised that a lot of biases have both motivational and cognitive components. Explanations of confirmation bias or hindsight bias, for example, include both cognitive and motivational effects. Hence I question whether it's useful, or fair to the sources, to attempt to demarcate the cognitive biases. This is an issue which has come up before: [1]
Something like this list needs to exist, and in fact should be a high priority given the huge amount of research on biases and the great public interest in the topic. It would be destructive to pare it down to biases that are uncontroversially "cognitive". It seems like List of biases in judgment and decision making would be a less presumptive/ less technical alternative. Look at the titles of two of the key textbooks: The Psychology of Judgment and Decision Making; Thinking and Deciding.
We still need a way to hold back the tide of unwanted additions (effects related to psychopathology or psychotherapy; logical fallacies; behavioural effects such as Bystander effect) but a well-worded lede and some in-article comments could achieve that. MartinPoulter (talk) 16:53, 30 July 2010 (UTC)
- Yes. Not only are there plenty of books on the subject which use the term "judgment and decision making", there is even a Society of that name! Fainites barleyscribs 10:47, 31 July 2010 (UTC)
- Just backing up Fainites' point, here are more textbooks that should be crucial sources for this list: Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making, Judgment and decision making: psychological perspectives, Blackwell handbook of judgment and decision making. On the other side (trying to avoid confirmation bias), one of the key textbooks is Rudiger Pohl's "Cognitive Illusions". However, even that book has a subtitle of "A handbook on fallacies and biases in thinking, judgement and memory." MartinPoulter (talk) 12:24, 31 July 2010 (UTC)
- I agree on the renaming and I think that unwanted additions might be taken care of by explicit categorizing and introductory remarks to the list. As to the literature, these (unmentioned in the articles) might be of central interest: [http://www.amazon.com/Hypothetical-Thinking-Processes-Reasoning-Psychology/dp/1841696609/ref=sr_1_7?s=books&ie=UTF8&qid=1280833569&sr=1-7 Hypothetical Thinking: Dual Processes in Reasoning and Judgement (Essays in Cognitive Psychology) by Jonathan Evans] and an earlier book by the same author [http://www.amazon.com/Bias-Human-Reasoning-Consequences-Psychology/dp/0863771564/ Bias in Human Reasoning: Causes and Consequences (Essays in Cognitive Psychology)]. Best, --Morton Shumway (talk) 11:20, 3 August 2010 (UTC).
- It does seem that the topics of logical fallacy and cognitive bias have been mashed together with neurological biases. I'm leaning toward renaming and/or splitting. Comment: WikiPedia says "Pareidolia – a vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing hidden messages on records played in reverse." However, Apophenia, the ability to pull meaningful data out of what is essentially noise, is likely the basis of Pareidolia.[[2]] To quote: "Apophenia is the tendency to find meaning in noise. This is how we see constellations in the stars, faces on Mars, and Kaziklu Bey’s visage grinning evilly from a slice of cinnamon toast." - Dreadfullyboring (talk) 14:53, 19 December 2010 (UTC)
Hostile media effect
Although an anonymous user deleted it, this does seem to be a bias of the same kind as the rest in the list. It's a bias in human information processing, discovered by researchers who investigated other biases in the list, and published about in the same journals and books. I'm going to restore it to the article. MartinPoulter (talk) 10:24, 9 January 2011 (UTC)
Good decisions - following decision theory
I'm not sure where it belongs in the cognitive biases contents; however, I would like to see a definition, in the Tversky/Kahneman sense and following decision theory, of what a good decision is - or, since the study focuses on the opposites or fallacies, of what a 'poor decision' is. 95.34.138.148 (talk) 17:20, 1 February 2011 (UTC) Fjeld
Forer Effect
Is the example given for Forer Effect, "horoscopes", possibly a little insensitive to the believers in astrology and numerology? It seems a little dismissive while there are more innocuous ways to frame the concept and this just seems to pick on a less prominent belief system. To illustrate: I wouldn't want to conclude with "...a proselytizer's sermon," which could be said to have this very effect on the searching listener, either. Similarly, consumers are moved by targeted marketing recruitment commercials for entry-level positions as though their strengths are being called upon specifically.
I was thinking perhaps "For example, a fortune cookie" would be less offensive while illustrating the concept fully. Stopblaming (talk) 21:25, 18 February 2011 (UTC)
- The experiment which demonstrated the Forer effect specifically used horoscope material as its source, and the reliable sources which discuss the effect mention it in conjunction with horoscopes. Wikipedia is not censored, and we should not be removing or replacing factual, verifiable information in an article for fear that readers might be "offended" by it. A fortune cookie is not an appropriate example because it is not presented as an individual description of the reader's personality. It would be misleading to readers about what the Forer effect actually is. MartinPoulter (talk) 13:04, 19 February 2011 (UTC)
- Just to emphasise, "people might be offended" is never, of itself, a reason to remove or distort information on Wikipedia. MartinPoulter (talk) 13:17, 19 February 2011 (UTC)
"More footnotes" tag - should this be removed?
There's a "more footnotes" tag at the head of the article, and some "citation needed" tags in the text. However, as this article is essentially a list and the large majority of entries are linked to their own, usually well-referenced Wikipedia articles, is it really necessary for them to have additional citations here? It seems to me these tags can be removed.113.73.77.60 (talk) 01:42, 13 March 2011 (UTC)
Capability Bias
Every single description that I've been able to locate on the web defines the capability bias as:
"the tendency to believe that the closer average performance is to a target, the tighter the distribution of the data set"
And by "every", I truly do mean every single one of them. I can find no alternate definitions or clarifications, and every instance of the capability bias listed on the web (that I could find) has been copypasta'd in the same large list of cognitive biases.
I started looking because the above definition doesn't make much sense to me - the grammar and punctuation leave it ambiguous to me. I cannot think of a reasonable instance where I would use the term "capability bias" as a label for something which has happened.
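As a minimal sketch (made-up numbers, purely to show why that wording reads as a bias rather than a valid inference), two data sets can have averages equally close to a target while their spreads differ enormously:
<syntaxhighlight lang="python">
# Two data sets whose means both sit exactly on the target, but whose spreads differ greatly,
# illustrating that closeness of the average to a target says nothing by itself about the distribution.
from statistics import mean, stdev

target = 10.0
tight = [9.9, 10.0, 10.1, 10.0, 9.9, 10.1]
loose = [4.0, 16.0, 10.0, 2.0, 18.0, 10.0]

for name, data in (("tight", tight), ("loose", loose)):
    print(f"{name}: |mean - target| = {abs(mean(data) - target):.2f}, stdev = {stdev(data):.2f}")
# tight: |mean - target| = 0.00, stdev = 0.09
# loose: |mean - target| = 0.00, stdev = 6.32
</syntaxhighlight>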
The only further information I could find was in the wikibin - this article once did exist but has since been omitted. The only additional information listed there, not found anywhere else, is:
"Long known, but recently codified bias. Based on observations by Daryl Clements, and Steve Hajec in their many years working with executives in the area of Business Excelence."
However, a search on Clements and Hajec (including a search of scholarly articles) came up empty. I can't even find any evidence of a Daryl Clements or a Steven Hajec ever having published anything scholarly - not together, and not even on their own. I'm not convinced this bias is recognized by scholars in the field, and it may even be something fake that suffered from truth-by-default due to copypasta. I don't think that "citation needed" suffices in this case. —Preceding unsigned comment added by 74.198.165.50 (talk) 21:19, 8 April 2011 (UTC)
- I agree. 1,040,000 results for the copypasta. 4,000 results for "capability bias" without the copypasta definition, and those are largely coincidental occurrences of the two words (and a couple mirrors of the deleted Capability Bias article). If it can't be sourced, it should be deleted. 70.36.142.92 (talk) 21:25, 27 April 2011 (UTC)
- This list is long enough without questionable entries, so I have simply removed it. Hans Adler 22:10, 27 April 2011 (UTC)
- Thanks all for doing this. Nice detective work by the anonymous author. MartinPoulter (talk) 15:11, 28 April 2011 (UTC)
Possible clarifications/ addtions
I'm a relative newbie to the formal study of psychological heuristics and cognitive biases. In perusing this list, I wondered some things:
a. Would "Stereotyping" be enhanced or clarified by the addition of, "(regardless of whether the stereotype may be probabilistically "true.") Or, further and in general, attempting to apply general probabilities that are relevant in a situation to a particular instance." Or is the latter a different bias altogether? (Granted, it is the misapplication of probability theory ("Neglect of probability" bias?), or simply ill-education. cf. "Misuse of Statistics.")
b. Is the "Just-world phenomenon" (Social Biases) the same as the "Just-world hypothesis" (Biases in Probability and Belief)? If so, should they be combined? If not, why are they different? Should they reference each other?
c. Would "System justification" benefit from the addition of, "a.k.a. that's-the-way-it's-always-been or we-don't-do-it-that-way-here" for a wider audience?
d. Where does the NIMBY syndrome fit? (NIMBY: "Not In My Back Yard.") Is it a cognitive bias?
Thanks. Spartan26 (talk) 02:30, 14 October 2011 (UTC)
It has been suggested that List of memory biases be merged into this article or section.
- Support - makes perfect sense, can't think why this has been outstanding since August! Pesky (talk …stalk!) 13:41, 25 November 2011 (UTC)
Yes, makes sense to merge different groups, and I will start with some of this work... first I think it is useful to merge
"Decision-making and behavioral biases: Many of these biases are studied for how they affect belief formation, business decisions, and scientific research"
and
"Biases in probability and belief: Many of these biases are often studied for how they affect business and economic decisions and how they affect experimental research"
which is really arbitrary... --InfoCmplx (talk) 20:04, 14 January 2012 (UTC)
Where does "repetition bias" fall?
This is a common mistake where we select for the answer that we've heard more often, regardless of how much or little sense it makes. This might also be considered a "strong meme" bias, where ideas that are better at getting repeated (are more "memetically fit") are given prevalence over others on the presumption that passing through more minds has provided a stronger filter for bad ideas.
- I was taught this was a "perseverance error". I've not heard it described as a bias. Or you might be talking about the availability heuristic. I read a lot of bias literature and I've not encountered "repetition bias" as a term in itself. Willing to be pointed to new sources, though. MartinPoulter (talk) 18:21, 2 March 2012 (UTC)
Doubt avoidance?
Should "doubt avoidance" perhaps be on this list? Or is it already covered under another category? E.g. the tendency of humans to want to 'fill in' information that isn't available. Cf. http://www.thinkmentalmodels.com/page66/page90/page90.html "This model is taken from 'Poor Charlie's Almanack', by Charles Munger: The brain of man is programmed with a tendency to quickly remove doubt by reaching some decision. It is easy to see how evolution would make animals, over the eons, drift toward such quick elimination of doubt. After all, the one thing that is surely counterproductive for a prey animal that is threatened by a predator is to take a long time in deciding what to do. And so man's Doubt-Avoidance Tendency is quite consistent with the history of his ancient, non-human ancestors. So pronounced is the tendency in man to quickly remove doubt by reaching some decision that behavior to counter the tendency is required from judges and jurors. Here, delay before decision making is forced. And one is required to so comport himself, prior to conclusion time, so that he is wearing a 'mask' of objectivity." 196.215.201.28 (talk) 17:19, 27 March 2012 (UTC)
renaming of this page
There is an existing consensus that the title of this list is inappropriate and misleading. I'm going to go ahead and rename. MartinPoulter (talk) 14:28, 13 October 2012 (UTC)
I'm concerned about the move
The new name is too long, and reflects an academic divide (and not all of it, at that) which is unimportant to the article topic. IMHO, the word cognitive as a mashup is precisely the correct, universally understood umbrella term into which the list should be poured, sieved internally. It might be important to decide which is most important for this article, and which should go in another article. We have neurologists, psychologists, and all manner of combo-disciplines employing ever finer microtomes of linguistic point-shavery just to be pointy. Should we really be playing into that game, even as some of the research and claims (as we speak) are being thrown out as fraudulent, irreproducible claptrap? Help me out here. I just don't think dragging the reader down the rathole of indistinguishable distinctions without differences is beneficial. Brevity and clarity should be more important than that. --Lexein (talk) 18:13, 13 October 2012 (UTC)
- Speaking of brevity and clarity, can you set out the content of your last few sentences in a more succinct way? And what does "an academic divide which is unimportant to the article topic" refer to? I dispute that "cognitive" is either correct (it's POV) or universally understood. Why do you think the Wikipedia article should differ from the books on which it's based. Help me understand what your objections are. MartinPoulter (talk) 21:21, 13 October 2012 (UTC)
- Wow. Well, no, I asked you first. You wrote "explanations of confirmation bias or hindsight bias, for example, include both cognitive and motivational effects." So, why not simply add the word "motivational" (or some other) in the title, rather than remove "cognitive"? Is removing one word, as opposed to adding another, the only solution? The so-titled journal Cognitive Psychology doesn't seem to ban the word, and hasn't renamed itself Judgment and Decision-making Psychology or even Cognitive and Motivational Psychology, so why, precisely, is "cognitive" too POV to retain as part of the title? The name of the other Wikipedia article, upon which this list is based, is Cognitive bias - is that to be renamed? To what?
- This article is narrowly about observed biases, not the entirety of the containing field. Of all the book titles offered as rationale, all describe a field: only one of those titles includes the word "bias", and only along with "reasoning" and "cognitive". So, I assert that it is just as incorrect to entitle this list "judgment and decision-making biases" (if read too narrowly, an OR term) as it is to narrowly focus on only "cognitive biases". Some better naming solution exists, as yet unknown. --Lexein (talk) 23:11, 13 October 2012 (UTC)
- The sources don't use the phrase "cognitive and motivational bias".
- "This article is narrowly about observed biases" yes, but of what? Biases of electrical measurement by magnetic fields? Biases of statistical methods? What is being biased? In all these cases, the answer is either a judgment or a decision, so the present title of the article is appropriate. The journal Cognitive Psychology is about the discipline of cognitive psychology, so it's fairly named. It's fair for Wikipedia to have an article about cognitive bias, it's just that almost nothing on this list is uncontroversially a cognitive bias (and to imply they all are is unfortunately blatant POV), although it's not controversial to say they are biases in judgment and/or decision-making. If you're concerned about accessibility, I should point out that "cognitive" is not a common word that most English speakers would be expected to understand, but "judgment", "decision" and "making" should be relatively widely understood.
- What does "(if read too narrowly, an OR term)" mean? Thanks for writing in a more clear and understandable way. MartinPoulter (talk) 15:04, 19 October 2012 (UTC)
question about inline citations
Since this is a list type article, does every entry really need inline citations? I was under the assumption that lists were present for the purpose of aggregating information and giving a cursory overview. If the respective linked article is properly cited, do all the entries here need to be cited as well? It seems non-productive and only really adds clutter. Darqcyde (talk) 01:38, 19 October 2012 (UTC)
- I get your point, but the article has become something of a magnet for original research. Though in an ideal world I'd agree with you and we wouldn't require refs for each item, I think in the present case we need to ask for a citation that shows that each term is a bias within the scope of the article. I would hope that when fully developed, this list article will do what the best Wikipedia lists normally do which is give information about each item (e.g. when a term was first introduced): it's not just a navigational tool. If so, then there will have to be references for each fact. MartinPoulter (talk) 14:53, 19 October 2012 (UTC)
Baader-Meinhof phenomenon
Relevant discussion at Wikipedia:Articles for deletion/Baader-Meinhof phenomenon (2nd nomination)
For Frequency illusion, the phrase Sometimes called "The Baader-Meinhof phenomenon" is currently {{fact}}-tagged[3]. This is a blog-propagated term unlikely to be found in any actual study, but as a popular online factoid it gets posted to Reddit about once a month[4].
This language could be changed to something like "popularly called Baader-Meinhof phenomenon", but there is some debate on what that term really describes from a scientific perspective (examples), so linking it to a scientific term might be WP:OR. Also, the term is unlikely to be used among people who know what the actual Baader-Meinhof is (such as Germans, or people educated in recent history).
So I figure List of cognitive biases should probably stick with terms used in relevant study, and popular neologisms can be omitted entirely. Does anyone disagree here? / edg ☺ ☭ 16:18, 9 October 2012 (UTC)
- I agree. We should use the scholarly terms primarily, mention popular synonyms where sources allow, and avoid neologisms. MartinPoulter (talk) 18:21, 9 October 2012 (UTC)
- Well, where does the term come from? Why is it popularly referred to as the Baader-Meinhof phenomenon? There is an article about the Baader-Meinhof group; what is its relationship with that? 173.206.246.219 (talk) 03:08, 25 October 2012 (UTC)
Forer vs. Barnum Effect
Barnum effect redirects to Forer effect. The first sentence of the two descriptions is identical, while the second sentence appears only in the description of the Forer effect.
Shouldn't one of the following actions be taken?
- Remove Barnum Effect from the list
- Use the whole description of Forer effect on Barnum effect
77.183.248.248 (talk) 15:18, 24 October 2012 (UTC)
- I've listed Barnum effect as an alternative name for Forer effect.
- There's an open question about Forer effect/subjective validation, too. See Talk:Forer effect#Something.27s_wrong.... —Mrwojo (talk) 20:41, 24 October 2012 (UTC)
Requested move
- The following discussion is an archived discussion of a requested move. Please do not modify it. Subsequent comments should be made in a new section on the talk page. Editors desiring to contest the closing decision should consider a move review. No further edits should be made to this section.
The result of the move request was: Move. Cúchullain t/c 15:30, 14 February 2013 (UTC)
List of biases in judgment and decision making → List of cognitive biases – This is a shorter and much more popular name for this list. --Relisted Tyrol5 [Talk] 02:21, 18 January 2013 (UTC) Greg Bard (talk) 08:38, 10 January 2013 (UTC)
- If you look further up this Talk page, you'll see why it was recently moved from "List of cognitive biases". You're right that "Cognitive biases" is more popular on the web as a whole, but it's not correct, and academic literature more often uses "in judgement and decision making". In fact, one of the concerns about the old name is that there does not seem to be any academic literature giving a "list of cognitive biases". Also, the "cognitive biases" title fails WP:NPOV, because "cognitive" is used in a lot of the literature for biases that arise from cognitive heuristics. Very few if any of the biases on this list are purely explainable in terms of heuristics. On the other hand, they are all (or nearly all) biases in judgement and/or decision-making. Another of the arguments made above is that "cognitive" is not a common word known to the typical Wikipedia reader, but "judgement" and "decision" are more likely to be. MartinPoulter (talk) 10:16, 18 January 2013 (UTC)
- Support: Martin may be perceiving a false-consensus effect, since he seems to be the one that previously moved the page and may now be engaging in irrational escalation. There seems to be an acknowledgement here that "cognitive bias" is the WP:COMMONNAME, and the claim that it is WP:NPOV does not seem correct to me. Rather, I think the claim is that the current name is more precisely accurate in the opinion of an academic expert, rather than that "cognitive bias" expresses some opinion. I don't exactly follow the rationale for saying that "cognitive bias" is insufficiently precise, since these phenomena are biases related to cognition. However, any lack of precision in the WP:COMMONNAME can presumably be addressed by adding some clarifying remarks within the article itself. As the policy says, the article name should not be pedantic, and "the term most typically used in reliable sources is preferred to technically correct but rarer forms". The more common term "cognitive bias" seems sufficiently disambiguating for Wikipedia purposes, less artificial, and more succinct. —BarrelProof (talk) 22:01, 5 February 2013 (UTC)
- Support - I opposed the earlier rename, which was really pushed through by aggressive and vocal tag-teaming and shouting-down by two highly biased editors, against WP:COMMONNAME and against usual Wikipedia naming sensibility. COMMONNAME should obviously prevail here, with the inclusion in the lead paragraph explaining the lengthier usage in some (though assuredly not all) literature. I just let it go previously, knowing full well that it would eventually be reverted back. If there are subtopics which don't belong under the original title, they should go in a different article, rather than resorting to an unwieldy title which is, let's face it, hostile to the readership. We don't rename things because a few books or articles use a new fad name, we wait until the majority of the field settles on a new name, which, by the way, is usually similar in length, rather than clumsily longer, than the old. I'd prefer some neologism (even Greek, Latin, or German!) as a title over the current bloviation. --Lexein (talk) 13:28, 6 February 2013 (UTC)
- Query. Are all "cognitive biases" biases that involve judgment and (and/or?) decision[hyphen]making..? CsDix (talk) 18:04, 12 February 2013 (UTC)
- According to the cognitive bias article, "a cognitive bias is a pattern of deviation in judgment". If that is correct, then I suppose the answer is yes. However, I notice that the list in this article includes some biases that do not seem to necessarily be about either judgments or decision-making. The list includes "belief and behavioral biases", "social biases" and "memory errors and biases". The descriptions of many of the phenomena are not really a matter of judgments or decision-making (e.g. the curse of knowledge, bizarreness effect, and childhood amnesia) – especially the items under "memory errors and biases". —BarrelProof (talk) 20:19, 12 February 2013 (UTC)
- Lexein, could you please name the "two highly biased editors" who used "aggressive and vocal tag-teaming and shouting down"? Maybe it would help if you link to the diffs or talk page section where this happened.
- Also to Lexein: "We don't rename things because a few books or articles use a new fad name, we wait until the majority of the field settles on a new name" You are absolutely right. This is why we should not be using the term "cognitive bias" to cover all these biases when the textbooks don't. What literature are you working from?
- CsDix: the answer is yes. That means that some entries have to move out of this list, but that's pretty inevitable as the list is basically a grab-bag of different things.
- To BarrelProof's "false consensus" accusation: Before this requested move, we had me proposing, User:Fainities supporting, User:Morton Shumway supporting, User:Dreadfullyboring "leaning towards renaming or splitting", User:Lexein "concerned". Not unanimity, but only I and Morton Shumway have so far actually cited literature in support of our arguments. I could of course speculate about biases that are shaping your judgment, but it would be rude and not advance the discussion, so I won't.
- BarrelProof: "the claim that it is WP:NPOV does not seem correct to me". Interesting and I hope you're right. Can you spell out the argument?
- Problem for BarrelProof and Lexein: where in reliable academic literature is there a list of cognitive biases on which this article can be based? Conversely, is there literature on biases in judgement and decision-making? Concrete example: does Hindsight bias belong in an NPOV list of cognitive biases? Does it belong in a list of biases in judgment and decision-making? MartinPoulter (talk) 00:05, 13 February 2013 (UTC)
- Martin, I hope you understand that my references to the 'false consensus effect' and 'irrational escalation' were primarily meant to be humorous. I just thought it would be fun to use topics from the article itself in the renaming discussion. I'm not sure I understand some of your new comments. The reason that I don't see an NPOV problem in calling this a list of cognitive biases is just that I don't see any particular opinion or perspective being promoted by using that name. To me, yes, hindsight bias does seem like a cognitive bias. I'm not sure whether it really fits as a bias in judgment and decision-making or not, although I think it probably does. But I don't understand why you asked about that. To me, the term 'cognitive bias' just refers to biases related to cognition in general, which seems broader in scope than biases in judgment and decision-making. —BarrelProof (talk) 01:25, 13 February 2013 (UTC)
- Support - This is a silly terminology quibble. Roger (talk) 03:22, 13 February 2013 (UTC)
- Oppose - Hi all. I accept the common name argument, however, I also think that concerns over the use of the term “bias” in psychology are valid and not marginal.[1][2][3][4] I would therefore like to suggest a third option. This would be to merge this list with list of psychological effects (retaining that latter name). That list could then be broken down into sub-lists according to discipline (e.g. social psychology, cognitive psychology, clinical psychology). While I know that a taxonomy along those lines will be far from perfect, at this stage I think that it will be far less fraught than trying to distinguish between “biases” and “heuristics” or between “decision making” and “social biases”. I also feel like this will be a list that will adequately serve the purposes of a list of this nature. Of course, I am keen to hear what others think. Cheers Andrew (talk) 03:51, 14 February 2013 (UTC)
- Support "List of cognitive biasses" is a better name than "List of biases in judgment and decision making ", and I fail to see the controversy. --Spannerjam (talk) 05:53, 14 February 2013 (UTC)
References
- ^ McGarty, C. (1999). Categorization in Social Psychology. London, Thousand Oaks, New Delhi: Sage Publications.
- ^ Turner, J. C.; Reynolds, K. H. (2001). "The social identity perspective in intergroup relations: Theories, themes, and controversies". In Brown, R.; Gaertner, S. L. (eds.), Blackwell Handbook of Social Psychology: Intergroup Processes, pp. 133–152.
- ^ Oakes, P. (2001). "The root of all evil in intergroup relations? Unearthing the categorization process". In Blackwell Handbook of Social Psychology: Intergroup Processes, pp. 3–21.
- ^ Turner, J. C.; Oakes, P. J. (1997). "The socially structured mind". In McGarty, C.; Haslam, S. A. (eds.), The Message of Social Psychology. Cambridge, MA: Blackwell, pp. 355–373.
- The above discussion is preserved as an archive of a requested move. Please do not modify it. Subsequent comments should be made in a new section on this talk page or in a move review. No further edits should be made to this section.
Baader-Meinhof Phenomenon
There used to be a separate page for the Baader-Meinhof Phenomenon. What happened to it? --David G (talk) 16:38, 14 May 2013 (UTC)
Identified victim bias
I added this text, which was reverted by U3964057 (talk · contribs): "* Identified Victim Bias - the tendency to respond more to a single identified person at risk than to a large group of people at risk."
My citation was to http://www.law.harvard.edu/programs/petrie-flom/events/conferences/statistical/statistical.html, and the reversion noted that "A 2012 conference website is not sufficient substantiation that this is a commonly accepted 'cognitive bias'".
So my question is: What would be considered "sufficient"? This? http://pluto.mscc.huji.ac.il/~msiritov/KogutRitovIdentified.pdf Thanks, Vectro (talk) 13:23, 11 August 2013 (UTC)
- Hi Vectro. As per the Wikipedia guidelines around reliable sources there is no across-the-board principle that can be applied in terms of what is and is not a sufficient source. That is, it is "context dependent". In this case I would say that the journal article you offered will do as an appropriate source for the notability of the effect you added to the list (although a review article, or other tertiary source, would be better). I would say then by all means add in the 'identified victim effect' with the reference you suggest. Other editors are of course welcome to chime in with their thoughts. Cheers Andrew (talk) 13:31, 12 August 2013 (UTC)
- OK, since there are no (other) objections, I will re-add with that source. Vectro (talk) 03:28, 15 August 2013 (UTC)
Explaining base rate fallacy
Paging @Spannerjam: about this edit. "the tendency to base judgments on specifics, ignoring general statistical information", while not perfect, seems to require less prior knowledge than " the tendency to ignore base rate probabilities", which includes two technical terms. Someone who doesn't know what the "base rate" is won't be helped by looking at that definition. MartinPoulter (talk) 10:28, 2 September 2013 (UTC)
Misattribution as a cognitive bias?
Hi all. Spannerjam felt that my removal of “source confusion” was unjustified and reinserted the item with an improved definition. Despite the improvement (which I do like) I still don’t think source confusion belongs in the list. To elaborate upon my edit summary, I have always come across source confusion and memory misattribution as potential outcomes of some cognitive effect (or “bias”); not as cognitive effects unto themselves. For example, in the classic “who said what” paradigm a salient and inclusive social category is shown to lead to source confusion as to which category member said what phrase.
To include features of human cognition like source confusion really broadens the scope of the list beyond cognitive effects and decision making heuristics. In the long run I think we could expect a much longer list and an eventual rename to something like list of perceptual and memory phenomena. Anyway, I am keen to hear others’ thoughts. Cheers Andrew (talk) 14:15, 6 November 2013 (UTC)
Should the Abilene paradox be included?
Personally, I think it fits. Luis Dantas (talk) 18:40, 21 December 2013 (UTC)
- Hi Luis Dantas. I would suggest that the Abilene paradox does not really qualify as a cognitive bias. I would instead say that the Abilene paradox describes a particular type of outcome that may arise as conformity processes operate within a particular context. It is these conformity processes that one might think of as being subject to cognitive biases; not the Abilene paradox itself. Does that resonate with you? It might be illustrative to point out that pluralistic ignorance also does not make the list. Cheers Andrew (talk) 12:49, 24 December 2013 (UTC)
Ludic fallacy
I looked at the page for the Ludic fallacy, and I found it extremely wanting. I think that this is a term coined by Taleb and as far as I can tell it is only used in the popular press. While it is nominally about the "misuse of games", that is itself a bit of hyperbole, and the "bias" Taleb identifies is that people are using inductive reasoning instead of... something else? I'm not seeing any identifiable bias here. I think it should be removed from this list. 0x0077BE (talk) 00:06, 12 February 2014 (UTC)
- Hi 0x0077BE. I agree. It seems to have dubious notability credentials and even more dubious credentials as one of the "decision-making, belief, and behavioral biases". I am going to go ahead with the removal. Cheers Andrew (talk) 02:57, 12 February 2014 (UTC)
difference in the way Bizarreness effect is written
I see that "Bizarreness effect" is entered in a different format than the other names in the lists, without explanation. Please provide the reason; I am curious.
Examples:
'Bizarreness effect'
Choice-supportive bias
Change bias
Childhood amnesia
Conservatism or Regressive Bias
Thank you, Wordreader (talk) 15:53, 24 January 2014 (UTC)
- Looking good, Dream Eater. Thank you, Wordreader (talk) 01:40, 15 March 2014 (UTC)
Frequency illusion and the Baader-Meinhof phenomenon
The page Baader-Meinhof phenomenon has been deleted, and there is no page for Frequency Illusion. I'd like to include in this page that "Frequency Illusion" is also called the Baader-Meinhof phenomenon, and make both of those pages redirect to this article. Does that seem reasonable? Havensfire (talk) 21:24, 19 April 2014 (UTC)
Secrecy Bias
Political scientists have identified a bias towards assigning more weight to information one believes to be secret. Would this be an appropriate addition to this list?
A source:
Travers, Mark, et al. "The Secrecy Heuristic: Inferring Quality from Secrecy in Foreign Policy Contexts". Political Psychology, Volume 35, Issue 1, pages 97–111, February 2014
Scu83 (talk) 22:32, 12 June 2014 (UTC)
- Hi Scu83. I would prefer not to include it at this stage on the grounds of 'suspect notability'. In other words, I would like to wait to see if the concept is adopted by the broader scientific community. This is because getting an article through peer review doesn't equate with scientific consensus. Others may disagree, but I think this wiki-article is more useful as a list of widely accepted cognitive biases rather than as a list of all the cognitive biases that anyone has ever thought up and got published. Keen to hear what others think though. Cheers Andrew (talk) 12:09, 13 June 2014 (UTC)
Pandering Bias
There may be a better name for this, "Pandering Bias" is the best I could come up with. I haven't found a specific reference to this, and I am hoping someone reading this has seen one.
But I definitely find - both in my personal experience (anecdotal, of course), and through history in general (admittedly not scientific) - that anything complimentary to the human species will be accepted without question, while traits that are negative will be rejected.
For example, I find that rational choices are rarely made by human beings in their personal life. In fact, the Biases page itself is a huge list of examples of this. But almost everyone feels that they usually make rational choices, and only rarely make instinctive ones.
And an example of the converse - in 1973, Jane Goodall and Richard Wrangham discovered - to their dismay - strong evidence that many primate species practice warfare, thus making it a hard-wired instinct, rather than a cultural practice that might be halted. However, that viewpoint is rejected by everyone who comes to hear about it (confirmation bias?), with the possible exception of scientists who have examined the research.
Has anyone seen this overall "pandering" phenomenon mentioned elsewhere? I obviously can't put it in the list without more support and references. 76.209.223.103 (talk) 23:38, 12 August 2014 (UTC)
Just-World Hypothesis
Spartan26 asked in 2011 here the question: "Is the "Just-world phenomenon" (Social Biases) the same as the "Just-world hypothesis" (Biases in Probability and Belief)? If so, should they be combined? If not, why are they different? Should they reference each other?" In response I would point to this article on the just-world phenomenon on About.com that ends with "Also Known As: Just-world theory, just-world hypothesis, just-world fallacy or just-world effect." This suggests that these are equivalent. Kind regards, Timelezz (talk) 16:00, 16 October 2014 (UTC)
- Hi Timelezz. I agree that it is not necessary to have two entries. I have made the removal and retained the one that I think had better language. I have also opted for placement in the 'social biases' list, but I am not wedded to this decision. Cheers Andrew (talk) 22:12, 16 October 2014 (UTC)
Blaming the victim
I was surprised to find this common cognitive bias omitted, despite having its own Wikipedia article. Anniepoo (talk) 02:51, 7 April 2015 (UTC)
- Hi Anniepoo. To my knowledge victim blaming is not a cognitive bias in the sense covered by this article. For the most part it is too multiply-determined a phenomenon. Instead, more specific cognitive biases may reportedly contribute to victim blaming (e.g. ingroup favoritism, just world beliefs). Does that make sense to you? Cheers Andrew (talk) 03:25, 10 April 2015 (UTC)
- Makes sense to me. There's a link to Cognitive Bias at the end of the Victim blaming article. Is that inconsistent? Anniepoo (talk) 05:22, 10 April 2015 (UTC)
- I think that should be fine. The "See also" links are usually intended along the lines of "if you're interested in this article, you may be interested in these other related articles as well," focusing on topics that were not already mentioned in the main text of the article. The list at Victim blaming is longer than usual, so it might be useful to find ways to move some of them to the text. Sunrise (talk) 06:55, 10 April 2015 (UTC)
- Hi all. I actually think the 'see also' mention over there could be misleading. I am going to be bold and remove for the meantime. After all, we already have the link in there to just world theory. Cheers Andrew (talk) 11:13, 16 April 2015 (UTC)
Duplicate {{vanchor}}s
There are a number of duplicate {{vanchor}}s on this page (e.g. Regressive bias). They each need to be unique, if links to any but the first are to work. One fix would be to add some or all of the section header in parentheses (e.g. Regressive bias (memory) and Regressive bias (belief) ).
Comments? Other possible fixes? — Lentower (talk) 03:18, 8 June 2015 (UTC)
Another List bias
I heard this on NPR this morning: items at the top of a list are cited more often in research than items lower in the same list. It may require an article, but I thought I'd put the link here, if anyone wishes to tackle it: [1] Hires an editor (talk) 00:23, 16 July 2015 (UTC)
Assessment comment
The comment(s) below were originally left at Talk:List of cognitive biases/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Ludic fallacy probably does not belong to this list, as it is not commonly accepted as a cognitive bias. It is used almost exclusively by its inventor Nassim Taleb. I do not believe there has been any published psychology work on "Ludic fallacy." Drhoenikker 16:30, 27 April 2007 (UTC) |
Last edited at 16:30, 27 April 2007 (UTC). Substituted at 15:16, 1 May 2016 (UTC)
Notable omission
There is a bias where a thief will see everyone else as thieves, or where someone who cheats will accuse everyone else of cheating, without realizing it. What is that bias called? — Preceding unsigned comment added by 41.164.178.68 (talk) 07:23, 9 October 2015 (UTC)
I think the "thief considers everyone else a thief" phenomenon may fall under the heading of Psychological projectionPsychological Projection. I personally would classify that as a cognitive bias, but experts may say otherwise. — Preceding unsigned comment added by 75.143.172.192 (talk) 01:16, 22 October 2015 (UTC)
We are more likely to believe a statement if it is easy to understand.
This two-part article gives a pretty good overview of the bias (if you can overlook its political leaning) and summarises it like this:
"The less effort it takes to process a factual claim, the more accurate it seems."
Is this bias listed on the page? If not, could we add it? JoeyTwiddle (talk) 11:17, 19 January 2016 (UTC)
- This bias seems especially interesting at present, with the rise of meme (image) culture. It might be related to System 1, but I find the concise summary above quite valuable, both for people hoping to influence others, and for those seeking to avoid being unduly influenced! — Preceding unsigned comment added by 210.195.120.5 (talk) 07:15, 25 January 2016 (UTC)
- What is needed is a citation to a name for such a bias. A name is not mentioned in the article, nor the study mentioned. To make one up would be original research. Ultimately, acting on a cognitive bias could be considered lazy thinking to start with. If they put some effort into thinking about it, they would see the bias. That said, there is processing fluency, attribute substitution and law of triviality. The book Consumer Behavior mentions "low-effort decision-making processes". Richard-of-Earth (talk) 22:09, 25 January 2016 (UTC)
Declinism
I saw declinism added to the list by User:Kvng and I don't think it should be there, but I thought it best to discuss here before removal. The article on Declinism describes it as a belief. Something can't be both a belief and a cognitive bias. Looking through a Google Scholar search, there are lots of mentions of declinism, but as a belief or a feature of schools of thought: there doesn't seem to be anything about it in the extensive literature on cognitive biases. The reference that is seemingly used to justify calling declinism a cognitive bias is a newspaper opinion column by psychologist Pete Etchell. It's written by a mainstream scientist, but isn't part of peer-reviewed scholarly literature, is openly speculative rather than sharing definitive results, and it attempts to explain declinism as a result of cognitive biases, which is arguably an important difference from saying declinism is itself a cognitive bias. All sorts of beliefs or preferences might be explained in terms of cognitive biases: that's partly why the topic is so interesting. It doesn't mean, though, that those beliefs or preferences all belong in this list. MartinPoulter (talk) 20:32, 16 February 2016 (UTC)
- Declinism suggests that cognitive bias may fuel declinist sentiment. Declinism is definitely a belief but there is a close relationship between cognitive bias and beliefs. I don't agree that something can't be both a belief and a bias. I have added another reference to Declinism but it is along the same lines as the first: reliable source and scientist but not scholarly. I don't outright reject removal of Declinism from this list but I would like to see a third opinion before we make any changes. ~Kvng (talk) 16:11, 17 February 2016 (UTC)
Antiquity illusion
Opposite of recency illusion. The pre-dating of one's memories about when a cultural practice, word etc. was first used. Languagelog: The antiquity illusion --92.214.158.2 (talk) 15:42, 18 April 2016 (UTC)
projection bias
Should be moved from the "social" category into one more appropriate for intertemporal choice stuff. Its closest relative is empathy gap. --Ihaveacatonmydesk (talk) 19:54, 22 May 2016 (UTC)
- Done
Illusion of validity
This description is wrong; illusion of validity is akin to overconfidence (it is basically the same; IMO the two should be merged). See the page for sources. The description used is the one for Information bias (psychology). Cheers --Ihaveacatonmydesk (talk) 19:59, 22 May 2016 (UTC)
- @Ihaveacatonmydesk:, per WP:BOLD, go ahead and fix description(s)! Baking Soda (talk) 10:40, 25 May 2016 (UTC)
How is this a memory bias? Ihaveacatonmydesk (talk) 23:45, 29 May 2016 (UTC)
on "frequency illusion" being called the Baader-Meinhof Phenomenon
I'm unsure if this is an appropriate reference, but it does discuss the origin of the phrase. - Paul2520 (talk) 22:35, 25 August 2016 (UTC)
"Cheat sheet" proposes big deduplication and regrouping
This blog post claims to be the result of an effort to deduplicate and regroup this article. 84.214.220.135 (talk) 12:03, 3 September 2016 (UTC)
+1 to that, I just came here and saw it was already mentioned. Samois98 16:32, 5 September 2016 (UTC) — Preceding unsigned comment added by Samois98 (talk • contribs)
- I've contacted the illustrator mentioned at the bottom of the blog and he's willing to upload a CC-licensed jpeg. I think it would make a good lead/hero image for this article. ~Kvng (talk) 22:05, 12 September 2016 (UTC)
- If I start seeing hero images on Wikipedia articles, I'll start shooting puppies. The illustration is great, though. Paradoctor (talk) 03:04, 4 November 2016 (UTC)
- For the sake of the puppies, please don't download the Wikipedia mobile app. ~Kvng (talk) 14:13, 6 November 2016 (UTC)
- I agree with Kvng. (But then, I agree with everybody about everything! You're all right!) Herostratus (talk) 17:43, 6 November 2016 (UTC)
Generating flashcards from article
I just thought I'd mention that I've written a short Python script that dumps the list into a .csv file ready to import into various spaced-repetition software (such as Anki and Mnemosyne). As I've started using this data I've found it to work pretty well but some descriptions could use some improvement. If you want to learn these in a reasonable manner and help out with writing better descriptions for some of these, check out the script! I expect to make such improvements myself over time but I currently almost exclusively do the cards when not at a computer, which prevents me from improving descriptions when I find ones in need of improvement. Hopefully this'll be of some use to someone! I can really recommend using flashcards to memorize these if someone is on the fence about this. You can find the script here: https://github.com/ErikBjare/biases2csv – Erik.Bjareholt (talk) 18:10, 17 March 2017 (UTC)
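For anyone who wants the general idea without reading the repository, a minimal sketch of this kind of list-to-flashcards script might look like the following. This is an illustration only, not the linked biases2csv code: the page URL, the reliance on "wikitable" markup, and the assumption that the first column holds the bias name and the second its description are all guesses that may need adjusting.

import csv
import requests
from bs4 import BeautifulSoup

# Assumed article location; adjust if the list is ever split or renamed.
URL = "https://en.wikipedia.org/wiki/List_of_cognitive_biases"

def scrape_bias_cards(url=URL):
    """Collect (name, description) pairs from the article's wikitables."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    cards = []
    for table in soup.find_all("table", class_="wikitable"):
        for row in table.find_all("tr"):
            cells = row.find_all("td")
            if len(cells) >= 2:  # assumed layout: name cell, then description cell
                name = cells[0].get_text(" ", strip=True)
                desc = cells[1].get_text(" ", strip=True)
                if name and desc:
                    cards.append((name, desc))
    return cards

def write_cards(cards, path="biases.csv"):
    """Write front/back pairs as plain CSV, which Anki and Mnemosyne can import."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(cards)

if __name__ == "__main__":
    write_cards(scrape_bias_cards())

One nice side effect of regenerating the deck from the live article is that any description improvements made here flow straight into the flashcards.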
- May I suggest Wikiversity? If you put this up as a resource or course there, including an external link per WP:ELMAYBE would certainly be justifiable. Paradoctor (talk) 20:07, 17 March 2017 (UTC)
Great work, editors
This is so useful. Well done, whoever has contributed. :-). Tony (talk) 05:38, 23 June 2017 (UTC)
Learned Helplessness
I feel like Learned Helplessness should be somewhere on this list, but I don't know enough about the subject to be comfortable adding it. Tristyn⌘ 18:17, 26 August 2017 (UTC)
- A cognitive bias may be involved in the phenomenon, but learned helplessness itself is not a cognitive bias. Actually, it is rational not to expend effort that can reasonably be expected to be wasted. Paradoctor (talk) 22:53, 26 August 2017 (UTC)
Anthropocentrism
@Gaia Abundance Life: This is a list class article. Notice how each point only includes a very short summary with a link to the relevant article. I suggest adding your material to Anthropocentrism instead and to only add a single sentence here if necessary. Thanks, —PaleoNeonate – 19:54, 26 October 2017 (UTC)
List Length Effect Link Suggestion
Currently the List Length Effect entry of the list has a note for Further Explanation Needed. It seems like https://en.wikipedia.org/wiki/Free_recall is related to List Length Effect. Since List Length Effect doesn't redirect to another larger entry I suggest that List Length Effect should be linked to https://en.wikipedia.org/wiki/Free_recall until a complete article on List Length Effect is made.
External links modified
Hello fellow Wikipedians,
I have just modified one external link on List of cognitive biases. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
- Added archive https://web.archive.org/web/20090117084058/http://www.usc.edu/projects/matherlab/pdfs/Matheretal2000.pdf to http://www.usc.edu/projects/matherlab/pdfs/Matheretal2000.pdf
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
- If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
- If you found an error with any archives or the URLs themselves, you can fix them with this tool.
Cheers.—InternetArchiveBot (Report bug) 10:33, 29 December 2017 (UTC)
- Site's not dead, just moved without leaving a redirect. Paradoctor (talk) 14:31, 29 December 2017 (UTC)
In group cognitive bias?
I recently added in-group (favoring members of one's own group) under cognitive bias, and it was reverted as already covered in social biases, i.e. giving preferential treatment to members of their group.
But I think in addition to the social unequal treatment effect there is a cognitive bias block somewhat related to reactive devaluation or confirmation bias — that it is easier to attend to and grant weight to logic and testimony from members of a group one is a member of, and to accept reasoning that fits within the dogma of the group. And difficult to credit or even think of things unrelated to the group, even if they are not in opposition. Just by lack of familiarity and lack of motivation if nothing else. e.g., a random person might neither believe nor value an answer in trigonometry.
Thoughts? Should I consider it a cognitive as well as social? Is there a separate label? Is it a part of confirmation or reactive bias? Markbassett (talk) 01:07, 16 April 2018 (UTC)
Reader's experience and organization suggestions
A frequent reader of this page wrote an article about his experience of it and developed a Cognitive Bias cheat sheet. His experience/critique:
the Wikipedia page is a bit of a tangled mess. Despite trying to absorb the information of this page many times over the years, very little of it seems to stick. I often scan it and feel like I’m not able to find the bias I’m looking for, and then quickly forget what I’ve learned. I think this has to do with how the page has organically evolved over the years. Today, it groups 175 biases into vague categories (decision-making biases, social biases, memory errors, etc) that don’t really feel mutually exclusive to me, and then lists them alphabetically within categories. There are duplicates a-plenty, and many similar biases with different names, scattered willy-nilly.
He further describes his editing process in organizing and trimming this page into his cheat sheet, which might be helpful in organizing and editing this page. Daask (talk) 13:46, 19 May 2018 (UTC)
- As always, WP:WIP, so we can use all the help we can get. A word to the wise, though. My impression is that the literature does not present an orderly consensus categorization scheme. "Organizing" the list could thus easily drift into WP:OR territory, as any categorization or placement into this or that section is a statement about the items placed there. Special attention should be given to source criticism. Paradoctor (talk) 16:13, 19 May 2018 (UTC)
Fallacy of composition
I couldn't find it. Maybe it's hiding under some other name, but since there's a Wikipedia page for Fallacy of composition, I think that key word should be in whatever else it was named as. And I didn't add it, because I'm not confident enough of my skill in classifying it (or the possibility it's already in there somewhere). — Preceding unsigned comment added by 2601:645:4001:8469:4D10:CC67:C8AC:F214 (talk) 00:42, 4 July 2018 (UTC)
- Thank you. Added together with fallacy of division in the entry illicit transference. Paradoctor (talk) 01:15, 4 July 2018 (UTC)
WP:OR - Diagram
- The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Looking at the diagram that currently appears at the top of this article;
It strikes me that there are some pretty serious WP:OR concerns with that chart. Does anyone else feel the same? NickCT (talk) 21:04, 19 September 2018 (UTC)
- Yes. OTOH, one of the image do's is "Try to find at least one image for each article." This certainly does not override WP:OR, but I think, at the very least, the image gives a rather visceral impression of how many biases there are, which does "increase readers' understanding of the article's subject matter". How about clarifying to readers that the classification in the diagram is not based on WP:RS, and should be seen "merely" as a handy overview? We could add a section discussing the diagram. There are doubtlessly many other attempts at organizing, categorizing and classifying the cognitive biases, but I'm not aware of any as comprehensive, which makes it a unique resource. The diagram might even be notable in its own right; the HuffPost piece is likely not the only coverage it got. The OR concerns definitely need to be addressed, but I think we'd not be doing ourselves a favor by simply dropping the diagram. Paradoctor (talk) 02:51, 20 September 2018 (UTC)
- @Paradoctor: - re How about clarifying to readers that the classification in the diagram is not based on WP:RS, and should be seen "merely" as a handy overview? - That strikes me as a potentially reasonable approach. Do you want to take a shot at doing that? NickCT (talk) 14:31, 20 September 2018 (UTC)
- Does this work for us? Paradoctor (talk) 06:21, 22 September 2018 (UTC)
- @Paradoctor: - That looks good. I might try to make it a little less verbose. Maybe "A nonscientific illustration of 180 cognitive biases categorized into a two-tier hierarchy."? NickCT (talk) 14:32, 25 September 2018 (UTC)
- (No need to ping me, I watch where I talk.)
- "looks good" Thanks.
- "Non-scientific" is pejorative, and does not really do Benson's effort justice. Also, the reader could easily get the impression that there is a consensus, and that this "nonscientific" diagram merely ignores it.
- "categorized into a two-tier heirarchy" It should be "organized into", categories are what hierarchies consist of.
- How about
- "An interactive diagram of about 180 cognitive biases organized into a two-tier hierarchy. It should be noted that there is no scientific consensus on how to categorize the biases, and that this particular proposal is not based on peer-reviewed research."?
- Paradoctor (talk) 15:02, 25 September 2018 (UTC)
- I don't really think of "non-scientific" as being pejorative. It just means something which is not based on scientific method.
- After trying a Philly cheese steak last night, I decided it was a superior sandwich to a meatball sub. If you called that conclusion "non-scientific" would you mean it in a pejorative sense?
- Regardless, the more I look at this, the more I think we ought to just scrap the graphic. Ignoring the fact that it's WP:OR, it also just doesn't seem to make sense. Looking at some of the categorizations that are made, it's not clear on what basis biases are categorized. Some of it seems counter-intuitive. I'm worried that as well as not being well sourced, this content may just be plain misleading and/or confusing. NickCT (talk) 17:03, 25 September 2018 (UTC)
- Well, you know where the edit button is. Paradoctor (talk) 17:29, 25 September 2018 (UTC)
- Well! I like to know folks agree with me before editing.
- I take your earlier point re "Image dos", and agree that it would be good if we had something here. That said, it strikes me that no image may be better than a WP:OR and misleading image... NickCT (talk) 17:35, 25 September 2018 (UTC)
- Well, you know where the edit button is. Paradoctor (talk) 17:29, 25 September 2018 (UTC)
- I'm not getting you. You raise an issue, I make a proposal to resolve it, to which you reply "That looks good", with the caveat that you'd prefer a less verbose caption. At this point, one could be forgiven for thinking that the remaining issue is fine-tuning the caption. So I try to address that, to which your reaction suddenly is to say you want the whole thing gone. What the?
- Concerning WP:OR, the whole point of recaptioning was to frame the diagram as an illustration, not as a statement representative of the state of the scientific discussion. One could of course insist that images do that, but then we would, e. g., have to remove all images from bread, because not one of them is taken from peer-reviewed publications. Paradoctor (talk) 20:34, 25 September 2018 (UTC)
- It's an excellent, useful diagram. To echo Paradoctor we'd not be doing ourselves a favour by simply dropping it. The current caption:
"More than 180 cognitive biases organized into a two-tier hierarchy. It should be noted that there is no scientific consensus on how to categorize the biases, and that this particular proposal is not based on peer-reviewed research. This diagram should be considered merely illustrative, rather than authoritative. The individual biases are linked to their correspponding Wikipedia entries."
is more than adequate (spelling error aside) to address WP:OR concerns. Re verbosity, I don't think it is overly so. Maybe remove "particular" and replace "should be considered merely" with "is", and substitute "and not" for "rather than" to be more definite. Otherwise it's fine. So we would have:
"More than 180 cognitive biases organized into a two-tier hierarchy. It should be noted that there is no scientific consensus on how to categorize the biases, and that this proposal is not based on peer-reviewed research. This diagram is illustrative and not authoritative. The individual biases are linked to their corresponding Wikipedia entries."
Captainllama (talk) 12:20, 26 September 2018 (UTC)
- @Paradoctor: - Apologies for waffling. I made my initial comments without having looked closely at the graphic. After a close examination, my opinion changed.
- I think you made the critical point when you quoted "increase readers' understanding of the article's subject matter". If the graphic is so poorly constructed that it won't increase anyone's understanding, why include it? It would be one thing were it a helpful graphic.
- And I think you know that WP:OR doesn't exclude everything that's not from "peer-reviewed" publications. It excludes stuff from randos on the internet, who are usually looking to post shenanigans. NickCT (talk) 12:30, 26 September 2018 (UTC)
- "made my initial comments without having looked closely" No problem. It would have been helpful to know that when it happened, though.
- "WP:OR doesn't exclude everything that's not from "peer-reviewed" publications" I thought that was what I said?
- "If the graphic is so poorly constructed" By what standard? As repeatedly stated, there is currently no scientific consensus on categorization. If anybody knows otherwise, please let us know. In the meantime, this diagram can be considered a best current practice effort, or at least a workaround. If I interpreted the other's comments correctly, I'm not alone with this view. Paradoctor (talk) 17:54, 26 September 2018 (UTC)
- By any logical standard. If I'd created the categories "red", "green" and "blue" and randomly sorted the listed biases into those categories, it would make more sense and be more helpful than the way the current image arranges them. NickCT (talk) 12:12, 27 September 2018 (UTC)
- So you don't like Benson's categories. Fair enough. I do, and at least two other editors seem to be fine with it, too. I don't see much to be gained from further discussion on the basis of personal preference.
- Per your "I like to know folks agree with me before editing": The current WP:LOCALCONSENSUS is on keeping the diagram. If that is not satisfactory to you, WP:DR would be your next stop. Did that myself a couple of times, YMMV. Paradoctor (talk) 16:05, 27 September 2018 (UTC)
- Sounds good. How about an RfC? Would you like to see a draft before I post? NickCT (talk) 18:13, 27 September 2018 (UTC)
- "To make them easier to thing about", it says in the diagram. Shame. —Prhartcom♥ 03:37, 28 September 2018 (UTC)
Should this diagram be kept or removed?
- The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Should this diagram be Kept or Removed?
There's been a conversation (see section above) over whether the diagram appearing at the top of this page should be kept or removed.
Please tell us what you think below!
Thanks in advance for your input. NickCT (talk) 12:59, 28 September 2018 (UTC)
- Remove - As nom. The diagram is aesthetically pleasing but a pretty clear example of original research. Its author and a couple supporters seem to be trying to promote it on Wikipedia. It doesn't seem to aid in the understanding of the subject at hand. NickCT (talk) 13:00, 28 September 2018 (UTC)
- Remove diagram from List of cognitive biases and add
{{Commons category|Cognitive biases}}
at the appropriate location in the list. Biogeographist (talk) 16:02, 28 September 2018 (UTC)
- Remove per previous comments. OR and don’t see how it improves article. †Basilosauridae❯❯❯Talk 05:05, 29 September 2018 (UTC)
- Remove I don't much mind, which probably means that we can do without. JonRichfield (talk) 12:19, 2 October 2018 (UTC)
- Remove per its own caption: There is no scientific consensus on how to categorize the biases. Reading it is very enjoyable, like watching professional wrestling, but neither are real. (I am not watching this page, so please ping me if you want my attention.) wumbolo ^^^ 14:54, 2 October 2018 (UTC)
- Remove - it's a beautifully presented diagram, and someone has obviously invested a lot of effort in making it. However, Paradoctor's comment above, the classification in the diagram is not based on WP:RS convinces me that we need to remove it. If it isn't based on reliable sources, then it must either be based on unreliable sources, original research, or synthesis of sources - whichever it is, it should be removed. GirthSummit (blether) 16:55, 2 October 2018 (UTC)
- Remove as per above RhinosF1 (talk) 05:35, 4 October 2018 (UTC)
- On the fence - It is brilliantly beautiful, a pity that we can't present all sorts of subject matter in this way. WP has pages (and more pages) of b-o-o-o-r-i-n-g boring lists; this format is much more interesting. If the creator is notable enough to merit a WP article, or if there is a WP topic on different ways to organize and display information, either would be an easy yes. But I just dropped in here while looking up RfC re a different WP project that has stalled. GeeBee60 (talk) 15:14, 5 October 2018 (UTC)
- The author of the work does not have his own Wikipedia entry. That wouldn't be enough to warrant inclusion though - the categories he's put these biases in are entirely his own work. Even if he was notable as an individual however, that wouldn't be enough - for his blog to be considered a reliable source, he would need to be a very reputable subject matter expert. GirthSummit (blether) 15:34, 5 October 2018 (UTC)
- It might be an example of artwork to include on "his" page, but at any rate this is hypothetical and a distraction -- sorry. GeeBee60 (talk) 12:26, 6 October 2018 (UTC)
- Remove as dubious original research. The problem with nice pictures is that they introduce cognitive bias :-). Any such classification must have a source and caveats: there are many ways to classify biases. Not to mention that some of them may fall into several categories. Staszek Lem (talk) 20:18, 10 October 2018 (UTC)
- "Nice pictures produce cognitive bias" - Now that's meta. NickCT (talk) 12:26, 11 October 2018 (UTC)
- INCLUDE - Best choice on offer. This seems illustrative, and no binning seems canonical so no objection to this as being contrary to scientific consensus. Lacking proposed alternatives this is Best choice (of one) so use it. I discount OR mentions since the WP article is a constructed set of tables not from RS so is equally OR, and tables lack a prominent caveat about that. Basically there seems no single binning scientifically mandated so anything we write or show will be illustrative of tiered binning which is ad hoc everywhere. The figure is just another example. Web arts commonly do similar non-canonical data visualization and vary in presentation, so this is within common practice. It would be nice if a web pub used the diagram (better yet a book or textbook), but this is excellent and lacking some proposed alternative it is best choice on offer. Markbassett (talk) 11:40, 12 October 2018 (UTC)
- The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Threaded discussion
Just to help others who may (like me) be summoned by a bot, here is the link to a blog where Buster Benson describes how he came up with the categorisations used in the diagram. Blogs are occasionally treated as RS if they are written by a subject matter expert; however, according to his personal website, he is a creative writing graduate of Washington University, who works in web development, and is currently writing a book on managing disagreements - he is not a notable subject-matter expert, he's not an academic, his work has never (afaict) been subject to review. We as editors are not competent to judge the quality of his analysis. If someone were to insert text based on Benson's blog, it would be reverted due to non-RS sources; information presented graphically is subject to the same rules. If his book is published, and is widely praised by notable journals in this area, it might then be considered a reliable source - but as it stands, I just can't see any grounds to include this graphic. GirthSummit (blether) 17:36, 2 October 2018 (UTC)
- @Girth Summit: - Well considered thoughts. NickCT (talk) 19:37, 3 October 2018 (UTC)
@GeeBee60: - Agree with your comments re "brilliantly beautiful" and "b-o-o-o-r-i-n-g boring lists". Unfortunately though, WP is a medium to convey accurate and interesting information to folks. Graphics like these can't survive on aesthetic beauty alone. NickCT (talk) 16:50, 5 October 2018 (UTC)
- @NickCT: - The tricky thing is that something like this can't survive on WP, but the aesthetics coupled with the intellectual effort and pretty clear logic that went into it makes it a natural to survive outside of WP. Understanding comes from a regular reorganizing of our perceptions. Is a list A to Z better than chronological or as a Venn diagram or as a blob or a spiral, or is it simply a rote convention? ANY list of cognitive biases is biased and limited and in some way wrong. Should we have the list? Sure, why not. But I strongly doubt that any WP list is accurate, and even a list with a bunch of citations is original research in that the list was researched by an editor who opted to compile and cite it. If the list is published verbatim from the works of a current researcher, how does this compare to publishing a poem in WP? Or if alternately this is a compilation of multiple editors, prove that this isn't hooey. The bigger error in any of these lists is that the accompanying WP articles will be wildly inconsistent, from excellent to terrible. So, despite all the concerns about the original research, I don't know that this list is made smarter or more accurate or more virtuous by removing the idea-mandala that is under discussion. We are six blind monks groping an elephant.
- Look, I'm not actively involved in maintaining this page and should remove myself from decisions. But you posted it and so there you have it, an alternative point of view. Posting here is kind of a relief considering your query, because I am caught up elsewhere in a long-winded WP debate over whether or not to retitle a controversial article. I ended up here via the RfC page, seeing if there was some special rules and limits to posting a RfC request that seems to be lingering forever in some sort of limbo. I'm not illuminated, but at least I'm a bit relieved. Cheers, GeeBee60 (talk) 13:48, 6 October 2018 (UTC)
- GeeBee60 I think that NickCT and I would agree with you that the diagram has artistic merit, and if it was just a question of presentation then we wouldn't be having this discussion. However, the diagram contains information which is not in the list, and which is not sourced to anything other than the creator's blog (and his own original research/hypothesising). He has categorised the contents of the list into different types of bias, and provided commentary on how they affect us - this content can't stand, no matter how nicely it's presented. None of us gets to put our own musings and hypotheses into Wikipedia, no matter how well-presented or thought-through they are - this isn't the place. GirthSummit (blether) 14:01, 6 October 2018 (UTC)
Time for WP:SNOW close? Unless anyone objects, I'm going to try to get an admin to close this RfC within the next 24 hr. NickCT (talk) 18:20, 11 October 2018 (UTC)
I object ... folks seem to have not explored the OR issue fully. There is no article mention of authoritative organization, of authoritative groupings. So the article itself is an OR tables construct ... and nothing canonical exists so nothing else CAN be done. (Well, one single unlabeled table, but the norm seems folks just make their own groupings.... this article did the same). The article TEXT should probably put in some mention of that, or caveat warnings that the tables are not authoritative or complete. The figure at least has caveat on it. Cheers Markbassett (talk) 11:53, 12 October 2018 (UTC) p.s. NickCT ... the illustration not matching WP lists has no alternative either. Unless the wp article is permanently banning people making edits or putting in info there is going to be a mismatch. But there seems nothing sacred about the tables today or mandate an article diagram and article match in detail or limit each other, so... ? Markbassett (talk) 11:58, 12 October 2018 (UTC)
- K Mark. I'll leave it open. Though I disagree on your "not explored OR" point. Your argument basically seems to be that since you see the article itself as OR, then it shouldn't matter that the graphic is OR. Even if the article was OR, which I don't think it is, your argument basically boils down to "two wrongs make a right". NickCT (talk) 13:26, 12 October 2018 (UTC)
- I've just been pulled into this by the RfC. I agree with the comment that the OR issue hasn't been fully discussed, as I would expect this to be a single ungrouped table when there is no agreed grouping. I don't believe however that the problem with the tables means even more OR should be added. I would suggest closing the RfC and marking the page with needs citation for the groupings FMMonty (talk) 12:26, 19 October 2018 (UTC)
WTF is all this discussion about? Our most basic policy WP:V says that unreferenced material may be deleted at any time. The pic is clearly an unreferenced classification of biases. No local discussion can override the policy. The image is gone. Period. Staszek Lem (talk) 17:14, 12 October 2018 (UTC)
- The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Cognitive Bias Codex Diagram, Continued
Hi, I was happy to learn there had been an active discussion on the Talk page about the diagram (yay!) yet no one tagged me (boo!). Thank you everyone for weighing in, especially GIRTH SUMMIT, PARADOCTOR, GEEBEE60, and everyone else who commented and lent their thoughts. Since I can't participate in a discussion now forebodingly marked Please do not modify, I'd like to re-open the discussion by introducing a few facts that seem to have been missed.
1. From the discussion, it seems, at least for some participants, that the author of the diagram was Buster. This is incorrect. Buster wrote a 2016 blog post, about this very WP article, and I (John) then created the diagram, using open source software, based on ideas in Buster's post. There is a somewhat circuitous chain here, viz. the diagram (mine) based on a blog post (his), which is based on an article (this one).
2. re: the WP:V and WP:OR charges leveled above, I think it's worth considering that the content of the diagram is 100% sourced from two places: Wikipedia (literally, from this article), and Buster's work. Someone noted in the discussion that Buster is not an academic, therefore his writing is not a reliable source. FWIW, the post and the diagram received a lot of attention online prior to inclusion here on Wikipedia, and the diagram and post have been syndicated in Huff-Post science and Atlantic Monthly's online arm, "Quartz." Buster's 2016 work is also being published as a book currently being written and documented using an open process, as someone else noted. Diagrams and art on WP help illustrate and illuminate the articles. Is there any disagreement that that's what this diagram does, or did, before it was removed?
3. Aesthetic merits aside, from a notability perspective, the diagram has been well received enough to be voluntarily translated from the original English into French, Arabic, Turkish, Japanese, Russian, and Ukrainian, by unsolicited volunteers. Wikipedians have found it valuable, and have spent their own time, often painstakingly, to translate and partially recreate it repeatedly in their native language for their own language Wikipedias. I did not anticipate this, and while I'm obviously biased, I believe it says something about the value that Wikipedians in particular and internet users in general receive from the diagram.
4. Prior to being included in the article here, the diagram had already been cited in academic journals — researchers have written semi-regularly asking for permission to include or cite it in various ways. Requests for academic citation and use that have come across my desk include the journals Conservation Biology, Advances in Chronic Kidney Disease, and The Avalanche Review, fields about which I know nothing, but whose authors felt their point was bolstered by including the diagram. I'm not sure whether these are considered "notable / verifiable / RS etc." by the editors above, but they seem to far exceed the base criteria.
I would be encouraged if the discussion were to continue in light of the information above, which was previously absent. If I've misrepresented, misstated, or misconstrued things, please correct me. TIA Jm3 (talk) 08:53, 21 November 2018 (UTC)
- @Jm3: Hi - thanks for contributing to this. I think that what you've described above confirms my concerns about the inclusion of original research.
- You created the diagram based on a novel synthesis by Buster, published on his blog. His post may have been based on this article, but he introduced novel elements, such as his categorisations of the biases listed in the article. This is his own original work, which is the very definition of WP:OR, meaning that it's not a straightforward illustration of information presented in the article. If that original work hasn't been published by a reliable secondary source (his blog is not such a source), then we can't use it.
- If, since you created it, the work has been widely published by reliable sources, that might allow us to use it. These sources would need to be relevant and competent, however - Advances in Chronic Kidney Disease is probably a very reliable source for assertions about chronic kidney disease, but I don't see how we could rely on it for assertions about cognitive biases, which would seem to fall more into the realm of psychology. Similarly, I would argue that HuffPost Science is too broad and shallow to be reliable on its own - it's not peer-reviewed by experts in the field.
- Buster's own unpublished book would not qualify as a reliable source - even when it's published, he is not a noted expert in the field (e.g. an academic with a track record of peer-reviewed work on the subject), so it would still be considered a non-expert primary source. We would need it to be reviewed by reliable secondary sources before we could use it.
- FWIW, I agree with you that the diagram is aesthetically pleasing, and illustrated the article well. I would have no qualms about including it, if it were not for the fact that it also contains Buster's novel categorisations which are not a part of the article (and clearly constitute WP:OR).
- If the diagram, or Buster's synthesis upon which it was based, have been reviewed and discussed in secondary sources that are focused on psychology, please could you provide citations here for review? Then we'll have some sources that we could cite for the work, and I would have no problem with including it. GirthSummit (blether) 11:19, 21 November 2018 (UTC)
Replication crisis in psychology
Given the Replication crisis in psychology, how many of the listed cognitive biases have been scientifically reproduced? Maybe a column could be added to the table, showing which of the mentioned biases have been replicated and are undisputed, and which ones are still waiting to be replicated. — Preceding unsigned comment added by Amaier (talk • contribs) 09:43, 19 June 2018 (UTC)
- @Amaier: I think the verifiability of each of the cognitive biases is an important thing. If a column were to be included detailing the replication status of each of the biases, would it make sense to simply include a few citations? —The Editor's Apprentice (Talk•Edits) 04:26, 9 December 2018 (UTC)
- @The Editor's Apprentice: Good idea. That definitely would make sense.
Facts
Would this be relevant here? Benjamin (talk) 13:52, 14 April 2019 (UTC)
- No. It might be of interest to cognitive bias or rationality, though. Paradoctor (talk) 15:21, 14 April 2019 (UTC)
subdivide categories on page?
Right now, most of the categories in this list are quite large, making it hard for the casual user to quickly find what they're looking for. Is there any way we could work to subdivide the sections into easier-to-find sub-categories? Yitz (talk) 03:25, 3 March 2019 (UTC)
- From the lead: "there are often controversies about how to classify these biases or how to explain them". Which makes it difficult to categorize beyond broad strokes. Of course, if you, or anybody else, can come up with good sources which subcategorize the existing categories, and which are not contradicted by other good sources, then yes, of course. As it is, identifying biases from instances requires a good deal of familiarity with the lot of them. Paradoctor (talk) 07:51, 3 March 2019 (UTC)
- Are all of these cognitive biases? I added the Pygmalion Effect but don't know if it falls into the category of a cognitive bias necessarily. — Preceding unsigned comment added by 2601:184:417F:53F3:F957:B2C4:701C:2402 (talk) 02:35, 13 June 2019 (UTC)
Cunningham's Law
Cunningham's_Law: "People are quicker to correct a wrong answer than to answer a question." I'm not sure if this is a cognitive bias or not, but it kind of feels like one. There is an article about it here: [6] 118.200.76.228 (talk) 06:43, 18 June 2019 (UTC)
Selection bias definition
I think the definition given for selection bias refers more to the Baader-Meinhof effect than to selection bias itself. This is obviously one of the better-known biases (at least in statistics), so I'm surprised it's that far off base. Just wanted to throw it out there. I'll let someone far more qualified than me take a look and make an edit if appropriate. Just my two cents.
198.22.92.40 (talk) 20:26, 10 July 2019 (UTC)Ben
Not sure how to use the talk page here, but I totally agree with Ben. Even the main article Selection bias doesn't really mention anything close to the definition on this page.
Egocentric bias is found twice on the page
I was thinking about some of the biases that repeat themselves. For example, Egocentric bias is mentioned twice, as it is both a memory bias and a social bias. That's fine, but it's a bit confusing, as one would probably expect to read each bias only once. Another thing is that there are similar biases with different names. It's probably too big a job for us to group them. But there are grouping ideas out there, for example MINDSPACE. Or maybe it won't work in a spreadsheet-style list anyhow; on page 80 they show their map, and it's quite complicated: https://www.bi.team/publications/mindspace/ I have sorted both this Wikipedia bias list and their own list in an Excel spreadsheet using MINDSPACE. But it looks more business-oriented than Wikipedia-list-oriented in my opinion. JurijFedorov (talk) 09:07, 21 October 2019 (UTC)
Culture dependence
I know that Western culture(s) often treat what is, or may be, information/beliefs/behavior as universal human phenomena when, in fact, they are culturally based. The "standard IQ test", for example, was initially shown to have cultural bias (and likely still does to some extent). Have any of these cognitive biases been tested cross-culturally (or across other identities such as gender or age) to see whether any of them are culture-specific, or even whether there exist cognitive biases in other cultures that aren't found in Western culture(s)? — Preceding unsigned comment added by 66.41.236.23 (talk) 2019-11-22T20:33:55 (UTC)
- Welcome to Wikipedia. Please understand that this can't be discussed here, per Wikipedia:Talk page guidelines:
- "Talk pages are for discussing the article, not for general conversation about the article's subject (much less other subjects). Keep discussions focused on how to improve the article. If you want to discuss the subject of an article, you can do so at Wikipedia:Reference desk instead."
- HTH, Paradoctor (talk) 20:54, 22 November 2019 (UTC)
I wanna give a barnstar to everybody who's been working on this article's brain map image
You all have made a list of some 210 cognitive biases! Someone has been busy! :) Congratulations! If you can point me to the team / person to congratulate, I'm glad to give a barnstar badge :) SvenAERTS (talk) 03:36, 22 January 2020 (UTC)
Add indicator for level of scientific proof
It seems to me that there are varying levels of scientific rigor represented in these tables. I think it would be good to have a column in the table which indicates the strength of the body of scientific proof for each bias: if there's only speculation of a bias, with maybe one experiment which started the speculation, or if it is based purely on common-sense observations, it'll be labeled "weak"; if there are a few experiments which seem to confirm an effect, it'll be labeled "moderate"; and if it has been repeatedly tested and demonstrated, it can be labeled as having strong evidence. I'll try adding some myself, but I don't expect to get them all. -Ramzuiv (talk) 17:35, 17 February 2020 (UTC)
- I don't understand what claim this column is meant to rate. Biases are phenomena, not theories. A phenomenon may be rare or frequent, but that doesn't relate to whether it is encyclopedic. If sufficient literature discussing it exists, the phenomenon belongs on the list.
- Strength of evidence is about theories explaining phenomena. There may be multiple or no theories for any given bias, and each would come with its own rating. But these ratings are not our decision; that would be WP:OR. If there is relevant literature expressly gauging the strength of the evidence for the various theories explaining this or that bias, and this for more than a handful of entries, then I'd support a "Strength of evidence" column. As long as we don't have that, we can comfortably add such information to the description column.
- There is also the question of how much detail we want to add to entries in general, seeing as how almost all of them have articles.
- For those without articles, there is likely not enough material for the ratings anyway. Paradoctor (talk) 19:32, 17 February 2020 (UTC)
- I'm not talking about theories which seek to explain biases. At the end of the day, the very question of whether a phenomenon exists is itself a theory. The statement that a phenomenon exists is a statement which is either true or false - and we should be wary that someone will see a bias listed here and, based on the fact that they saw it in an encyclopedia, assume that it is a real thing. It is very much our job, as editors, to make sure that this doesn't happen. My understanding is that there are phenomena listed here which have been speculatively discussed in the literature (which, by your criteria, merits inclusion) but which haven't actually been scientifically tested to a level of rigour such that a person would be well advised to believe they are real (which has nothing to do with a bias being rare or common - a rare bias can be rigorously tested if it is real). I'm uncomfortable with the prospect that such information would exist side by side with real, scientifically tested phenomena without anything to distinguish what a reader should, and should not, take for granted (a reader should not be expected to click a link to confirm that something is legit). Perhaps the solution isn't to place a column to indicate that a phenomenon has been rigorously tested and demonstrated. Another solution may be to have a list of speculated biases which is separate from the other lists, to allow a reader to safely assume that the biases listed in the main lists are real things. Or perhaps we could make sure that all the biases listed here are tested, demonstrable biases, and remove any biases that are mainly speculative. - Ramzuiv (talk) 05:45, 19 February 2020 (UTC)
- Ramzuiv it seems to me that your concern is adequately addressed by Paradoctor's comment that "we can comfortably add it to the description column". Cheers! Captainllama (talk) 12:51, 19 February 2020 (UTC)
a reader should not be expected to click a link to confirm that something is legit
- So what do you think those little blue numbers in square brackets behind sentences are? ;)
- More to the point, an entry is "legit" when the literature discusses it. If you're worried that readers might make false assumptions about epidemiology, nobody is stopping you from adding a sourced statement like "{{As of}} <current date>, this bias has not been observed in the wild[616]" to its description, or some such. Paradoctor (talk) 19:28, 19 February 2020 (UTC)
So what do you think those little blue numbers in square brackets behind sentences are? ;)
Touché. And yeah, that should work. - Ramzuiv (talk) 19:56, 19 February 2020 (UTC)
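For illustration, a rough sketch of what the sourced note suggested above might look like inside one of the article's wikitables, assuming the current name/description column layout; the bias name, date, and reference below are placeholders rather than actual article content:

{| class="wikitable"
! Name !! Description
|-
| [[Example bias]] || Tendency to do X rather than Y. {{As of|2020|02}}, this bias has not been independently replicated.<ref>placeholder citation</ref>
|}

The {{As of}} template keeps the date maintainable, and the inline reference keeps the claim verifiable per WP:V.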
Missing Bias(es)?
I just learned about a bias titled "Truth Bias", and was surprised not to see it on this list. Any reason not to include this bias? Jdmumma (talk) 01:46, 9 March 2020 (UTC)
- Go for it. Looks like a worthy addition to Social biases. Please don't capitalize, i.e. "truth bias", not "Truth Bias". Paradoctor (talk) 03:24, 9 March 2020 (UTC)
How to make sure the new cognitive biases are taken up in the beautiful and interactive brainmap?
Thy, SvenAERTS (talk) 15:43, 13 April 2020 (UTC)
- Be WP:BOLD. Open the SVG in a text editor, add the required lines, and upload the updated version on Commons. Paradoctor (talk) 15:50, 13 April 2020 (UTC)
- Ah, thx. I'm progressing. So on https://commons.wikimedia.org/wiki/File:Cognitive_bias_codex_en.svg there's a "Source (SVG with JavaScript)" link - so you mean we should copy/paste that code, but to where?
I notice in the list of pages where the image is used that the German Wikipedia just uses the English version of the image and didn't bother to translate it. Thy, SvenAERTS (talk) 16:10, 13 April 2020 (UTC)
- Directly below the image you'll find the line "Original file (SVG file, nominally 1,900 × 1,500 pixels, file size: 78 KB)". Right-click the link, and save the file to your computer. You should then have a file named "Cognitive_bias_codex_en.svg" on your computer. This is a text file you can open with Notepad if you're running Windows, but any old text editor should do. When done editing, upload the new version, and you're done. HTH Paradoctor (talk) 16:24, 13 April 2020 (UTC)
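To make the "add the required lines" step a bit more concrete: I don't know the exact internal structure of the Codex file, but SVG text labels are generally <text> elements, so adding a new bias would most likely mean copying the markup of a neighbouring entry and adjusting it. A purely illustrative sketch, in which the coordinates, class name, and link target are placeholders and not values taken from the actual file:

<!-- hypothetical example only: copy the attributes of an existing entry in the real file -->
<a xlink:href="https://en.wikipedia.org/wiki/Truth_bias">
  <text x="950" y="420" class="bias-label">truth bias</text>
</a>

When the edited file is uploaded over the existing one on Commons ("Upload a new version of this file" on the file page), every wiki that embeds the image picks up the change.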
Removal of the graphic
I've removed the graphic from the article again. It was removed after consensus was established here that it was based on OR, and so should not be used in the article. I see no discussion overturning that consensus, and I note that it was reinserted by a seldom-used account last year. I am aware of some canvassing via Twitter by the creator of the image asking for help getting the image reinserted into this article - someone appears to have obliged, so I am removing it again. Please don't reinstate it without gaining consensus here to do so. GirthSummit (blether) 16:00, 27 April 2020 (UTC)