Heuristics (from Ancient Greek εὑρίσκω, heurískō, "I find, discover") is the process by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals,[1][2][3] organizations,[4] and even machines[5] use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution.[6][7][8][2] While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate.[9] Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete.[10] In that sense they can differ from answers given by logic and probability.
The economist and cognitive psychologist Herbert A. Simon introduced the concept of heuristics in the 1950s, suggesting there were limitations to rational decision making. In the 1970s, psychologists Amos Tversky and Daniel Kahneman added to the field with their research on cognitive bias. It was their work that introduced specific heuristic models, a field which has only expanded since. While some argue that pure laziness is behind the heuristics process, this could simply be a convenient explanation for why people do not act the way we expect them to.[11] Other theories argue that heuristics can be more accurate than decisions based on every known factor and consequence, as in the less-is-more effect.[12]
History
Herbert A. Simon formulated one of the first models of heuristics, known as satisficing. His more general research program posed the question of how humans make decisions when the conditions for rational choice theory are not met, that is, how people decide under uncertainty.[13] Simon is also known as the father of bounded rationality, which he understood as the study of the match (or mismatch) between heuristics and decision environments. This program was later extended into the study of ecological rationality.
In the early 1970s, psychologists Amos Tversky and Daniel Kahneman took a different approach, linking heuristics to cognitive biases. Their typical experimental setup consisted of a rule of logic or probability, embedded in a verbal description of a judgement problem, and demonstrated that people's intuitive judgement deviated from the rule. The "Linda problem" below gives an example. The deviation is then explained by a heuristic. This research, called the heuristics-and-biases program, challenged the idea that human beings are rational actors and first gained worldwide attention in 1974 with the Science paper "Judgment Under Uncertainty: Heuristics and Biases".[14] Although the originally proposed heuristics have been refined over time, this research program has changed the field by permanently setting the research questions.[15]
The original ideas by Herbert Simon were taken up in the 1990s by Gerd Gigerenzer and others. According to their perspective, the study of heuristics requires formal models that allow predictions of behavior to be made ex ante. Their program has three aspects:[16]
- What are the heuristics humans use? (the descriptive study of the "adaptive toolbox")
- Under what conditions should humans rely on a given heuristic? (the prescriptive study of ecological rationality)
- How to design heuristic decision aids that are easy to understand and execute? (the engineering study of intuitive design, for example with human-centered design or user-centered design approaches.)
Among others, this program has shown that heuristics can lead to fast, frugal, and accurate decisions in many real-world situations that are characterized by uncertainty.[17][18]
These two different research programs have led to two kinds of models of heuristics, formal models and informal ones. Formal models describe the decision process in terms of an algorithm, which allows for mathematical proofs and computer simulations. In contrast, informal models are verbal descriptions.
Formal models of heuristics
List of formal models of heuristics:
- Elimination by aspects heuristic
- Fast-and-frugal trees
- Fluency heuristic
- Gaze heuristic
- Recognition heuristic
- Satisficing
- Similarity heuristic
- Take-the-best heuristic
- Tallying
Simon's satisficing strategy
Herbert Simon's satisficing heuristic can be used to choose one alternative from a set of alternatives in situations of uncertainty.[19] Here, uncertainty means that the total set of alternatives and their consequences is not known or knowable. For instance, professional real-estate entrepreneurs rely on satisficing to decide in which location to invest to develop new commercial areas: "If I believe I can get at least x return within y years, then I take the option."[20] In general, satisficing is defined as:
- Step 1: Set an aspiration level α
- Step 2: Choose the first alternative that satisfies α
If no alternative is found, then the aspiration level can be adapted.
- Step 3: If after time β no alternative has satisfied α, then decrease α by some amount δ and return to step 1.
Satisficing has been reported across many domains, for instance as a heuristic car dealers use to price used BMWs.[21]
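A minimal Python sketch of the satisficing procedure defined above; the candidate options, aspiration level, inspection deadline, and decrement are all illustrative assumptions, not values from the text:

```python
def satisfice(alternatives, aspiration, deadline, decrement):
    """Pick the first alternative meeting the aspiration level (alpha).

    `alternatives` yields (name, value) pairs in the order they become
    available; after every `deadline` inspections without success, the
    aspiration level is lowered by `decrement` (adapting alpha, step 3).
    """
    inspected = 0
    for name, value in alternatives:
        if value >= aspiration:          # step 2: first satisfactory option wins
            return name
        inspected += 1
        if inspected % deadline == 0:    # step 3: adapt alpha after time beta
            aspiration -= decrement
    return None                          # no alternative ever met the aspiration

# Illustrative use: "take any location promising at least an 8% return"
locations = [("site A", 0.05), ("site B", 0.07), ("site C", 0.09)]
print(satisfice(locations, aspiration=0.08, deadline=2, decrement=0.01))  # site C
```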
Elimination by aspects
Unlike satisficing, Amos Tversky's elimination-by-aspects heuristic can be used when all alternatives are simultaneously available. The decision-maker gradually reduces the number of alternatives by eliminating those that do not meet the aspiration level of a specific attribute (or aspect).[22] People tend to experience uncertainty and exhibit inconsistency when making a series of selections, and elimination by aspects offers a systematic way to handle such choices. In general, the process is as follows:
- Step 1: Select one attribute related to decision making
- Step 2: Eliminate all alternatives that do not meet the aspiration level on this attribute
- Step 3: Use another attribute in order to further eliminate alternatives
- Step 4: Repeat step 3 until only one option is left; a decision has then been made
Elimination by aspects does not assume that choosing among alternatives helps consumers maximize utility; on the contrary, it holds that selection is the result of a probabilistic process that gradually eliminates alternatives.[22] A simple example is given by Amos Tversky: when someone wants to purchase a new car, the first aspect they take into account might be an automatic transmission, which eliminates all alternatives without that feature. Once those alternatives are eliminated, another aspect is applied, such as a $3000 price limit. The process of elimination continues until only one alternative remains.[22]
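A deterministic Python sketch of this aspect-by-aspect screening, using hypothetical cars and criteria that echo Tversky's example (his full model selects which aspect to apply probabilistically, in proportion to its weight):

```python
def eliminate_by_aspects(alternatives, aspects):
    """Return the surviving alternatives after screening aspect by aspect.

    `alternatives` maps names to attribute dicts; `aspects` is a list of
    (attribute, test) pairs examined in order.  Names and criteria are
    illustrative, not data from the text.
    """
    remaining = dict(alternatives)
    for attribute, test in aspects:
        survivors = {name: attrs for name, attrs in remaining.items()
                     if test(attrs[attribute])}
        if survivors:                 # keep the previous set if nothing would survive
            remaining = survivors
        if len(remaining) == 1:       # stop once a single option is left
            break
    return list(remaining)

cars = {
    "car A": {"automatic": True,  "price": 2800},
    "car B": {"automatic": False, "price": 2500},
    "car C": {"automatic": True,  "price": 3400},
}
aspects = [("automatic", lambda v: v),           # keep only automatics
           ("price",     lambda p: p <= 3000)]   # then apply the $3000 limit
print(eliminate_by_aspects(cars, aspects))       # ['car A']
```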
Elimination by aspects is widely used in the early stage of business angels' decision-making, since it provides a fast decision-making tool: alternatives are eliminated as soon as investors find a critical defect in a potential opportunity.[23] Other research has demonstrated that elimination by aspects is also widely used in electricity contract choice.[24] The logic behind these two examples is that elimination by aspects helps to make decisions when facing a series of complicated choices. One may need to decide among many alternatives with only limited intuitive computational capacity and time. Elimination by aspects, as a noncompensatory model, can support such complex decisions because it is easy to apply and involves nonnumerical computations.[22]
Recognition heuristic
The recognition heuristic exploits the basic psychological capacity for recognition in order to make inferences about unknown quantities in the world. For two alternatives, the heuristic is:[12]
If one of two alternatives is recognized and the other not, then infer that the recognized alternative has the higher value with respect to the criterion.
For example, in the 2003 Wimbledon tennis tournament, Andy Roddick played Tommy Robredo. If one has heard of Roddick but not of Robredo, the recognition heuristic leads to the prediction that Roddick will win. The recognition heuristic exploits partial ignorance; if one has heard of both players or of neither, a different strategy is needed. Studies of Wimbledon 2003 and 2005 have shown that the recognition heuristic, applied by semi-ignorant amateur players, predicted the outcomes of all gentlemen's singles matches as well as or better than the seedings of the Wimbledon experts (who had heard of all players) and the ATP rankings.[25][26] The recognition heuristic is ecologically rational (that is, it predicts well) when the recognition validity is substantially above chance. In the present case, recognition of players' names is highly correlated with their chances of winning.[27]
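A minimal Python sketch of the rule; the set of recognized names is an illustrative assumption standing in for an amateur's partial knowledge:

```python
def recognition_heuristic(a, b, recognized):
    """Infer which of two alternatives has the higher criterion value.

    Applies only under partial ignorance: exactly one of the two names is
    recognized.  If both or neither are recognized, the heuristic is silent
    and a different strategy is needed.
    """
    if (a in recognized) == (b in recognized):
        return None
    return a if a in recognized else b

recognized_players = {"Roddick"}   # a semi-ignorant amateur's knowledge
print(recognition_heuristic("Roddick", "Robredo", recognized_players))  # Roddick
```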
Take-the-best
The take-the-best heuristic exploits the basic psychological capacity for retrieving cues from memory in the order of their validity. Based on the cue values, it infers which of two alternatives has a higher value on a criterion.[28] Unlike the recognition heuristic, it requires that all alternatives are recognized, and it thus can be applied when the recognition heuristic cannot. For binary cues (where 1 indicates the higher criterion value), the heuristic is defined as:
- Search rule: Search cues in the order of their validity
- Stopping rule: Stop search on finding the first cue that discriminates between the two alternatives (i.e., one alternative has cue value 1 and the other 0).
- Decision rule: Infer that the alternative with the positive cue value (1) has the higher criterion value.
The validity v_i of a cue i is defined as the proportion of correct inferences it yields:

v_i = c_i / t_i

where c_i is the number of correct inferences made by cue i and t_i is the number of cases in which the values of the two alternatives differ on cue i. The validity of each cue can be estimated from samples of observations.
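A minimal Python sketch of the three rules, using hypothetical binary cues for two alternatives; the cue names, values, and validity order are illustrative assumptions:

```python
def take_the_best(alt_a, alt_b, cues_by_validity):
    """Infer which of two alternatives has the higher criterion value.

    `alt_a` and `alt_b` map cue names to binary values (1 = signal for the
    higher criterion value); `cues_by_validity` lists cue names ordered by
    their estimated validity v_i.
    """
    for cue in cues_by_validity:                     # search rule
        if alt_a[cue] != alt_b[cue]:                 # stopping rule: first discriminating cue
            return "A" if alt_a[cue] == 1 else "B"   # decision rule
    return None                                      # no cue discriminates: guess

city_a = {"capital": 1, "has_team": 1, "on_river": 0}
city_b = {"capital": 1, "has_team": 0, "on_river": 1}
print(take_the_best(city_a, city_b, ["capital", "has_team", "on_river"]))  # A
```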
Take-the-best has remarkable properties. In comparisons with complex machine learning models, it has been shown that it can often predict better than regression models,[29] classification-and-regression trees, neural networks, and support vector machines (Brighton & Gigerenzer, 2015).
Similarly, psychological studies have shown that in situations where take-the-best is ecologically rational, a large proportion of people tend to rely on it. This includes decision making by airport customs officers,[30] professional burglars and police officers,[31] and student populations.[32] The conditions under which take-the-best is ecologically rational are mostly known.[33] Take-the-best shows that the earlier view, according to which ignoring part of the information is generally irrational, is incorrect. Less can be more.
Fast-and-frugal trees
A fast-and-frugal tree is a heuristic that allows classifications to be made,[34] such as whether a patient with severe chest pain is likely to have a heart attack or not,[35] or whether a car approaching a checkpoint is likely to be a terrorist or a civilian.[36] It is called "fast and frugal" because, just like take-the-best, it allows for quick decisions based on only a few cues or attributes. It is called a "tree" because it can be represented like a decision tree in which one asks a sequence of questions. Unlike a full decision tree, however, it is an incomplete tree – to save time and reduce the danger of overfitting.
Figure 1 shows a fast-and-frugal tree used for screening for HIV (human immunodeficiency virus). Just like take-the-best, the tree has a search rule, stopping rule, and decision rule:
- Search rule: Search through cues in a specified order.
- Stopping rule: Stop search if an exit is reached.
- Decision rule: Classify the person according to the exit (here: No HIV or HIV).
In the HIV tree, an ELISA (enzyme-linked immunosorbent assay) test is conducted first. If the outcome is negative, then the testing procedure stops and the client is informed of the good news, that is, "no HIV." If, however, the result is positive, a second ELISA test is performed, preferably from a different manufacturer. If the second ELISA is negative, then the procedure stops and the client is informed of having "no HIV." However, if the result is positive, a final test, the Western blot, is conducted.
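A Python sketch of this screening sequence, with each test outcome passed in as a boolean; it simply encodes the exit structure described above and is not clinical guidance:

```python
def hiv_screening_tree(elisa_1, elisa_2, western_blot):
    """Classify a client with the fast-and-frugal HIV screening tree.

    Cues are examined in a fixed order (search rule); the search stops as
    soon as an exit is reached (stopping rule); the exit determines the
    classification (decision rule).
    """
    if not elisa_1:          # first exit: a negative ELISA ends the search
        return "no HIV"
    if not elisa_2:          # second exit: confirmatory ELISA negative
        return "no HIV"
    return "HIV" if western_blot else "no HIV"   # final cue has two exits

print(hiv_screening_tree(elisa_1=True, elisa_2=True, western_blot=False))  # no HIV
```

With three binary cues the function has four exits, matching the n + 1 rule stated below.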
In general, for n binary cues, a fast-and-frugal tree has exactly n + 1 exits – one for each cue and two for the final cue. A full decision tree, in contrast, requires 2^n exits. The order of cues (tests) in a fast-and-frugal tree is determined by the sensitivity and specificity of the cues, or by other considerations such as the costs of the tests. In the case of the HIV tree, the ELISA is ranked first because it produces fewer misses than the Western blot test, and also is less expensive. The Western blot test, in contrast, produces fewer false alarms. In a full tree, in contrast, order does not matter for the accuracy of the classifications.
Fast-and-frugal trees are descriptive or prescriptive models of decision making under uncertainty. For instance, an analysis of court decisions reported that the best model of how London magistrates make bail decisions is a fast-and-frugal tree.[37] The HIV tree is both prescriptive – physicians are taught the procedure – and descriptive, that is, most physicians actually follow the procedure.
Tallying
Tallying is a heuristic that considers the most viable choice in a decision-making problem to be the one which outperforms its alternatives across most identifiable measures and criteria.[38]
As opposed to the take-the-best heuristic, which considers weighted values when assessing the importance of the specific aspects (cues) involved in a choice, a person who tallies considers all available aspects of the alternatives with equal weight and chooses the option with the most aspects in its favour.[4]
In this sense, tallying differs from the take-the-best heuristic, as the latter discriminates based on the value applied to each aspect and can therefore lead to opposing results.[39]
To illustrate this, consider a prediction of whether Team A or Team B will be more successful in the upcoming basketball season. Team A is superior on three of the four aspects that contribute to team success, but the one aspect on which Team B is superior is weighted as objectively more important than the others. The tallying heuristic would predict greater success for Team A because it outperforms on most measures, whereas take-the-best would use the weighted value of the single aspect on which Team B is superior and predict that Team B would be the more successful.
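A minimal Python sketch of tallying for this basketball example; the cue scores are made up so that Team A is better on three of the four cues:

```python
def tally(option_a, option_b):
    """Choose between two options by counting, with equal weights, the number
    of cues on which each option is better, ignoring any cue importance."""
    wins_a = sum(a > b for a, b in zip(option_a, option_b))
    wins_b = sum(b > a for a, b in zip(option_a, option_b))
    if wins_a == wins_b:
        return "tie"
    return "Team A" if wins_a > wins_b else "Team B"

# Team A is better on three of four cues; Team B only on the (heavily
# weighted) fourth cue, which tallying deliberately ignores.
team_a = [7, 8, 6, 3]
team_b = [5, 6, 4, 9]
print(tally(team_a, team_b))   # Team A
```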
Informal models of heuristics
In their initial research, Tversky and Kahneman proposed three heuristics—availability, representativeness, and anchoring and adjustment. Subsequent work has identified many more. Heuristics that underlie judgment are called "judgment heuristics". Another type, called "evaluation heuristics", are used to judge the desirability of possible choices.[40]
List of informal models of heuristics:
- Affect heuristic: a mental shortcut in which emotion influences the decision. Affect plays the leading role, allowing the decision to be made or the problem to be solved quickly and efficiently. It is used while judging the risks and benefits of something, depending on the positive or negative feelings that people associate with a stimulus. It can also be considered a gut decision: if the gut feeling is positive, the benefits are judged to be high and the risks low.[41]
- Anchoring and adjustment: describes the common human tendency to rely more heavily on the first piece of information offered (the "anchor") when making decisions. For example, in a study done with children, the children were told to estimate the number of jellybeans in a jar. Groups of children were given either a high or low "base" number (anchor). Children estimated the number of jellybeans to be closer to the anchor number that they were given.[42]
- Availability heuristic: a mental shortcut that occurs when people make judgements about the probability of events based on the ease with which examples come to mind. For example, in a 1973 Tversky & Kahneman experiment, the majority of participants reported that there were more words in the English language that start with the letter K than words in which K is the third letter. There are actually twice as many words in the English language that have K as the third letter as there are words that start with K, but words that start with K are much easier to recall and bring to mind.[43]
- Balance heuristic: applies when an individual balances the negative and positive effects of a decision, which makes the choice obvious.[44] It is a mental shortcut that helps individuals achieve peace and harmony in their lives while simultaneously attempting to avoid the potential risks or consequences of a decision.[45]
- Base rate heuristic: when a decision involves probability, this is a mental shortcut that uses relevant data to determine the probability of an outcome occurring. When using this heuristic, individuals commonly misjudge the likelihood of a situation. For example, if there is a test for a disease which has an accuracy of 90%, people may think there is a 90% chance they have the disease, even though the disease only affects 1 in 500 people (a worked sketch of this calculation follows this list).[46]
- Common sense heuristic: used frequently by individuals when the potential outcomes of a decision appear obvious. For example, when your television remote stops working, you would probably change the batteries.[44]
- Contagion heuristic: follows the law of contagion or similarity. It leads people to avoid others who appear "contaminated" to the observer, and to seek out objects associated with what seems good. Things the observer views as harmful may not actually be so, which can lead to irrational thinking on the part of the observer.[47]
- Default heuristic: in real world models, it is common for consumers to apply this heuristic when selecting the default option regardless of whether the option was their preference.[48]
- Educated guess heuristic: when an individual responds to a decision using relevant information they have stored relating to the problem.[49]
- Effort heuristic: the worth of an object is judged by the amount of effort put into producing it. Objects that took longer to produce are seen as more valuable, while those that took less time are deemed less valuable. The heuristic also applies to the effort expended in obtaining an object: the same object will seem less valuable if it was found on the side of the street than if it was earned through work.
- Escalation of commitment: describes the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the cost, starting today, of continuing the decision outweighs the expected benefit. This is related to the sunk cost fallacy.
- Fairness heuristic: applies to the reaction of an individual to a decision from an authoritative figure. If the decision is enacted in a fair manner, the individual is more likely to comply voluntarily than if it is enacted unfairly.[50]
- Familiarity heuristic: a mental shortcut applied to various situations in which individuals assume that the circumstances underlying the past behavior still hold true for the present situation and that the past behavior thus can be correctly applied to the new situation. Especially prevalent when the individual experiences a high cognitive load.[51]
- Naïve diversification: when asked to make several choices at once, people tend to diversify more than when making the same type of decision sequentially.
- Peak–end rule: a person's subjective perceptions during the most intense and final moments of an event are averaged together into a single judgment.[52] For example, a person might judge the difficulty of a workout by taking into consideration only the most demanding part of the workout (e.g., Tabata sprints) and what happens at the very end (e.g., a cool-down). In this way, a difficult workout such as the one described here could be perceived as "easier" than a more relaxed workout that did not vary in intensity (e.g., 45 minutes of cycling in aerobic zone 3, without cool-down).
- Representativeness heuristic: a mental shortcut used when making judgements about the probability of an event under uncertainty. Or, judging a situation based on how similar the prospects are to the prototypes the person holds in his or her mind. For example, in a 1982 Tversky and Kahneman experiment,[53] participants were given a description of a woman named Linda. Based on the description, it was likely that Linda was a feminist. Eighty to ninety percent of participants, choosing from two options, chose that it was more likely for Linda to be a feminist and a bank teller than only a bank teller. The likelihood of two events cannot be greater than that of either of the two events individually. For this reason, the representativeness heuristic is exemplary of the conjunction fallacy.[43]
- Scarcity heuristic: as in economics, the scarcer an object or event is, the more value is attributed to the object or event. The lack of abundance is an indicator of value and provides a mental shortcut that influences the subjective valuation based on how easily the thing might be replaced or lost to competitors. The scarcity heuristic is a cognitive rule that the more difficult it is to acquire an item, the more value that item must have. In many situations we use an item's availability, its perceived abundance, to quickly estimate quality and/or utility. This can lead to systematic judgement errors or cognitive bias.[54]
- Simulation heuristic: a simplified mental strategy in which people determine the likelihood of an event based on how easy it is to mentally picture the event happening. People feel more regret over outcomes that are easier to imagine than over ones that are harder to picture, and they also use this heuristic to predict the likelihood of another person's behaviour. This suggests that people constantly simulate events around them in order to predict their likelihood; they are thought to do this by mentally undoing events they have experienced and then running mental simulations of the events with the corresponding input values of the altered model.[55]
- Social proof: also known as the informational social influence which was named by Robert Cialdini in his 1984 book Influence. It is where people copy the actions of others. It is more prominent when people are uncertain how to behave, especially in ambiguous social situations.[56]
- Working backward heuristic: an individual assumes the problem has already been solved and works backwards to determine the steps needed to reach that solution.[46]
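As a worked sketch of the base rate example above, the following Python snippet applies Bayes' rule, assuming for illustration that "90% accuracy" means both 90% sensitivity and 90% specificity:

```python
# Base rate example: a 90%-accurate test for a disease affecting 1 in 500 people.
prevalence  = 1 / 500        # base rate of the disease
sensitivity = 0.90           # P(positive test | disease)
specificity = 0.90           # P(negative test | no disease)

# Total probability of testing positive, then Bayes' rule.
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"{p_disease_given_positive:.1%}")   # roughly 1.8%, far below the intuitive 90%
```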
Availability
In psychology, availability is the ease with which a particular idea can be brought to mind. When people estimate how likely or how frequent an event is on the basis of its availability, they are using the availability heuristic.[57] When an infrequent event can be brought easily and vividly to mind, this heuristic overestimates its likelihood. For example, people overestimate their likelihood of dying in a dramatic event such as a tornado or terrorism. Dramatic, violent deaths are usually more highly publicised and therefore have a higher availability.[58] On the other hand, common but mundane events are hard to bring to mind, so their likelihoods tend to be underestimated. These include deaths from suicides, strokes, and diabetes. This heuristic is one of the reasons why people are more easily swayed by a single, vivid story than by a large body of statistical evidence.[59] It may also play a role in the appeal of lotteries: to someone buying a ticket, the well-publicised, jubilant winners are more available than the millions of people who have won nothing.[58]
When people judge whether more English words begin with T or with K, the availability heuristic gives a quick way to answer the question. Words that begin with T come more readily to mind, and so subjects give a correct answer without counting out large numbers of words. However, this heuristic can also produce errors. When people are asked whether there are more English words with K in the first position or with K in the third position, they use the same process. It is easy to think of words that begin with K, such as kangaroo, kitchen, or kept. It is harder to think of words with K as the third letter, such as lake, or acknowledge, although objectively these are three times more common. This leads people to the incorrect conclusion that K is more common at the start of words.[14] In another experiment, subjects heard the names of many celebrities, roughly equal numbers of whom were male and female. The subjects were then asked whether the list of names included more men or more women. When the men in the list were more famous, a great majority of subjects incorrectly thought there were more of them, and vice versa for women. Tversky and Kahneman's interpretation of these results is that judgments of proportion are based on availability, which is higher for the names of better-known people.[57]
In one experiment that occurred before the 1976 U.S. Presidential election, some participants were asked to imagine Gerald Ford winning, while others did the same for a Jimmy Carter victory. Each group subsequently viewed their allocated candidate as significantly more likely to win. The researchers found a similar effect when students imagined a good or a bad season for a college football team.[60] The effect of imagination on subjective likelihood has been replicated by several other researchers.[59]
A concept's availability can be affected by how recently and how frequently it has been brought to mind. In one study, subjects were given partial sentences to complete. The words were selected to activate the concept either of hostility or of kindness: a process known as priming. They then had to interpret the behavior of a man described in a short, ambiguous story. Their interpretation was biased towards the emotion they had been primed with: the more priming, the greater the effect. A greater interval between the initial task and the judgment decreased the effect.[61]
Tversky and Kahneman offered the availability heuristic as an explanation for illusory correlations in which people wrongly judge two events to be associated with each other. They explained that people judge correlation on the basis of the ease of imagining or recalling the two events together.[14][57]
Representativeness
The representativeness heuristic is seen when people use categories, for example when deciding whether or not a person is a criminal. An individual thing has a high representativeness for a category if it is very similar to a prototype of that category. When people categorise things on the basis of representativeness, they are using the representativeness heuristic. "Representative" is here meant in two different senses: the prototype used for comparison is representative of its category, and representativeness is also a relation between that prototype and the thing being categorised.[14][62] While it is effective for some problems, this heuristic involves attending to the particular characteristics of the individual, ignoring how common those categories are in the population (called the base rates). Thus, people can overestimate the likelihood that something has a very rare property, or underestimate the likelihood of a very common property. This is called the base rate fallacy. Representativeness explains this and several other ways in which human judgments break the laws of probability.[14]
The representativeness heuristic is also an explanation of how people judge cause and effect: when they make these judgements on the basis of similarity, they are also said to be using the representativeness heuristic. This can lead to a bias, incorrectly finding causal relationships between things that resemble one another and missing them when the cause and effect are very different. Examples of this include both the belief that "emotionally relevant events ought to have emotionally relevant causes", and magical associative thinking.[63][64]
Representativeness of base rates
A 1973 experiment used a psychological profile of Tom W., a fictional graduate student.[65] One group of subjects had to rate Tom's similarity to a typical student in each of nine academic areas (including Law, Engineering and Library Science). Another group had to rate how likely it is that Tom specialised in each area. If these ratings of likelihood are governed by probability, then they should resemble the base rates, i.e. the proportion of students in each of the nine areas (which had been separately estimated by a third group). If people based their judgments on probability, they would say that Tom is more likely to study Humanities than Library Science, because there are many more Humanities students, and the additional information in the profile is vague and unreliable. Instead, the ratings of likelihood matched the ratings of similarity almost perfectly, both in this study and a similar one where subjects judged the likelihood of a fictional woman taking different careers. This suggests that rather than estimating probability using base rates, subjects had substituted the more accessible attribute of similarity.[65]
Conjunction fallacy
When people rely on representativeness, they can fall into an error which breaks a fundamental law of probability.[62] Tversky and Kahneman gave subjects a short character sketch of a woman called Linda, describing her as, "31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations". People reading this description then ranked the likelihood of different statements about Linda. Amongst others, these included "Linda is a bank teller", and, "Linda is a bank teller and is active in the feminist movement". People showed a strong tendency to rate the latter, more specific statement as more likely, even though a conjunction of the form "Linda is both X and Y" can never be more probable than the more general statement "Linda is X". The explanation in terms of heuristics is that the judgment was distorted because, for the readers, the character sketch was representative of the sort of person who might be an active feminist but not of someone who works in a bank. A similar exercise concerned Bill, described as "intelligent but unimaginative". A great majority of people reading this character sketch rated "Bill is an accountant who plays jazz for a hobby", as more likely than "Bill plays jazz for a hobby".[66]
Without success, Tversky and Kahneman used what they described as "a series of increasingly desperate manipulations" to get their subjects to recognise the logical error. In one variation, subjects had to choose between a logical explanation of why "Linda is a bank teller" is more likely, and a deliberately illogical argument which said that "Linda is a feminist bank teller" is more likely "because she resembles an active feminist more than she resembles a bank teller". Sixty-five percent of subjects found the illogical argument more convincing.[66][67] Other researchers also carried out variations of this study, exploring the possibility that people had misunderstood the question. They did not eliminate the error.[68][69] It has been shown that individuals with high Cognitive Reflection Test (CRT) scores are significantly less likely to be subject to the conjunction fallacy.[70] The error disappears when the question is posed in terms of frequencies. Everyone in these versions of the study recognised that out of 100 people fitting an outline description, the conjunction statement ("She is X and Y") cannot apply to more people than the general statement ("She is X").[71]
Ignorance of sample size
Tversky and Kahneman asked subjects to consider a problem about random variation. Imagine for simplicity that exactly half of the babies born in a hospital are male; even so, the ratio will not be exactly half in every time period. On some days, more girls will be born and on others, more boys. The question was, does the likelihood of deviating from exactly half depend on whether there are many or few births per day? It is a well-established consequence of sampling theory that proportions will vary much more day-to-day when the typical number of births per day is small. However, people's answers to the problem do not reflect this fact. They typically reply that the number of births in the hospital makes no difference to the likelihood of more than 60% male babies in one day. The explanation in terms of the heuristic is that people consider only how representative the figure of 60% is of the previously given average of 50%.[14][72]
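The sampling-theory point can be checked with a small simulation; the hospital sizes below are illustrative assumptions, not figures from the text:

```python
import random

def share_of_days_above_60(births_per_day, days=10000):
    """Fraction of simulated days on which more than 60% of the babies are
    boys, assuming each birth is male with probability 0.5."""
    count = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > 0.60:
            count += 1
    return count / days

print(share_of_days_above_60(15))   # small hospital: large deviations are fairly common
print(share_of_days_above_60(45))   # large hospital: large deviations are much rarer
```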
Dilution effect
Richard E. Nisbett and colleagues suggest that representativeness explains the dilution effect, in which irrelevant information weakens the effect of a stereotype. Subjects in one study were asked whether "Paul" or "Susan" was more likely to be assertive, given no other information than their first names. They rated Paul as more assertive, apparently basing their judgment on a gender stereotype. Another group, told that Paul's and Susan's mothers each commute to work in a bank, did not show this stereotype effect; they rated Paul and Susan as equally assertive. The explanation is that the additional information about Paul and Susan made them less representative of men or women in general, and so the subjects' expectations about men and women had a weaker effect.[73] In other words, unrelated and non-diagnostic information about an issue can weaken the influence of relevant information on judgments about that issue.[74]
Misperception of randomness
Representativeness explains systematic errors that people make when judging the probability of random events. For example, in a sequence of coin tosses, each of which comes up heads (H) or tails (T), people reliably tend to judge a clearly patterned sequence such as HHHTTT as less likely than a less patterned sequence such as HTHTTH. These sequences have exactly the same probability, but people tend to see the more clearly patterned sequences as less representative of randomness, and so less likely to result from a random process.[14][75] Tversky and Kahneman argued that this effect underlies the gambler's fallacy; a tendency to expect outcomes to even out over the short run, like expecting a roulette wheel to come up black because the last several spins came up red.[62][76] They emphasised that even experts in statistics were susceptible to this illusion: in a 1971 survey of professional psychologists, they found that respondents expected samples to be overly representative of the population they were drawn from. As a result, the psychologists systematically overestimated the statistical power of their tests, and underestimated the sample size needed for a meaningful test of their hypotheses.[14][76]
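A quick Python check that both sequences are equally probable under independent fair coin tosses:

```python
def sequence_probability(sequence, p_heads=0.5):
    """Probability of observing an exact sequence of independent coin tosses."""
    return p_heads ** sequence.count("H") * (1 - p_heads) ** sequence.count("T")

# Both are six independent tosses, so each has probability (1/2) ** 6.
print(sequence_probability("HHHTTT"))   # 0.015625
print(sequence_probability("HTHTTH"))   # 0.015625
```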
Anchoring and adjustment
Anchoring and adjustment is a heuristic used in many situations where people estimate a number.[77] According to Tversky and Kahneman's original description, it involves starting from a readily available number—the "anchor"—and shifting either up or down to reach an answer that seems plausible.[77] In Tversky and Kahneman's experiments, people did not shift far enough away from the anchor. Hence the anchor contaminates the estimate, even if it is clearly irrelevant. In one experiment, subjects watched a number being selected from a spinning "wheel of fortune". They had to say whether a given quantity was larger or smaller than that number. For instance, they might be asked, "Is the percentage of African countries which are members of the United Nations larger or smaller than 65%?" They then tried to guess the true percentage. Their answers correlated well with the arbitrary number they had been given.[77][78] Insufficient adjustment from an anchor is not the only explanation for this effect. An alternative theory is that people form their estimates on evidence which is selectively brought to mind by the anchor.[79]
The anchoring effect has been demonstrated by a wide variety of experiments both in laboratories and in the real world.[78][80] It remains when the subjects are offered money as an incentive to be accurate, or when they are explicitly told not to base their judgment on the anchor.[80] The effect is stronger when people have to make their judgments quickly.[81] Subjects in these experiments lack introspective awareness of the heuristic, denying that the anchor affected their estimates.[81]
Even when the anchor value is obviously random or extreme, it can still contaminate estimates.[80] One experiment asked subjects to estimate the year of Albert Einstein's first visit to the United States. Anchors of 1215 and 1992 contaminated the answers just as much as more sensible anchor years.[81] Other experiments asked subjects if the average temperature in San Francisco is more or less than 558 degrees, or whether there had been more or fewer than 100,025 top ten albums by The Beatles. These deliberately absurd anchors still affected estimates of the true numbers.[78]
Anchoring results in a particularly strong bias when estimates are stated in the form of a confidence interval. An example is where people predict the value of a stock market index on a particular day by defining an upper and lower bound so that they are 98% confident the true value will fall in that range. A reliable finding is that people anchor their upper and lower bounds too close to their best estimate.[14] This leads to an overconfidence effect. One much-replicated finding is that when people are 98% certain that a number is in a particular range, they are wrong about thirty to forty percent of the time.[14][82]
Anchoring also causes particular difficulty when many numbers are combined into a composite judgment. Tversky and Kahneman demonstrated this by asking a group of people to rapidly estimate the product 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1. Another group had to estimate the same product in reverse order; 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8. Both groups underestimated the answer by a wide margin, but the latter group's average estimate was significantly smaller.[83] The explanation in terms of anchoring is that people multiply the first few terms of each product and anchor on that figure.[83] A less abstract task is to estimate the probability that an aircraft will crash, given that there are numerous possible faults each with a likelihood of one in a million. A common finding from studies of these tasks is that people anchor on the small component probabilities and so underestimate the total.[83] A corresponding effect happens when people estimate the probability of multiple events happening in sequence, such as an accumulator bet in horse racing. For this kind of judgment, anchoring on the individual probabilities results in an overestimation of the combined probability.[83]
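A brief Python sketch of the arithmetic behind the multiplication demonstration: the full product is identical for both orders, but the partial product reached after the first few steps, which plausibly serves as the anchor, differs by orders of magnitude:

```python
from functools import reduce
from operator import mul

descending = [8, 7, 6, 5, 4, 3, 2, 1]
ascending  = list(reversed(descending))

# The true product is the same either way ...
print(reduce(mul, descending))        # 40320
print(reduce(mul, ascending))         # 40320

# ... but the partial product after the first few steps, a natural anchor
# for a rapid estimate, is far larger for the descending order.
print(reduce(mul, descending[:3]))    # 8 * 7 * 6 = 336
print(reduce(mul, ascending[:3]))     # 1 * 2 * 3 = 6
```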
Examples
People's valuation of goods, and the quantities they buy, respond to anchoring effects. In one experiment, people wrote down the last two digits of their social security numbers. They were then asked to consider whether they would pay this number of dollars for items whose value they did not know, such as wine, chocolate, and computer equipment. They then entered an auction to bid for these items. Those with the highest two-digit numbers submitted bids that were many times higher than those with the lowest numbers.[84][85] When a stack of soup cans in a supermarket was labelled, "Limit 12 per customer", the label influenced customers to buy more cans.[81] In another experiment, real estate agents appraised the value of houses on the basis of a tour and extensive documentation. Different agents were shown different listing prices, and these affected their valuations. For one house, the appraised value ranged from US$114,204 to $128,754.[86][87]
Anchoring and adjustment has also been shown to affect grades given to students. In one experiment, 48 teachers were given bundles of student essays, each of which had to be graded and returned. They were also given a fictional list of the students' previous grades. The mean of these grades affected the grades that teachers awarded for the essay.[88]
One study showed that anchoring affected the sentences in a fictional rape trial.[89] The subjects were trial judges with, on average, more than fifteen years of experience. They read documents including witness testimony, expert statements, the relevant penal code, and the final pleas from the prosecution and defence. The two conditions of this experiment differed in just one respect: the prosecutor demanded a 34-month sentence in one condition and 12 months in the other; there was an eight-month difference between the average sentences handed out in these two conditions.[89] In a similar mock trial, the subjects took the role of jurors in a civil case. They were either asked to award damages "in the range from $15 million to $50 million" or "in the range from $50 million to $150 million". Although the facts of the case were the same each time, jurors given the higher range decided on an award that was about three times higher. This happened even though the subjects were explicitly warned not to treat the requests as evidence.[84]
Assessments can also be influenced by the stimuli provided. In one review, researchers found that if a stimulus is perceived to be important or to carry "weight" in a situation, people were more likely to judge that stimulus as physically heavier.[90]
Affect heuristic
edit"Affect", in this context, is a feeling such as fear, pleasure or surprise. It is shorter in duration than a mood, occurring rapidly and involuntarily in response to a stimulus. While reading the words "lung cancer" might generate an affect of dread, the words "mother's love" can create an affect of affection and comfort. When people use affect ("gut responses") to judge benefits or risks, they are using the affect heuristic.[91] The affect heuristic has been used to explain why messages framed to activate emotions are more persuasive than those framed in a purely factual way.[92]
Escalation of commitment heuristic
Decision makers, whether at an organisational or national level, can come across the dilemma of whether to continue with an operation or withdraw from it. The escalation of commitment heuristic demonstrates that people often tend to lock themselves into losing courses of action in the hope that investing more resources into an operation will turn around losses.[93][94] Furthermore, escalation of commitment can be expected to occur in situations where the decision maker can claim credit for operational success, but losses and operational failure are directed at and absorbed by others, such as a larger entity.[95] Cognitive determinants that can influence escalation of commitment include self-justification, problem framing, sunk costs, goal substitution, self-efficacy, accountability, and illusion of control.[93] The general flow of events that leads to the escalation of commitment heuristic is as follows:
- A large amount of resources is invested into an operation and cannot be recovered (sunk cost).
- The operation performs poorly and provides the decision maker with negative feedback.
- The decision maker continues to pour investment into the operation with the hope of turning it around, hence reflecting the escalation of commitment heuristic.[93]
Aside from being relevant to decision makers in firms and organisations, escalation of commitment is also applicable to decisions made by national leaders. An example of this is decisions relating to further investment in wars. In a war-based scenario, the costs are predominantly borne by soldiers and taxpayers. Additionally, decision makers in war scenarios often do not have to directly or immediately bear the costs of their decisions at the same level as soldiers and taxpayers do, hence making their decision to keep investing easier. This reflects the escalation of commitment heuristic, and inevitably creates a cyclical process of reinvestment that has the potential to cause long-term issues economically, socially, and politically at both local and global scales.[95]
Others
editTheories
There are competing theories of human judgment, which differ on whether the use of heuristics is irrational. A cognitive laziness approach argues that heuristics are inevitable shortcuts given the limitations of the human brain. According to the natural assessments approach, some complex calculations are already done rapidly and automatically by the brain, and other judgments make use of these processes rather than calculating from scratch. This has led to a theory called "attribute substitution", which says that people often handle a complicated question by answering a different, related question, without being aware that this is what they are doing.[96] A third approach argues that heuristics perform just as well as more complicated decision-making procedures, but more quickly and with less information. This perspective emphasises the "fast and frugal" nature of heuristics.[97]
Cognitive laziness
An effort-reduction framework proposed by Anuj K. Shah and Daniel M. Oppenheimer states that people use a variety of techniques to reduce the effort of making decisions.[98]
Attribute substitution
In 2002 Daniel Kahneman and Shane Frederick proposed a process called attribute substitution which happens without conscious awareness. According to this theory, when somebody makes a judgment (of a target attribute) which is computationally complex, a rather more easily calculated heuristic attribute is substituted.[99] In effect, a difficult problem is dealt with by answering a rather simpler problem, without the person being aware this is happening.[96] This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.[96][99][100]
This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place.[96][99]
In 1975, psychologist Stanley Smith Stevens proposed that the strength of a stimulus (e.g. the brightness of a light, the severity of a crime) is encoded by brain cells in a way that is independent of modality. Kahneman and Frederick built on this idea, arguing that the target attribute and heuristic attribute could be very different in nature.[96]
[P]eople are not accustomed to thinking hard, and are often content to trust a plausible judgment that comes to mind.
Daniel Kahneman, American Economic Review 93 (5) December 2003, p. 1450[100]
Kahneman and Frederick propose three conditions for attribute substitution:[96]
- The target attribute is relatively inaccessible. Substitution is not expected to take place in answering factual questions that can be retrieved directly from memory ("What is your birthday?") or that concern current experience ("Do you feel thirsty now?").
- An associated attribute is highly accessible. This might be because it is evaluated automatically in normal perception or because it has been primed. For example, someone who has been thinking about their love life and is then asked how happy they are might substitute how happy they are with their love life rather than consider other areas.
- The substitution is not detected and corrected by the reflective system. For example, when asked "A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?" many subjects incorrectly answer $0.10, rather than the correct $0.05.[100] An explanation in terms of attribute substitution is that, rather than working out the sum, subjects parse the sum of $1.10 into a large amount and a small amount, which is easy to do. Whether they feel that is the right answer will depend on whether they check the calculation with their reflective system.
Kahneman gives an example where some Americans were offered insurance against their own death in a terrorist attack while on a trip to Europe, while another group were offered insurance that would cover death of any kind on the trip. Even though "death of any kind" includes "death in a terrorist attack", the former group were willing to pay more than the latter. Kahneman suggests that the attribute of fear is being substituted for a calculation of the total risks of travel.[101] Fear of terrorism for these subjects was stronger than a general fear of dying on a foreign trip.
Fast and frugal
Gerd Gigerenzer and colleagues have argued that heuristics can be used to make judgments that are accurate rather than biased. According to them, heuristics are "fast and frugal" alternatives to more complicated procedures, giving answers that are just as good.[102]
Consequences
editEfficient decision heuristics
Warren Thorngate, a social psychologist, implemented ten simple decision rules or heuristics in a computer program. He determined how often each heuristic selected alternatives with highest-through-lowest expected value in a series of randomly generated decision situations. He found that most of the simulated heuristics selected alternatives with highest expected value and almost never selected alternatives with lowest expected value.[103]
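A hedged Python sketch of this kind of simulation: it compares one simple rule in the spirit of those Thorngate studied (an "equiprobable" rule that ignores outcome probabilities and picks the alternative with the highest mean payoff) against full expected-value maximization over randomly generated decision situations; all set-up choices are illustrative:

```python
import random

def expected_values(payoffs, probabilities):
    """Expected value of each alternative given the outcome probabilities."""
    return [sum(p * x for p, x in zip(probabilities, row)) for row in payoffs]

def equiprobable_choice(payoffs):
    """Simple rule: ignore the probabilities and pick the alternative with the
    highest mean payoff."""
    means = [sum(row) / len(row) for row in payoffs]
    return means.index(max(means))

hits, trials = 0, 10000
for _ in range(trials):
    # three alternatives, each with payoffs over three possible outcomes
    payoffs = [[random.random() for _ in range(3)] for _ in range(3)]
    weights = [random.random() for _ in range(3)]
    probabilities = [w / sum(weights) for w in weights]
    ev = expected_values(payoffs, probabilities)
    hits += equiprobable_choice(payoffs) == ev.index(max(ev))

print(hits / trials)   # the simple rule frequently agrees with EV maximization
```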
"Beautiful-is-familiar" effect
Psychologist Benoît Monin reports a series of experiments in which subjects, looking at photographs of faces, have to judge whether they have seen those faces before. It is repeatedly found that attractive faces are more likely to be mistakenly labeled as familiar.[104] Monin interprets this result in terms of attribute substitution. The heuristic attribute in this case is a "warm glow"; a positive feeling towards someone that might either be due to their being familiar or being attractive. This interpretation has been criticised, because not all the variance in familiarity is accounted for by the attractiveness of the photograph.[98]
Judgments of morality and fairness
Legal scholar Cass Sunstein has argued that attribute substitution is pervasive when people reason about moral, political or legal matters.[105] Given a difficult, novel problem in these areas, people search for a more familiar, related problem (a "prototypical case") and apply its solution as the solution to the harder problem. According to Sunstein, the opinions of trusted political or religious authorities can serve as heuristic attributes when people are asked their own opinions on a matter. Another source of heuristic attributes is emotion: people's moral opinions on sensitive subjects like sexuality and human cloning may be driven by reactions such as disgust, rather than by reasoned principles.[106] Sunstein has been challenged as not providing enough evidence that attribute substitution, rather than other processes, is at work in these cases.[98]
Persuasion
An example of how persuasion plays a role in heuristic processing can be explained through the heuristic-systematic model.[107] This model explains that there are often two ways we process information from persuasive messages: heuristically and systematically. Heuristic processing means making a quick judgment as a shortcut in decision making. Systematic processing, on the other hand, involves more analytical and inquisitive cognitive thinking; individuals look beyond their own prior knowledge for answers.[108][109] An example of this model is watching an advertisement for a specific medication. Someone without prior knowledge sees the person in proper pharmaceutical attire and assumes that they know what they are talking about; that spokesperson automatically has more credibility, and the viewer is more likely to trust the content of the message they deliver. Someone who works in that field or already has prior knowledge of the medication, by contrast, will not be persuaded by the ad because of their systematic way of thinking. This was formally demonstrated in an experiment conducted by Chaiken and Maheswaran (1994).[110] In addition to these examples, the fluency heuristic ties in closely with the topic of persuasion. It describes how we make "the most of an automatic by-product of retrieval from memory".[111] An example would be a friend asking about good books to read.[112] Many could come to mind, but you name the first book recalled from memory, and because it was the first thought, you value it as better than any other book you could suggest. The effort heuristic is almost identical to fluency; the one distinction is that objects that take longer to produce are seen as having more value. One may conclude that a glass vase is more valuable than a drawing merely because the vase may take longer to make. These two varieties of heuristics illustrate how easily we may be influenced by our mental shortcuts, or by whatever comes quickest to mind.[113]
See also
- Behavioral economics – Academic discipline
- Bounded rationality – Making of satisfactory, not optimal, decisions
- Debiasing – Reduction of bias
- Ecological rationality
- Great Rationality Debate – Question of whether humans are rational or not
- Intuitive statistics – cognitive phenomenon where organisms use data to make generalizations and predictions about the world
- List of cognitive biases – Systematic patterns of deviation from norm or rationality in judgment
- List of memory biases – Systematic patterns of deviation from norm or rationality in judgment
- Low information voter – Voters who are poorly informed
Citations
- ^ Marsh, Barnaby (1 January 2002). "Do Animals Use Heuristics?". Journal of Bioeconomics. 4 (1): 49–56. doi:10.1023/A:1020655022163. ISSN 1573-6989. S2CID 142852213.
- ^ a b Gigerenzer, Gerd; Brighton, Henry (2009). "Homo Heuristicus: Why Biased Minds Make Better Inferences". Topics in Cognitive Science. 1 (1): 107–143. doi:10.1111/j.1756-8765.2008.01006.x. hdl:11858/00-001M-0000-0024-F678-0. ISSN 1756-8765. PMID 25164802.
- ^ Hutchinson, John M. C.; Gigerenzer, Gerd (31 May 2005). "Simple heuristics and rules of thumb: Where psychologists and behavioural biologists might meet". Behavioural Processes. Proceedings of the meeting of the Society for the Quantitative Analyses of Behavior (SQAB 2004). 69 (2): 97–124. doi:10.1016/j.beproc.2005.02.019. ISSN 0376-6357. PMID 15845293. S2CID 785187.
- ^ a b Gigerenzer, Gerd; Gaissmaier, Wolfgang (2011). "Heuristic Decision Making". Annual Review of Psychology. 62 (1): 451–482. doi:10.1146/annurev-psych-120709-145346. hdl:11858/00-001M-0000-0024-F16D-5. PMID 21126183.
- ^ Braun, T.D.; Siegal, H.J.; Beck, N.; Boloni, L.L.; Maheswaran, M.; Reuther, A.I.; Robertson, J.P.; Theys, M.D.; Bin Yao; Hensgen, D.; Freund, R.F. (1999). "A comparison study of static mapping heuristics for a class of meta-tasks on heterogeneous computing systems". Proceedings. Eighth Heterogeneous Computing Workshop (HCW'99). IEEE Comput. Soc. pp. 15–29. doi:10.1109/hcw.1999.765093. hdl:10945/35227. ISBN 0-7695-0107-9. S2CID 2860157.
- ^ Alan, Lewis (2018). The Cambridge Handbook of Psychology and Economic Behavior. Cambridge University Press. p. 43. ISBN 978-0-521-85665-2.
- ^ Harris, Lori A. (2005). CliffsAP Psychology. John Wiley & Sons. p. 65. ISBN 978-0-7645-7316-3.
- ^ Nevid, Jeffery (2008). Psychology: Concepts and Applications. Cengage Learning. p. 251. ISBN 978-0-547-14814-4.
- ^ Goldstein, E. Bruce (23 July 2018). Cognitive psychology: Connecting mind, research, and everyday experience. Cengage Learning. ISBN 978-1-337-40827-1. OCLC 1055681278.
- ^ Scholz, Roland W. (1983). "Introduction to Decision Making under Uncertainty: Biases, Fallacies and the Development of Decision Making". In R. W. Scholz (ed.). Decision Making under Uncertainty: Cognitive Decision Research, Social Interaction, Development and Epistemology. Advances in Psychlogy, 16. Elsevier. pp. 3–18. ISBN 978-0-08-086670-3.
- ^ Madsen, Thomas (1 September 2018). "The Conception of Laziness and the Characterisation of Others as Lazy". Human Arenas. 1 (3): 288–304. doi:10.1007/s42087-018-0018-6. ISSN 2522-5804.
- ^ a b Goldstein, Daniel G.; Gigerenzer, Gerd (2002). "Models of ecological rationality: The recognition heuristic". Psychological Review. 109 (1): 75–90. doi:10.1037/0033-295x.109.1.75. hdl:11858/00-001M-0000-0025-9128-B. ISSN 0033-295X. PMID 11863042.
- ^ Simon, Herbert A. (1989). The Scientist as Problem Solver (Report). Pittsburgh, PA, United States: Artificial Intelligence and Psychology Project, Carnegie-Mellon University. doi:10.21236/ada240569. Technical report AIP-3. (PDF file direct download – via Defense Technical Information Center.)
- ^ a b c d e f g h i j Tversky & Kahneman 1974.
- ^ Fiedler, Klaus; von Sydow, Momme (2015). "Heuristics and Biases: Beyond Tversky and Kahneman's (1974) Judgment under Uncertainty" (PDF). In Eysenck, Michael W.; Groome, David (eds.). Cognitive Psychology: Revising the Classical Studies. Sage, London. pp. 146–161. ISBN 978-1-4462-9447-5. Archived from the original (PDF) on 15 December 2017. Retrieved 14 August 2015.
- ^ Simple Heuristics that Make Us Smart. Evolution and Cognition. Oxford, New York: Oxford University Press. 1999. ISBN 9780195143812.
- ^ Gigerenzer, Gerd; Hertwig, Ralph; Pachur, Thorsten, eds. (2011). "Part III – Heuristics in the Wild". Heuristics: The Foundations of Adaptive Behavior. Oxford University Press. doi:10.1093/acprof:oso/9780199744282.001.0001. hdl:11858/00-001M-0000-0024-F172-8. ISBN 978-0-19-989472-7.
- ^ Brighton, Henry; Gigerenzer, Gerd (January 2011). "Towards Competitive Instead of Biased Testing of Heuristics: A Reply to Hilbig and Richter (2011)". Topics in Cognitive Science. 3 (1): 197–205. doi:10.1111/j.1756-8765.2010.01124.x. hdl:11858/00-001M-0000-0024-F0FF-3. PMID 25164182.
- ^ Simon, Herbert A. (1955). "A Behavioral Model of Rational Choice". The Quarterly Journal of Economics. 69 (1): 99–118. doi:10.2307/1884852. ISSN 0033-5533. JSTOR 1884852.
- ^ Berg, Nathan (2014). "Success from satisficing and imitation: Entrepreneurs' location choice and implications of heuristics for local economic development" (PDF). Journal of Business Research. 67 (8): 1700–1709. doi:10.1016/j.jbusres.2014.02.016. ISSN 0148-2963.
- ^ Gigerenzer, Gerd; Artinger, Florian M. (2016). Heuristic Pricing in an Uncertain Market: Ecological and Constructivist Rationality (Report). SSRN 2938702 – via SSRN.
- ^ a b c d Tversky, Amos (1972). "Elimination by aspects: A theory of choice". Psychological Review. 79 (4): 281–299. doi:10.1037/h0032955. ISSN 0033-295X.
- ^ Maxwell, Andrew L.; Jeffrey, Scott A.; Lévesque, Moren (March 2011). "Business angel early stage decision making". Journal of Business Venturing. 26 (2): 212–225. doi:10.1016/j.jbusvent.2009.09.002.
- ^ Daniel, Aemiro Melkamu; Persson, Lars; Sandorf, Erlend Dancke (June 2018). "Accounting for elimination-by-aspects strategies and demand management in electricity contract choice". Energy Economics. 73: 80–90. Bibcode:2018EneEc..73...80D. doi:10.1016/j.eneco.2018.05.009.
- ^ Serwe, Sascha; Frings, Christian (2006). "Who will win Wimbledon? The recognition heuristic in predicting sports events". Journal of Behavioral Decision Making. 19 (4): 321–332. doi:10.1002/bdm.530. ISSN 0894-3257.
- ^ Scheibehenne, Benjamin; Bröder, Arndt (2007). "Predicting Wimbledon 2005 tennis results by mere player name recognition". International Journal of Forecasting. 23 (3): 415–426. doi:10.1016/j.ijforecast.2007.05.006. ISSN 0169-2070.
- ^ Gigerenzer, Gerd (2011). "The recognition heuristic: A decade of research". Judgment and Decision Making. 6: 100–121. doi:10.1017/S1930297500002126. hdl:11858/00-001M-0000-0024-F105-B. S2CID 1291701.
- ^ Gigerenzer, G.; Goldstein, D. G. (1996). "Reasoning the fast and frugal way: Models of bounded rationality". Psychological Review. 103 (4): 650–669. doi:10.1037/0033-295X.103.4.650. hdl:21.11116/0000-0000-B771-2. PMID 8888650.
- ^ Czerlinski, J.; Gigerenzer, G.; Goldstein, D. G. (1999), "How good are simple heuristics?", in G. Gigerenzer; P. M. Todd; ABC Research Group (eds.), Simple heuristics that make us smart, New York: Oxford University Press, pp. 97–118
- ^ Pachur, T.; Marinello, G. (2013). "Expert intuitions: How to model the decision strategies of airport customs officers ?". Acta Psychologica. 144 (1): 97–103. doi:10.1016/j.actpsy.2013.05.003. PMID 23787151.
- ^ Bergert, F. Bryan; Nosofsky, Robert M. (2007). "A response-time approach to comparing generalized rational and take-the-best models of decision making". Journal of Experimental Psychology: Learning, Memory, and Cognition. 33 (1): 107–129. doi:10.1037/0278-7393.33.1.107. PMID 17201556.
- ^ Bröder, A. (2012), "The quest for take-the-best", in P. M. Todd; G. Gigerenzer; ABC Research Group (eds.), Ecological rationality: Intelligence in the world, New York: Oxford University Press, pp. 216–240
- ^ Gigerenzer, G. (2016), "Towards a rational theory of heuristics", in R. Frantz; L. Marsh (eds.), Minds, models, and milieux: Commemorating the centennial of the birth of Herbert Simon, New York: Palgrave Macmillan, pp. 34–59
- ^ Martignon L, Vitouch O, Takezawa M, Forster M (2003). "Naïve and yet enlightened: From natural frequencies to fast and frugal decision trees". In Hardman D, Macchi L (eds.). Thinking: Psychological perspectives on reasoning, judgment, and decision making. pp. 189–211.
- ^ Green, L.; Mehr, D.R. (1997). "What alters physicians' decisions to admit to the coronary care unit?". The Journal of Family Practice. 45 (3): 219–226. PMID 9300001.
- ^ Keller, Niklas; Katsikopoulos, Konstantinos V. (2016). "On the role of psychological heuristics in operational research; and a demonstration in military stability operations". European Journal of Operational Research. 249 (3): 1063–1073. doi:10.1016/j.ejor.2015.07.023.
- ^ Dhami, Mandeep K. (2003). "Psychological Models of Professional Decision Making". Psychological Science. 14 (2): 175–180. doi:10.1111/1467-9280.01438. PMID 12661681. S2CID 16129660.
- ^ Bodemer, Nicolai; Hanoch, Yaniv; Katsikopoulos, Konstantinos V. (2015). "Heuristics: foundations for a novel approach to medical decision making". Internal and Emergency Medicine. 10 (2): 195–203. doi:10.1007/s11739-014-1143-y. ISSN 1828-0447. PMID 25348561. S2CID 1245264.
- ^ Bobadilla-Suarez, Sebastian; Love, Bradley C. (2018). "Fast or frugal, but not both: Decision heuristics under time pressure". Journal of Experimental Psychology: Learning, Memory, and Cognition. 44 (1): 24–33. doi:10.1037/xlm0000419. ISSN 1939-1285. PMC 5708146. PMID 28557503.
- ^ Hastie & Dawes 2009, pp. 210–211
- ^ Slovic, Paul; Finucane, Melissa L.; Peters, Ellen; MacGregor, Donald G. (March 2007). "The affect heuristic". European Journal of Operational Research. 177 (3): 1333–1352. doi:10.1016/j.ejor.2005.04.006. ISSN 0377-2217. S2CID 1941040.
- ^ Smith, H. (1999). "Use of the anchoring and adjustment heuristic by children". Current Psychology. 18 (3): 294–300. doi:10.1007/s12144-999-1004-4. S2CID 144901306.
- ^ a b Harvey, N (2007). "Use of heuristics: Insights from forecasting research". Thinking & Reasoning. 13 (1): 5–24. doi:10.1080/13546780600872502. S2CID 62523068.
- ^ a b Ross, Derek (2012). "Ambiguous Weighting and Nonsensical Sense: The Problems of 'Balance' and 'Common Sense' as Commonplace Concepts and Decision-making Heuristics in Environmental Rhetoric". Social Epistemology. 26: 115–144. doi:10.1080/02691728.2011.634530. S2CID 145239368.
- ^ Cheng, Yin-Hui; Chang, Shin-Shin; Chuang, Shih-Chieh; Yu, Ming-Wei (July 2012). "The impact of purchase quantity on the compromise effect: The balance heuristic". Judgment and Decision Making. 7 (4): 499–512. doi:10.1017/S1930297500002837. ISSN 1930-2975.
- ^ a b Dale, Stephen (29 July 2018). "Heuristics and Biases – The Science Of Decision Making". The Future Of Work. Retrieved 25 April 2021.
- ^ Rozin, Paul; Nemeroff, Carol (8 July 2002). "Sympathetic Magical Thinking: The Contagion and Similarity "Heuristics"". Heuristics and Biases. Cambridge University Press. pp. 201–216. doi:10.1017/cbo9780511808098.013. ISBN 978-0-521-79260-8.
- ^ Bateman, Hazel (2017). "Default and naive diversification heuristics in annuity choice". Australian Journal of Management. 42: 32–57. doi:10.1177/0312896215617225. S2CID 220081277.
- ^ Nadeau, Richard (1995). "Educated Guesses: The Process of Answering Factual Knowledge Questions in Surveys". Public Opinion Quarterly. 59 (3): 323–346. doi:10.1086/269480.
- ^ van Dijke, Marius (2010). "Trust in authorities as a boundary condition to procedural fairness effects on tax compliance". Journal of Economic Psychology. 31: 80–91. doi:10.1016/j.joep.2009.10.005.
- ^ Park, C. Whan; Lessig, V. Parker (September 1981). "Familiarity and Its Impact on Consumer Decision Biases and Heuristics". Journal of Consumer Research. 8 (2): 223. doi:10.1086/208859. hdl:1808/10100. ISSN 0093-5301.
- ^ Kahneman, Daniel; Fredrickson, Barbara L.; Schreiber, Charles A.; Redelmeier, Donald A. (1993). "When more pain is preferred to less: Adding a better end". Psychological Science. 4 (6): 401–405. doi:10.1111/j.1467-9280.1993.tb00589.x. S2CID 8032668.
- ^ Kahneman, Slovic & Tversky 1982.
- ^ Lynn, Michael (March 1992). "The Psychology of Unavailability: Explaining Scarcity and Cost Effects on Value". Basic and Applied Social Psychology. 13 (1): 3–7. doi:10.1207/s15324834basp1301_2. hdl:1813/71653. ISSN 0197-3533. S2CID 131769856.
- ^ Kahneman, Daniel; Tversky, Amos (15 May 1981). "Variants of Uncertainty". Cognition. 11 (2). Fort Belvoir, VA: 143–157. doi:10.21236/ada099503. PMID 7198958.
- ^ Cialdini, Robert B.; Wosinska, Wilhelmina; Barrett, Daniel W.; Butner, Jonathan; Gornik-Durose, Malgorzata (October 1999). "Compliance with a Request in Two Cultures: The Differential Influence of Social Proof and Commitment/Consistency on Collectivists and Individualists". Personality and Social Psychology Bulletin. 25 (10): 1242–1253. doi:10.1177/0146167299258006. ISSN 0146-1672. S2CID 143225569.
- ^ a b c Tversky, Amos; Kahneman, Daniel (1973). "Availability: A Heuristic for Judging Frequency and Probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9. ISSN 0010-0285. S2CID 41668623.
- ^ a b Sutherland 2007, pp. 16–17
- ^ a b Plous 1993, pp. 123–124
- ^ Carroll, J. (1978). "The Effect of Imagining an Event on Expectations for the Event: An Interpretation in Terms of the Availability Heuristic". Journal of Experimental Social Psychology. 14 (1): 88–96. doi:10.1016/0022-1031(78)90062-8. ISSN 0022-1031.
- ^ Srull, Thomas K.; Wyer, Robert S. (1979). "The Role of Category Accessibility in the Interpretation of Information About Persons: Some Determinants and Implications". Journal of Personality and Social Psychology. 37 (10): 1660–1672. CiteSeerX 10.1.1.335.4255. doi:10.1037/0022-3514.37.10.1660. ISSN 0022-3514.
- ^ a b c Plous 1993, pp. 109–120
- ^ Nisbett, Richard E.; Ross, Lee (1980). Human inference: strategies and shortcomings of social judgment. Englewood Cliffs, NJ: Prentice-Hall. pp. 115–118. ISBN 9780134450735.
- ^ Gilovich, Thomas; Savitsky, Kenneth (8 July 2002). "Like Goes with Like: The Role of Representativeness in Erroneous and Pseudo-Scientific Beliefs". In Gilovich, Thomas; Griffin, Dale; Kahneman, Daniel (eds.). Heuristics and Biases (1st ed.). Cambridge University Press. pp. 617–624. doi:10.1017/cbo9780511808098.036. ISBN 978-0-521-79260-8.
- ^ a b Kahneman, Daniel; Amos Tversky (July 1973). "On the Psychology of Prediction". Psychological Review. 80 (4): 237–251. doi:10.1037/h0034747. ISSN 0033-295X.
- ^ a b Tversky, Amos; Kahneman, Daniel (1983), "Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment", Psychological Review, 90 (4): 293–315, doi:10.1037/0033-295X.90.4.293 reprinted in Gilovich, Griffin & Kahneman (2002), pp. 19–48.
- ^ Poundstone 2010, p. 89
- ^ Tentori, K.; Bonini, N.; Osherson, D. (1 May 2004). "The conjunction fallacy: a misunderstanding about conjunction?". Cognitive Science. 28 (3): 467–477. doi:10.1016/j.cogsci.2004.01.001.
- ^ Moro, Rodrigo (29 July 2008). "On the nature of the conjunction fallacy". Synthese. 171 (1): 1–24. doi:10.1007/s11229-008-9377-8. hdl:11336/69232. S2CID 207244869.
- ^ Oechssler, Jörg; Roider, Andreas; Schmitz, Patrick W. (2009). "Cognitive abilities and behavioral biases" (PDF). Journal of Economic Behavior & Organization. 72 (1): 147–152. doi:10.1016/j.jebo.2009.04.018. ISSN 0167-2681.
- ^ Gigerenzer, Gerd (1991). "How to make cognitive illusions disappear: Beyond "heuristics and biases"". European Review of Social Psychology. 2: 83–115. CiteSeerX 10.1.1.336.9826. doi:10.1080/14792779143000033.
- ^ Kunda 1999, pp. 70–71
- ^ Kunda 1999, pp. 68–70
- ^ Zukier, Henry (1982). "The dilution effect: The role of the correlation and the dispersion of predictor variables in the use of nondiagnostic information". Journal of Personality and Social Psychology. 43 (6): 1163–1174. doi:10.1037/0022-3514.43.6.1163.
- ^ Kunda 1999, pp. 71–72
- ^ a b Tversky, Amos; Kahneman, Daniel (1971), "Belief in the law of small numbers", Psychological Bulletin, 76 (2): 105–110, CiteSeerX 10.1.1.592.3838, doi:10.1037/h0031322, S2CID 5883140, reprinted in Kahneman, Slovic & Tversky (1982), pp. 23–31.
- ^ a b c Baron 2000, p. 235?
- ^ a b c Plous 1993, pp. 145–146
- ^ Koehler & Harvey 2004, p. 99
- ^ a b c Mussweiler, Englich & Strack 2004, pp. 185–186, 197
- ^ a b c d Yudkowsky 2011, pp. 102–103
- ^ Lichtenstein, Sarah; Fischhoff, Baruch; Phillips, Lawrence D. "Calibration of probabilities: The state of the art to 1980". In Kahneman, Slovic & Tversky (1982), pp. 306–334.
- ^ a b c d Sutherland 2007, pp. 168–170
- ^ a b Hastie & Dawes 2009, pp. 78–80
- ^ George Loewenstein (2007), Exotic Preferences: Behavioral Economics and Human Motivation, Oxford University Press, pp. 284–285, ISBN 9780199257072
- ^ Mussweiler, Englich & Strack 2004, p. 188
- ^ Plous 1993, pp. 148–149
- ^ Caverni, Jean-Paul; Péris, Jean-Luc (1990), "The Anchoring-Adjustment Heuristic in an 'Information-Rich, Real World Setting': Knowledge Assessment by Experts", in Caverni, Jean-Paul; Fabré, Jean-Marc; González, Michel (eds.), Cognitive biases, Elsevier, pp. 35–45, ISBN 9780444884138
- ^ a b Mussweiler, Englich & Strack 2004, p. 183
- ^ Rabelo, A. L.; Keller, V. N.; Pilati, R.; Wicherts, J. M. (2015). "No effect of weight on judgments of importance in the moral domain and evidence of publication bias from a meta-analysis". PLOS ONE. 10 (8): e0134808. Bibcode:2015PLoSO..1034808R. doi:10.1371/journal.pone.0134808. PMC 4524628. PMID 26241042.
- ^ Finucane, M.L.; Alhakami, A.; Slovic, P.; Johnson, S.M. (January 2000). "The Affect Heuristic in Judgment of Risks and Benefits". Journal of Behavioral Decision Making. 13 (1): 1–17. CiteSeerX 10.1.1.390.6802. doi:10.1002/(SICI)1099-0771(200001/03)13:1<1::AID-BDM333>3.0.CO;2-S.
- ^ Keller, Carmen; Siegrist, Michael; Gutscher, Heinz (June 2006). "The Role of Affect and Availability Heuristics in Risk Analysis". Risk Analysis. 26 (3): 631–639. CiteSeerX 10.1.1.456.4562. doi:10.1111/j.1539-6924.2006.00773.x. PMID 16834623. S2CID 16773932.
- ^ a b c Wong, Kin Fai Ellick; Yik, Michelle; Kwong, Jessica Y. Y. (2006). "Understanding the emotional aspects of escalation of commitment: The role of negative affect". Journal of Applied Psychology. 91 (2): 282–297. doi:10.1037/0021-9010.91.2.282. ISSN 1939-1854. PMID 16551184.
- ^ Wieber, Frank; Thürmer, J. Lukas; Gollwitzer, Peter M. (2015). "Attenuating the Escalation of Commitment to a Faltering Project in Decision-Making Groups: An Implementation Intention Approach". Social Psychological and Personality Science. 6 (5): 587–595. doi:10.1177/1948550614568158. ISSN 1948-5506. S2CID 10919912.
- ^ a b Michailova, Snejina (2022). "An Attempt to Understand the War in Ukraine – An Escalation of Commitment Perspective". British Journal of Management. 33 (4): 1673–1677. doi:10.1111/1467-8551.12633. hdl:2292/65232. ISSN 1045-3172. S2CID 250064139.
- ^ a b c d e f Kahneman, Daniel; Frederick, Shane. "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Gilovich, Griffin & Kahneman (2002), pp. 49–81.
- ^ Hardman 2009, pp. 13–16
- ^ a b c Shah, Anuj K.; Daniel M. Oppenheimer (March 2008). "Heuristics Made Easy: An Effort-Reduction Framework". Psychological Bulletin. 134 (2): 207–222. doi:10.1037/0033-2909.134.2.207. ISSN 1939-1455. PMID 18298269.
- ^ a b c Newell, Benjamin R.; David A. Lagnado; David R. Shanks (2007). Straight choices: the psychology of decision making. Routledge. pp. 71–74. ISBN 9781841695884.
- ^ a b c Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics" (PDF). American Economic Review. 93 (5): 1449–1475. CiteSeerX 10.1.1.194.6554. doi:10.1257/000282803322655392. ISSN 0002-8282. S2CID 15131441. Archived from the original (PDF) on 19 February 2018. Retrieved 15 November 2014.
- ^ Kahneman, Daniel (2007). "Short Course in Thinking About Thinking". Edge.org. Edge Foundation. Retrieved 3 June 2009.
- ^ Gerd Gigerenzer, Peter M. Todd, and the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK, Oxford University Press. ISBN 0-19-514381-7
- ^ Thorngate, Warren (1980). "Efficient decision heuristics". Behavioral Science. 25 (3): 219–225. doi:10.1002/bs.3830250306.
- ^ Monin, Benoît; Daniel M. Oppenheimer (2005). "Correlated Averages vs. Averaged Correlations: Demonstrating the Warm Glow Heuristic Beyond Aggregation" (PDF). Social Cognition. 23 (3): 257–278. doi:10.1521/soco.2005.23.3.257. ISSN 0278-016X. Archived from the original (PDF) on 27 May 2016. Retrieved 9 July 2010.
- ^ Sunstein, Cass R. (2005). "Moral heuristics". Behavioral and Brain Sciences. 28 (4): 531–542. doi:10.1017/S0140525X05000099. ISSN 0140-525X. PMID 16209802. S2CID 231738548.
- ^ Sunstein, Cass R. (2009). "Some Effects of Moral Indignation on Law" (PDF). Vermont Law Review. 33 (3). Vermont Law School: 405–434. SSRN 1401432. Archived from the original (PDF) on 29 November 2014. Retrieved 15 September 2009.
- ^ Todorov, Alexander; Chaiken, Shelly; Henderson, Marlone D. (2002). "The Heuristic-Systematic Model of Social Information Processing". The Persuasion Handbook: Developments in Theory and Practice. SAGE Publications, Inc. pp. 195–212. doi:10.4135/9781412976046.n11. ISBN 978-0-7619-2006-9.
- ^ Chen, Serena; Duckworth, Kimberly; Chaiken, Shelly (January 1999). "Motivated Heuristic and Systematic Processing". Psychological Inquiry. 10 (1): 44–49. doi:10.1207/s15327965pli1001_6. ISSN 1047-840X.
- ^ Chaiken, Shelly (1980). "Heuristic versus systematic information processing and the use of source versus message cues in persuasion". Journal of Personality and Social Psychology. 39 (5): 752–766. doi:10.1037/0022-3514.39.5.752. ISSN 1939-1315. S2CID 39212150.
- ^ Chaiken, Shelly; Ledgerwood, Alison (2007). "Dual Process Theories". Encyclopedia of Social Psychology. SAGE Publications. doi:10.4135/9781412956253.n164. ISBN 978-1-4129-1670-7.
- ^ Hertwig, Ralph; Herzog, Stefan M.; Schooler, Lael J.; Reimer, Torsten (2008). "Fluency heuristic: A model of how the mind exploits a by-product of information retrieval". Journal of Experimental Psychology: Learning, Memory, and Cognition. 34 (5): 1191–1206. doi:10.1037/a0013025. hdl:11858/00-001M-0000-0024-FC25-9. ISSN 1939-1285. PMID 18763900.
- ^ Yocco, Victor (2 July 2015). "Think Fast! Using Heuristics To Increase Use Of Your Product". Smashing Magazine. Retrieved 10 April 2020.
- ^ Bower, Gordon H. (1984). The psychology of learning and motivation: Advances in research and theory. Volume 18. Orlando: Academic Press. ISBN 978-0-08-086369-6. OCLC 646758779.
References
- Baron, Jonathan (2000), Thinking and deciding (3rd ed.), New York: Cambridge University Press, ISBN 978-0521650304, OCLC 316403966
- Gilovich, Thomas; Griffin, Dale W.; Kahneman, Daniel, eds. (2002), Heuristics and Biases: The Psychology of Intuitive Judgment, Cambridge University Press, ISBN 9780521796798, OCLC 47364085
- Hardman, David (2009), Judgment and decision making: psychological perspectives, Wiley-Blackwell, ISBN 9781405123983
- Hastie, Reid; Dawes, Robyn M. (29 September 2009), Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making, SAGE, ISBN 9781412959032
- Koehler, Derek J.; Harvey, Nigel (2004), Blackwell handbook of judgment and decision making, Wiley-Blackwell, ISBN 9781405107464
- Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press. ISBN 9780521284141.
- Kunda, Ziva (1999), Social Cognition: Making Sense of People, MIT Press, ISBN 978-0-262-61143-5, OCLC 40618974
- Mussweiler, Thomas; Englich, Birte; Strack, Fritz (2004), "Anchoring effect", in Pohl, Rüdiger F. (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, Hove, UK: Psychology Press, pp. 183–200, ISBN 9781841693514, OCLC 55124398
- Plous, Scott (1993), The Psychology of Judgment and Decision Making, McGraw-Hill, ISBN 9780070504776, OCLC 26931106
- Poundstone, William (2010), Priceless: the myth of fair value (and how to take advantage of it), Hill and Wang, ISBN 9780809094691
- Sutherland, Stuart (2007), Irrationality (2nd ed.), London: Pinter and Martin, ISBN 9781905177073, OCLC 72151566
- Tversky, Amos; Kahneman, Daniel (1974), "Judgments Under Uncertainty: Heuristics and Biases" (PDF), Science, 185 (4157): 1124–1131, Bibcode:1974Sci...185.1124T, doi:10.1126/science.185.4157.1124, PMID 17835457, S2CID 143452957, archived from the original (PDF) on 28 May 2019, retrieved 13 November 2014, reprinted in Kahneman, Slovic & Tversky (1982), pp. 3–20.
- Yudkowsky, Eliezer (2011). "Cognitive biases potentially affecting judgment of global risks". In Bostrom, Nick; Cirkovic, Milan M. (eds.). Global Catastrophic Risks. OUP Oxford. pp. 91–119. ISBN 978-0-19-960650-4.
Further reading
- Gilovich, Thomas; Griffin, Dale W. "Introduction – Heuristics and Biases: Then and Now". In Gilovich, Griffin & Kahneman (2002), pp. 1–18.
- Slovic, Paul; Melissa Finucane; Ellen Peters; Donald G. MacGregor. "The Affect Heuristic". In Gilovich, Griffin & Kahneman (2002), pp. 397–420.
- Reber, Rolf (2004), "Availability", in Pohl, Rüdiger F. (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, Hove, UK: Psychology Press, pp. 147–163, ISBN 9781841693514, OCLC 55124398
- Teigen, Karl Halvor (2004), "Judgements by representativeness", in Pohl, Rüdiger F. (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, Hove, UK: Psychology Press, pp. 165–182, ISBN 9781841693514, OCLC 55124398
- Gigerenzer, Gerd; Selten, Reinhard (2001). Bounded rationality: The adaptive toolbox. Cambridge, MA: MIT Press. ISBN 0585388288. OCLC 49569412.
- Korteling, Johan E.; Brouwer, Anne-Marie; Toet, Alexander (3 September 2018). "A Neural Network Framework for Cognitive Bias". Frontiers in Psychology. 9: 1561. doi:10.3389/fpsyg.2018.01561. PMC 6129743. PMID 30233451.
- Chow, Sheldon (20 April 2011). "Heuristics, Concepts, and Cognitive Architecture: Toward Understanding How The Mind Works". Electronic Thesis and Dissertation Repository.
- Todd, P. M. (2001). "Heuristics for Decision and Choice". International Encyclopedia of the Social & Behavioral Sciences. pp. 6676–6679. doi:10.1016/B0-08-043076-7/00629-X. ISBN 978-0-08-043076-8.
External links
edit- "Test Yourself: Decision Making and the Availability Heuristic". Annenberg Learner. Annenberg Foundation