
Assignment

Common Decision Making Errors and Biases


Name: Sabah Afzal
Class: 1-A
Roll no: 77136
Reg no: 0051
Subject: Management
Submitted to: Sir Farooq Qaiser

Common Decision Making Errors and Biases


Decision makers engage in bounded rationality, but an accumulating body of research tells us that they also allow systematic biases and errors to creep into their judgments. These arise from attempts to shortcut the decision process. To minimise effort and avoid difficult trade-offs, people tend to rely too heavily on experience, impulses, gut feelings, and convenient rules of thumb. In many instances these shortcuts are helpful; however, they can lead to severe distortions from rationality. The following is an extensive list of biases:

Social Biases:
Attributional biases:
Attributional biases typically take the form of actor/observer differences: people involved in an action (actors) view things differently from people not involved (observers).

Actor-observer bias:
The tendency for explanations of other individuals' behaviours to overemphasise the influence of their personality and underemphasise the influence of their situation. This is coupled with the opposite tendency for the self: one's explanations of one's own behaviours overemphasise the situation and underemphasise the influence of personality.

Dunning-Kruger effect:
When people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realise it. Instead, they are left with the mistaken impression that they are doing just fine.

Egocentric bias:
Occurs when people claim more responsibility for the results of a joint action than an outside observer would credit them with.

Forer effect:
The tendency to give high accuracy ratings to descriptions of one's personality that are supposedly tailored specifically for the individual but are in fact vague and general enough to apply to a wide range of people; horoscopes are a common example.

False consensus effect:
The tendency for people to overestimate the degree to which others agree with them.

Fundamental attribution error:
The tendency for people to over-emphasise personality-based explanations for behaviours observed in others while under-emphasising the role and power of situational influences on the same behaviour.

Halo effect:
The tendency for a person's positive or negative traits to spill over from one area of their personality to another in others' perceptions of them.

Herd instinct:
Common tendency to adopt the opinions and follow the behaviours of the majority to feel safer and to avoid conflict.

Illusion of asymmetric insight:
People perceive their knowledge of their peers to surpass their peers' knowledge of them.

Illusion of transparency:
People overestimate others' ability to know them, and they also overestimate their own ability to know others.

Self-serving bias:
The tendency to claim more responsibility for successes than for failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.

Bias blind spot:
The tendency not to compensate for one's own cognitive biases.

Choice-supportive bias:
The tendency to remember one's choices as better than they actually were.

Confirmation bias:
The tendency to search for or interpret information in a way that confirms one's preconceptions.

Congruence bias:
The tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.

Contrast effect:
The enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.

Déformation professionnelle:
The tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.

Endowment effect:
The fact that people often demand much more to give up an object than they would be willing to pay to acquire it.

Extreme aversion:
Most people will go to great lengths to avoid extremes. People are more likely to choose an option if it is the intermediate choice.

Focusing effect:
A prediction bias that occurs when people place too much importance on one aspect of an event, causing errors in accurately predicting the utility of a future outcome.

Framing:
Using an overly narrow approach or description of the situation or issue.

Illusion of control:
The tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.

Impact bias:
The tendency for people to overestimate the length or the intensity of the impact of future feeling states.

Information bias:
The tendency to seek information even when it cannot affect action.

Irrational escalation:
The tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.

Loss aversion:
The disutility of giving up an object is greater than the utility associated with acquiring it.

Neglect of probability:
The tendency to completely disregard probability when making a decision under uncertainty.

Mere exposure effect:
The tendency for people to express undue liking for things merely because they are familiar with them.

Omission bias:
The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).

Planning fallacy:
The tendency to underestimate task-completion times.

Post-purchase rationalisation:
The tendency to persuade oneself through rational argument that a purchase was a good value.

Pseudo-certainty effect:
The tendency to make risk-averse choices if the expected outcome is positive, but risk-seeking choices to avoid negative outcomes.
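
A hypothetical illustration: offered a sure gain of $50 or a 50% chance of winning $100, most people take the sure $50; offered a sure loss of $50 or a 50% chance of losing $100, most people prefer to gamble, even though each pair has the same expected value (0.5 × 100 = 50).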

Selective perception:
The tendency for expectations to affect perception.

Status quo bias:
The tendency for people to like things to stay relatively the same.

Unit bias:
The tendency to want to finish a given unit of a task or an item, with particularly strong effects on the consumption of food.

Von Restorff effect:
The tendency for an item that stands out like a sore thumb to be more likely to be remembered than other items.

Zero-risk bias:
Preference for reducing a small risk to zero over a greater reduction in a larger risk.
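
A hypothetical illustration: option A reduces a 1% risk to 0% (removing one percentage point of risk), while option B reduces a separate 20% risk to 10% (removing ten percentage points). Zero-risk bias is the tendency to prefer A because it eliminates a risk completely, even though B removes ten times as much risk overall.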

Biases in probability and belief:


Ambiguity effect:
The avoidance of options for which missing information makes the probability seem unknown.

Anchoring:
The tendency to rely too heavily, or anchor, on a past reference or on one trait or piece of information when making decisions.

Anthropic bias:
The tendency for one's evidence to be biased by observation selection effects.

Attentional bias:
Neglect of relevant data when making judgments of a correlation or association.

Availability heuristic:
A biased prediction due to the tendency to judge likelihood by how easily examples come to mind, focusing on the most salient and emotionally charged outcomes.

Clustering illusion:
The tendency to see patterns where none actually exist.

Conjunction fallacy:
The tendency to assume that specific conditions are more probable than general ones.
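
Formally, for any two events A and B, P(A and B) ≤ P(A), because the joint event can occur only when A occurs. As a hypothetical illustration, if there is a 30% chance that a project is delivered late, the chance that it is delivered late and goes over budget cannot exceed 30%, yet the more detailed scenario often feels more plausible.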

Gambler's fallacy:
The tendency to assume that individual random events are influenced by previous random events. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming up on the sixth flip is much greater than heads."
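
Because each flip is independent, the streak has no effect on the next outcome. The following sketch (a minimal Python simulation added for illustration, not part of the original material) counts how often tails follows five consecutive heads; the proportion stays close to 0.5:

import random

random.seed(1)  # fixed seed so the illustration is repeatable
flips = [random.choice("HT") for _ in range(1_000_000)]

streaks = 0
tails_after_streak = 0
for i in range(5, len(flips)):
    if flips[i - 5:i] == list("HHHHH"):  # five heads in a row just before flip i
        streaks += 1
        if flips[i] == "T":
            tails_after_streak += 1

# prints a value near 0.5: past flips do not change the odds of the next one
print(tails_after_streak / streaks)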

Illusory correlation:
Beliefs that inaccurately suppose a relationship between a certain type of action and an effect.

Ludic fallacy:
The analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many real-world quantities.

Neglect of prior base rates effect:
The tendency to fail to incorporate prior known probabilities which are pertinent to the decision at hand.
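
A worked example shows why the prior matters. The numbers below are hypothetical, chosen only for illustration: a condition with a 1% base rate, a test that detects it 95% of the time, and a 5% false-positive rate. Bayes' rule gives the probability of actually having the condition after a positive test (a minimal Python sketch, not from the original material):

prior = 0.01           # base rate: 1% of people have the condition (hypothetical)
sensitivity = 0.95     # P(positive test | condition)
false_positive = 0.05  # P(positive test | no condition)

# Bayes' rule: P(condition | positive test)
p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive

print(round(posterior, 3))  # about 0.161, far lower than the 0.95 many people guess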

Observer-expectancy effect:
When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it.

Optimism bias:
The systematic tendency to be over-optimistic about the outcome of planned actions.

Overconfidence effect:
The tendency to overestimate ones own abilities.

Positive outcome bias:
A tendency in prediction to overestimate the probability of good things happening to oneself.

Primacy effect:
The tendency to weigh initial events more than subsequent events.

Recency effect:
The tendency to weigh recent events more than earlier events.

Reminiscence bump:
The effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.

Rosy retrospection:
The tendency to rate past events more positively than one actually rated them when the events occurred.

Subadditivity effect:
The tendency to judge the probability of the whole to be less than the sum of the probabilities of its parts.
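
A hypothetical illustration: someone might judge the probability of dying from "any natural cause" to be 60%, yet separately judge heart disease at 30%, cancer at 25%, and other natural causes at 20%, which sum to 75%; the whole is judged less probable than its parts taken together.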

Telescoping effect:
The effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.

Texas sharpshooter fallacy:
The fallacy of selecting or adjusting a hypothesis after the data are collected, making it impossible to test the hypothesis fairly.

Hawthorne effect:
Refers to a phenomenon which is thought to occur when people observed during a research study temporarily change their behaviour or performance.

List of Memory Biases:


Beneffectance:
Perceiving oneself as responsible for desirable outcomes but not responsible for undesirable ones.

Choice-supportive bias:
Remembering chosen options as having been better than rejected options.

Hindsight bias:
The inclination to see past events as being predictable; also called the I-knew-it-all-along effect.

Infantile amnesia:
The retention of few memories from before the age of two years.

Zeigarnik effect:
That uncompleted or interrupted tasks are remembered better than completed ones.

Von Restorff effect:
That an item that sticks out is more likely to be remembered than other items.

Suffix effect:
The weakening of the recency effect when an item that the subject is not required to recall is appended to the list.

Suggestibility:
A form of misattribution where ideas suggested by a questioner are mistaken for memory.

Telescoping effect:
The tendency to displace recent events backward in time and remote events forward in time, so that recent events appear to be more remote, and remote events, more recent.

Primacy effect:
That the first items on a list show an advantage in memory.

Modality effect:
That memory recall is higher for the last items of a list when the list items were received via speech than when they were received via writing.

Mood congruent memory bias:
The improved recall of information congruent with one's current mood.

Detail and Examples of Bias:

The word "bias" also has meanings outside psychology. In finance, for example, it refers to the indication offered by the Federal Reserve as to whether it is likely to increase or decrease the federal funds interest rate during its next meeting; a negative bias indicates the Fed is likely to lower the federal funds rate.

1) Confirmation Bias:
The rational decision making process assumes that we objectively gather information. But we don't. We selectively gather information. The confirmation bias represents a specific case of selective perception. We seek out information that reaffirms our past choices, and we discount information that contradicts past judgments. We also tend to accept at face value information that confirms our preconceived views, while being critical and sceptical of information that challenges these views. Therefore, the information we gather is typically biased toward supporting views we already hold. This confirmation bias influences where we go to collect evidence, because we tend to seek out places that are more likely to tell us what we want to hear. It also leads us to give too much weight to supporting information and too little to contradictory information. Consider a study conducted by Peter Cathcart Wason. Wason showed participants a triplet of numbers (2, 4, 6) and asked them to guess the rule that the pattern followed. Participants could then offer test triplets to see if their rule held. From this starting point, most participants picked specific rules such as "goes up by 2" or "1x, 2x, 3x". By only guessing triplets that fit their rule, they failed to realise that the actual rule was "any three ascending numbers". A simple test triplet of "3, 15, 317" would have invalidated their theories.
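
The logic of the 2-4-6 task can be sketched in a few lines of Python (an illustration added here, not part of Wason's study). Triplets that fit the narrow guess "goes up by 2" also fit the true rule "any three ascending numbers", so testing only confirming triplets can never expose the error, while a triplet such as (3, 15, 317) separates the two rules immediately:

def true_rule(a, b, c):
    # Wason's actual rule: any three ascending numbers
    return a < b < c

def guessed_rule(a, b, c):
    # a typical participant's narrower hypothesis: "goes up by 2"
    return b == a + 2 and c == b + 2

for triplet in [(2, 4, 6), (10, 12, 14), (1, 3, 5)]:
    # both rules answer True, so these tests cannot tell them apart
    print(triplet, true_rule(*triplet), guessed_rule(*triplet))

# only a triplet outside the guessed rule reveals the difference: True vs False
print((3, 15, 317), true_rule(3, 15, 317), guessed_rule(3, 15, 317))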

2) Hindsight Bias:
Hindsight bias is a term used in psychology to describe the tendency of people to overestimate their ability to have predicted an outcome that could not possibly have been predicted. In essence, hindsight bias is like saying "I knew it!" when an outcome (either expected or unexpected) occurs, and believing that one actually predicted it correctly.
1. You are nervous about taking an exam for which you waited until the very last minute to study. When you take the exam you feel unsure about the results; however, when your grade comes back a B+, you exclaim to your friends, "I was sure I'd aced that exam!" and actually believe it in hindsight.
2. On a snowy night, a police officer predicts that it is the perfect condition for a teenage driver to get into a fender-bender. When the police scanner reports that a driver who had just received her license skidded into a mailbox, the officer says he had been certain such a thing would happen on that night, of all nights.

3) Overconfidence:
Overconfidence can cause a person to experience problems because he may not prepare properly for a situation, or may get into a dangerous situation that he is not equipped to handle.
1. A person who thinks he has a photographic memory and a detailed understanding of a subject might show his overconfidence by deciding not to study for a test on that subject, and then do poorly on the test due to lack of preparation.
2. A person who thinks he is invaluable to his employer, when almost anyone could actually do his job, might show his overconfidence by coming in late to work because he thinks he is never going to get fired, or by being overly demanding about getting a raise and threatening to quit if he doesn't get his way.

4) Anchoring Bias:
Basing a judgment on a familiar reference point that is incomplete or irrelevant to the problem being solved.
1. A consumer judges the relative value of a product on the basis of its cost in some previous period. An investor judges a stock price as overvalued or undervalued based on the stock's previous high price.
2. A person looking to buy a used car may focus excessively on the odometer reading and the year of the car, and use those criteria as a basis for evaluating its value, rather than considering how well the engine or the transmission is maintained.

5) Fundamental Attribution Error:
Mistaking personality and character traits for differences caused by situations. A classic study demonstrating this had participants rate speakers who were speaking for or against Fidel Castro. Even when the participants were told that the position of the speaker had been determined by a coin toss, they rated the attitudes of the speaker as being closer to the side they were forced to speak on. Studies have shown that it is difficult to out-think these cognitive biases: even when participants in different studies were warned about bias beforehand, this had little impact on their ability to see past it. What an understanding of biases can do is allow you to design decision making methods and procedures so that biases can be circumvented. Researchers use double-blind studies to prevent bias from contaminating results. By making adjustments to your decision making, problem solving and learning patterns, you can try to reduce their effects.

6) Self-serving bias:
A self-serving action is one done only for one's own benefit, sometimes at the expense of others. In decision making it appears as the tendency to claim more responsibility for successes than for failures. For example, a lie told to make yourself look better.

7) Representation:
The representativeness bias is the tendency to judge the likelihood of an event by how closely it resembles a familiar case or stereotype rather than by its actual probability. For example, assuming that a quiet, detail-oriented stranger is more likely to be a librarian than a salesperson because they fit the stereotype of a librarian.

8) Availability:
The availability bias is the tendency to base judgments on information that is readily available: examples that come to mind easily because they are recent, vivid, or emotionally charged. For example, overestimating the risk of air crashes because they receive heavy news coverage, while underestimating far more common risks that are rarely reported.

9) Selective perception:
The tendency for expectations to affect perception. For example, two fans watching the same match will each notice more fouls committed by the opposing team, because each sees what they expect to see.

10) Immediate gratification:
The immediate gratification bias is the tendency to prefer choices that provide a reward right now and to avoid costs that must be paid right now, even when waiting would lead to a better outcome. For example, choosing a small payoff today over a larger payoff next month.

11) Forward bias:
In electronics, forward bias is a voltage that brings a transistor or tube into, or closer to, its conductive state. For example, if the gate requires positive voltage to conduct, forward biasing adds positive voltage.

12) Reverse bias:
In electronics, reverse bias holds the device in a non-conductive state until the sum of the control voltage and the bias is sufficient to bring it to the conductive state. For example, if the gate requires positive voltage to conduct, reverse biasing adds negative voltage.

13) Response bias:
The influence on a respondent's answer of what he or she believes the questioner wants to hear. In some cases a question may be poorly worded so as to favour a particular response. For example: "With the recent increases in the cost of living, are you satisfied with the candidate's proposal to raise taxes?"

14) Interviewer bias:
Intentional or unintentional partiality of an interviewer that affects the response of the person being interviewed. For example, the tone of an interviewer's voice or the look on the interviewer's face may influence a response.

15) Congruence bias:
The tendency to test a hypothesis only by direct testing, looking for results that would confirm it, rather than also testing possible alternative hypotheses. For example, a researcher who only runs experiments expected to agree with their theory, instead of looking for cases that could disprove it.

16) Impact bias:
The tendency to overestimate the length or the intensity of the impact of future feeling states. For example, people expect that failing an exam or losing a job will leave them miserable for far longer than it actually does.

17) Omission bias:
The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions). For example, neglecting to mention to your husband how much your new outfit cost may feel less wrong than actively lying about the price, even though the effect is the same.

18) Outcome:
The outcome bias is the tendency to judge a decision by its eventual result rather than by the quality of the decision at the time it was made. For example, a risky play is praised as a good call when the team wins the game 2-1, even though the same decision would be criticised had the team lost.

19) Unit bias:
The tendency to want to finish a given unit of a task or item, with particularly strong effects on the consumption of food. For example, people tend to treat one plate or one package as the "right" amount to eat, regardless of how large the portion is.

20) Optimism:
The optimism bias is the systematic tendency to be over-optimistic about the outcome of planned actions. For example, remaining cheerful and confident that a troubled project will turn out right in the end, despite clear evidence to the contrary.

21) Egocentric:
The egocentric bias is the tendency to claim more responsibility for the results of a joint action than an outside observer would credit you with, and to view events mainly from your own point of view. For example, each member of a group project may feel they did most of the work, or a host may plan a dinner party menu around his own favourites without considering the needs of his guests.

22) Projection:
The projection bias is the tendency to unconsciously assume that others share your current thoughts, feelings, and values. For example, seeing a sad person while you are sad and assuming they are sad for the same reasons, or assuming that colleagues want the same things from a project that you do.

23) Trait ascription bias:
The tendency to view yourself as relatively variable in personality, behaviour and mood, while viewing other people as far more predictable in their traits. For example, assuming that politicians will always exaggerate, while treating your own exaggerations as one-off responses to circumstances.

THANK YOU
