
Colors and learning. White allows better concentration. Red generates mental activity and associative abundance.

Yellow stimulates and sustains vigilance and increases the capacity to concentrate attention. Green facilitates imagination. Blue favors inhibition and a slowing of the rhythm of activity (P. Muresan).

Psychology is the study of the human mind and of mental processes in relation to human behavior and human nature. Because of the subject it treats, psychology is not considered an empirical science, although psychologists run experiments and publish their results in specialist journals.


For example, one of the theories that has been verified by experiment (though this was not its main purpose) is attribution theory. Some of the experiments psychologists have conducted over the years reveal things about the way we humans think and behave that we might not want to embrace, but which can at least help keep us humble. That's something.

1. 'Lord of the Flies': Social Identity Theory

The Robbers Cave Experiment is a classic social psychology experiment conducted with two groups of 11-year-old boys at a state park in Oklahoma, and it demonstrates just how easily an exclusive group identity is adopted and how quickly a group can degenerate into prejudice and antagonism toward outsiders. Researcher Muzafer Sherif actually conducted a series of 3 experiments. In the first, the groups banded together to gang up on a common enemy. In the second, the groups banded together to gang up on the researchers! By the third and final experiment, the researchers managed to turn the groups on each other.

2. The Stanford Prison Experiment: Power Corrupts

This infamous experiment to plumb the depths of evil in human hearts ended up affecting its lead researcher as much as its subjects. Psychologist Philip Zimbardo divided his participants into two groups labeled "prisoners" and "guards." It was conducted in a mock-up prison in a Stanford University basement. The prisoners were subjected to arrest, strip search, de-lousing, head shaving and other abuses. The guards were given clubs. The prisoners rebelled on the second day, and the reaction of the guards was swift and brutal. Before long, the prisoners were behaving meekly and with blind obedience, while the guards fully embraced their roles by taunting and abusing their charges. This one might be scientific confirmation of the idea that humans harbor evil tendencies. The planned 14-day experiment was halted after only 6 days due to increasing levels of abuse.

3. Obedience to Authority: Human Capacity for Cruelty

In 1963 psychologist Stanley Milgram set out to test people's propensity to obey authority when ordered to hurt another person. The world was still wondering what had happened in Germany during World War II to cause so much horror. Milgram's subjects were told they were to be the 'teachers' of a 'learner' (who was secretly in on the experiment). They were to deliver electric shocks to the 'learner' if he or she got an answer wrong. Worse, they were told to increase the shock each time the 'learner' got another answer wrong. Despite the screams and moans of pain from the unseen 'learner', the subjects continued to deliver ever more severe shocks when ordered to do so by the experimenter in the lab coat. They continued even when told they had rendered the 'learner' unconscious! The conclusion? It looks like we humans are quite easily able to set aside moral and ethical considerations when ordered by authority to violate them.

4. Conformity: Not Believing Your Lying Eyes

From social identity theory psychologists got a handle on group dynamics and prejudice, and on how natural it is for groups to elicit conformity among their members. In 1951 Solomon Asch set out to identify just how much individual judgment is affected by the group. In a test environment in which undergrads were asked to render a judgment after other participants (actually confederates) gave deliberately wrong answers, about 75% of subjects gave the same wrong answer at least once when their turn came. Only 25% of subjects refused ever to be swayed by the false judgment of the others, while about 5% always went with the crowd. Overall, on roughly a third of trials people ignored what they knew to be true and went with a falsehood, simply because the group insisted the falsehood was true. What else will people do under the influence of the group?

5. Lying to Ourselves: Cognitive Dissonance

One might begin to suspect that people must be pretty good at either ignoring their own feelings, beliefs and desires, or flat out lying to themselves (and getting away with it). The ground-breaking social psychology experiment of Festinger and Carlsmith (1959) was designed with level upon level of deceit to see just how much a person will ignore their own experience, even to the point of helping to convince someone else of something they know is not true. The human capacity for sustaining cognitive dissonance has since been confirmed in many other well-designed experiments. This capacity is closely linked with our desire to join and fit in with a group, adjusting our own values and beliefs to align with those of others. Perhaps, knowing about these propensities, we can learn to avoid believing our own lies too much. Understanding the experiment sheds a brilliant light on the dark world of our inner motivations, and it provides a central insight into the stories we tell ourselves about why we think and behave the way we do. The experiment is filled with ingenious deception, so the best way to understand it is to imagine you are taking part. So sit back, relax and travel back. The time is 1959 and you are an undergraduate student at Stanford University...

As part of your course you agree to take part in an experiment on 'measures of performance'. You are told the experiment will take two hours. As you are required to act as an experimental subject for a certain number of hours in a year - this will be two more of them out of the way. Little do you know, the experiment will actually become a classic in social psychology. And what will seem to you like accidents by the experimenters are all part of a carefully controlled deception. For now though, you are innocent.

The set-up
Once in the lab you are told the experiment is about how your expectations affect the actual experience of a task. Apparently there are two groups, and the other group has been given a particular expectation about the study. To instil the expectation subtly, the participants in the other group are informally briefed by a student who has apparently just completed the task. In your group, though, you'll do the task with no expectations. Perhaps you wonder why you're being told all this, but nevertheless it makes things seem a bit more exciting now that you know some of the mechanics behind the experiment.

So you settle down to the first task you are given, and quickly realise it is extremely boring. You are asked to move some spools around in a box for half an hour, then for the next half an hour you move pegs around a board. Frankly, watching paint dry would have been preferable. At the end of the tasks the experimenter thanks you for taking part, then tells you that many other people find the task pretty interesting. This is a little confusing - the task was very boring. Whatever. You let it pass.

Experimental slip-up
Then the experimenter looks a little embarrassed and starts to explain haltingly that there's been a cock-up. He says they need your help. The participant coming in after you is in the other condition they mentioned before you did the task - the condition in which they have an expectation before carrying out the task. This expectation is that the task is actually really interesting. Unfortunately the person who usually sets up their expectation hasn't turned up. So, they ask if you wouldn't mind doing it. Not only that but they offer to pay you $1. Because it's 1959 and you're a student this is not completely insignificant for only a few minutes work. And, they tell you that they can use you again in the future. It sounds like easy money so you agree to take part. This is great - what started out as a simple fulfilment of a course component has unearthed a little ready cash for you. You are quickly introduced to the next participant who is about to do the same task you just completed. As instructed you tell her that the task she's about to do is really interesting. She smiles, thanks you and disappears off into the test room. You feel a pang of regret for getting her hopes up. Then the experimenter returns, thanks you again, and once again tells you that many people enjoy the task and hopes you found it interesting.

Then you are ushered through to another room where you are interviewed about the experiment you've just done. One of the questions asks how interesting the task you were given was. This makes you pause for a minute and think. Now it seems to you that the task wasn't as boring as you first thought. You start to see how even the repetitive movements of the spools and pegs had a certain symmetrical beauty. And it was all in the name of science, after all. This was a worthwhile endeavour and you hope the experimenters get some interesting results out of it. The task still couldn't be classified as great fun, but perhaps it wasn't that bad. You figure that, on reflection, it wasn't as bad as you first thought. You rate it moderately interesting. After the experiment you go and talk to your friend, who was also doing the experiment. Comparing notes, you find that your experiences were almost identical except for one vital difference: she was offered way more than you to brief the next student - $20! This is when it first occurs to you that there's been some trickery at work here. You ask her about the task with the spools and pegs. "Oh," she replies, "that was sooooo boring, I gave it the lowest rating possible." "No," you insist, "it wasn't that bad. Actually, when you think about it, it was pretty interesting." She looks at you incredulously. What the hell is going on?

Cognitive dissonance
What you've just experienced is the power of cognitive dissonance. Social psychologists studying cognitive dissonance are interested in the way we deal with two thoughts that contradict each other. In this case: you thought the task was boring to start off with, then you were paid to tell someone else the task was interesting. But you're not the kind of person to casually go around lying to people. So how can you resolve your view of yourself as an honest person with lying to the next participant? The amount of money you were paid hardly salves your conscience - it was nice but not that nice. Your mind resolves this conundrum by deciding that actually the study was pretty interesting after all. You are helped to this conclusion by the experimenter, who tells you other people also thought the study was pretty interesting. Your friend, meanwhile, has no need of these mental machinations. She merely thinks to herself: I've been paid $20 to lie, that's a small fortune for a student like me, and it more than justifies my fibbing. The task was boring and still is boring, whatever the experimenter tells me.

A beautiful theory
Since this experiment numerous studies of cognitive dissonance have been carried out and the effect is well-established. Its beauty is that it explains so many of our everyday behaviours. Here are some examples provided by Morton Hunt in his classic work 'The Story of Psychology':
When trying to join a group, the harder they make the barriers to entry, the more you value your membership. To resolve the dissonance between the hoops you were forced to jump through and the reality of what turns out to be a pretty average club, you convince yourself the club is, in fact, fantastic.

People will interpret the same information in radically different ways to support their own views of the world. When deciding our view on a contentious point, we conveniently forget what jars with our own theory and remember everything that fits.

People quickly adjust their values to fit their behavior, even when it is clearly immoral. Those stealing from their employer will claim that "everyone does it", so they would be losing out if they didn't, or alternatively that "I'm underpaid, so I deserve a little extra on the side."

Once you start to think about it, the list of situations in which people resolve cognitive dissonance through rationalisations becomes ever longer. If you're honest with yourself, I'm sure you can think of many times when you've done it yourself. I know I can. Being aware of this can help us avoid falling foul of the most dangerous consequence of cognitive dissonance: believing our own lies. You can read Festinger and Carlsmith's entire report at Classics in the History of Psychology.

Reference: Festinger, L., & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. Journal of Abnormal and Social Psychology, 58, 203-210.

6. Memory Manipulation: Do You Really Know What You Saw?

In 1974 researchers designed an experiment to test the reliability of memory, and whether it could be manipulated after the fact. Forty-five people watched a film of a car accident. Nine of those people were then asked to estimate how fast the cars were going when they "hit." Four other groups were asked an almost identical question, but the word "hit" was replaced with "smashed," "collided," "bumped" or "contacted." Those whose questions included the word "smashed" estimated the cars were going about 10 mph faster than those whose word was "contacted." A week later participants were asked about broken glass (indicative of a more serious accident), and those whose trigger words were more forceful said they remembered broken glass even though the film had depicted none. It looks like something as subtle as a single descriptive word can manipulate memories of an event!

7. Magic Memory Number: 7

Psychologist George Miller wrote in 1956 that he was "persecuted" by the number 7, which kept intruding on his mind while he contemplated data or read journals. Sometimes it was slightly higher, sometimes slightly lower, but it always hovered around 7. Miller theorized that this 'magic' number, plus or minus 2, represents the number of items we are able to hold in short-term memory at any given time. More recent studies have demonstrated that people can 'group' items in short-term memory - thereby holding more individual items - yet even then, if the groupings are counted as units, the total still comes out to 7, plus or minus 2. Maybe this is why human cultural belief systems have historically considered the number 7 especially important to the gods!
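Miller's chunking point can be made concrete with a toy sketch (the `chunk` helper and the digit string below are illustrative inventions, not from Miller's paper): twelve raw digits overflow the 7 ± 2 span, but regrouped into familiar four-digit years they collapse to just three units.

```python
def chunk(seq, size):
    """Group a flat sequence into fixed-size chunks."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

digits = "149217761945"   # 12 separate digits: beyond the 7 +/- 2 span
years = chunk(digits, 4)  # ['1492', '1776', '1945']: only 3 meaningful units

print(len(digits), "raw items ->", len(years), "chunks")
```

The saving comes from the chunks being meaningful: three familiar years are far easier to hold than twelve unrelated digits.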

8. Anatomy of Mass Panic: War of the Worlds

Orson Welles broadcast an adaptation of H.G. Wells' War of the Worlds on radio in 1938, causing panic among nearly 3 million of the 6 million people who listened to the broadcast. Princeton psychologists later interviewed 135 New Jersey residents about their reactions. A surprising number of frightened people never bothered to check the validity of the broadcast, and some highly educated individuals believed it was true just because it was on the radio and thus "authoritative." We like to think we're more sophisticated today and wouldn't fall for such an obvious dramatization, but don't be too sure. Media manipulation of our emotions and desires is a regular art form these days. Just ask Madison Avenue!

9. The Bargaining Table: Threats Don't Work

Luckily, the behavior of individuals is both less deceptive and less violent than the behavioral 'norms' of groups. In diplomacy among individuals and groups, people attempt to get the concessions they want or need from others, usually without having to give up too much in exchange. In 1962 researchers Morton Deutsch and Robert Krauss tested two factors involved in the crafting of agreements between humans: communication and threats. This complicated economic experiment found that cooperative relationships between bargainers are more beneficial to both parties than threats, whether unilateral or bilateral. Not exactly a rousing endorsement of capitalistic winner-take-all competition, but in view of the current economic situation perhaps the results of this experiment should be kept in mind as we craft a recovery!

10. Risky Behavior: Prospect Theory

Speaking of the economy, researchers Daniel Kahneman and Amos Tversky studied decision-making under risk; two findings closely tied to their work are the status quo bias and loss aversion. The status quo bias is a cognitive bias for the status quo; in other words, people tend not to change an established behavior unless the incentive to change is compelling. It should be distinguished from a rational preference for the status quo per se due to, for example, information effects, which cannot explain all the experimental results. The finding has been observed in many fields, including political science and economics. Nick Bostrom argues that status quo bias may play a large role in opposition to human enhancement. Kahneman, Thaler and Knetsch created experiments that could produce this effect reliably. They attribute it to a combination of loss aversion and the endowment effect, two ideas relevant to prospect theory.

Loss aversion
In economics and decision theory, loss aversion refers to people's tendency to strongly prefer avoiding losses to acquiring gains. Some studies suggest that losses are twice as powerful, psychologically, as gains. Loss aversion was first convincingly demonstrated by Amos Tversky and Daniel Kahneman. Because people prefer avoiding losses to making gains, loss aversion leads to risk aversion when people evaluate a possible gain; this explains the curvilinear shape of the prospect theory utility graph in the positive domain. Conversely, people strongly prefer risks that might possibly mitigate a loss (called risk-seeking behavior). Loss aversion may also explain sunk cost effects. Loss aversion implies that one who loses $100 will lose more satisfaction than another person will gain from a $100 windfall. In marketing, the use of trial periods and rebates tries to take advantage of the buyer's tendency to value the good more once he incorporates it into the status quo. Note that whether a transaction is framed as a loss or as a gain is very important to this calculation: would you rather get a $5 discount, or avoid a $5 surcharge? The same change in price framed differently has a significant effect on consumer behavior. Traditional economists consider this "endowment effect", and all other effects of loss aversion, to be completely irrational; that is why loss aversion is so important to the fields of marketing and behavioral finance. The effect of loss aversion in a marketing setting was demonstrated in a study of consumer reaction to price changes to insurance policies. The study found price increases had twice the effect on customer switching as price decreases.
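The "losses are roughly twice as powerful as gains" claim is usually formalised as the prospect theory value function. The sketch below is only an illustration: the function name is mine, and the parameters (alpha ≈ 0.88, loss-aversion factor lambda ≈ 2.25) are the estimates commonly cited from Tversky and Kahneman's later work, not figures from the insurance study above.

```python
def value(x, alpha=0.88, lam=2.25):
    """Illustrative prospect theory value function: concave for gains,
    and steeper for losses by the loss-aversion factor lam."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = value(100)    # subjective value of winning $100
loss = value(-100)   # subjective value of losing $100

# The $100 loss looms more than twice as large as the $100 gain.
print(round(gain, 1), round(loss, 1))
```

With these parameters the loss weighs exactly 2.25 times the equivalent gain, matching the "about twice as powerful" finding quoted above.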

Kahneman and Tversky studied decision-making in risky situations and developed a theory about it that garnered a Nobel Prize and has been used to build predictive economic models and shape marketing campaigns. It turns out that it's all about framing: people behave differently depending on how a situation is presented. If it is considered in terms of losses, people are more likely to take risks; they are less likely to take a risk if the same situation is presented in terms of what they stand to gain. This seems strangely opposite to what we'd tend to guess, so it's something to bear in mind next time you're trying to bluff at the poker table.
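The framing asymmetry can be sketched with an illustrative prospect-theory-style value function (assumed parameters, not figures from any experiment described here): a sure $50 is preferred to a 50% chance of $100, yet a 50% chance of losing $100 is preferred to a sure $50 loss.

```python
def value(x, alpha=0.88, lam=2.25):
    """Illustrative prospect theory value function."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Gain frame: sure $50 vs. a 50% chance of $100.
sure_gain = value(50)
risky_gain = 0.5 * value(100)

# Loss frame: sure -$50 vs. a 50% chance of -$100.
sure_loss = value(-50)
risky_loss = 0.5 * value(-100)

print("gain frame, take the sure thing?", sure_gain > risky_gain)  # risk averse
print("loss frame, take the gamble?", risky_loss > sure_loss)      # risk seeking
```

Both comparisons fall out of the curve's shape alone: concavity over gains makes the sure thing attractive, while the mirrored curve over losses makes the gamble the lesser evil.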