Diffusion of Responsibility Is a Phenomenon


Do emotion and our instinct to conform to society and authority affect our ability to think critically? View the following video on the Milgram experiment: http://youtu.be/W147ybOdgpE

Why did about 65% of the subjects, normal everyday Americans, deliver what they believed were lethal electric shocks (the victim was really only an actor pretending to be electrocuted)?

To be successful in this assignment, evaluate the following social biases (write one (1) page in Microsoft Word or WordPad):

Evaluate the Milgram experiment from the perspective of group pressure and conformity.

1. Using Chapter 4 of the textbook, describe how group pressure and conformity affected the outcome of the experiment. Your answer should be about two paragraphs (4-5 sentences each) in length.

2. For each example, include at least one quote (citation) from the book that supports your answer.


Evaluate the Milgram experiment from the perspective of diffusion of responsibility.

3. Using Chapter 4 of the textbook, describe how diffusion of responsibility affected the outcome of the experiment. Your answer should be about two paragraphs (4-5 sentences each) in length.

4. For each example, include at least one quote (citation) from the book that supports your answer.


*Be sure to include citations from the textbook using the following format:

"Quote from the book" (Boss, 2010. Pg #)

Use MS Word or WordPad to complete your assignment.

Your teacher wants to know how conformity affected the outcome of the experiment, supported by quotations from your text. They then want you to explain how diffusion of responsibility affected the outcome, again supported by quotes from the text.

Diffusion of responsibility should not be confused with the self-serving bias, in which we take credit for our successes but blame others for our failures. Diffusion of responsibility is a distinct phenomenon: it occurs in groups of people above a certain size, where responsibility is not explicitly assigned to particular individuals. In such groups, people tend not to regard those responsibilities as their concern and instead assume that they belong to someone else.

While the specifics of whom we assign to the "out" group is learned, our brain seems to be wired to see the world in terms of "one of us/one of them." Group pressure and the urge to conform are so strong in humans that they can cause us to deny evidence that is right before our eyes. (Boss, 2010. Pg #120)

In Chapter 4 of the e-book, we will:

Learn about the nature and limitations of human knowledge

Distinguish between rationalism and empiricism

Learn about different types of evidence

Set guidelines for evaluating evidence

Look at sources for researching claims and evidence

Study different types of cognitive/perceptual errors, including self-serving biases

Learn how social expectations and group pressure can lead to erroneous thinking

Finally, we will examine the evidence and arguments regarding unidentified flying objects (UFOs) and what type of proof would be necessary to establish their existence.

Group Pressure and Conformity

Group pressure can influence individual members to take positions that they would never support by themselves, as happened in the Stanford Prison experiment described in Chapter 1. Some religious cults exploit this tendency by separating their members from the dissenting views of family and friends. In many cults, people live together, eat together, and may even be assigned a buddy.

Group pressure is so powerful in shaping how we see the world that it can lead people to deny contrary evidence that is right before their eyes. In the 1950s, social psychologist Solomon Asch carried out a series of experiments in which he showed study subjects a screen containing a standard line on the left and three comparison lines on the right. One of the comparison lines was the same length as the standard line and the other two were of significantly different lengths.40 In each case, an unsuspecting study subject was introduced into a group with six confederates, who had been told by the experimenter to give the wrong answer. The group was then shown the lines. The experimenter asked one of the confederates which of the three lines on the right they thought was the same length as the standard line. The confederate, without hesitation, gave a wrong answer. The next few confederates gave the same answer. By now, the naïve subject was showing puzzlement and even dismay. How can six people be wrong?


After hearing six "wrong" answers, 75 percent of the naïve study subjects, rather than trust the evidence of their senses, succumbed to group pressure and gave the same wrong answer. Even more surprising is the fact that when questioned afterward, some of these study subjects had actually come to believe the wrong answer was correct.

The desire for agreement is normal. However, this desire, when combined with our innate tendency to divide the world into "one of us" and "one of them," can lead to the exclusion of those who disagree with the majority, since people tend to prefer being around people who agree with them. In the corporate world, disagreement is often tacitly discouraged. "Outliers" or nonconformists who do not agree with group members may be excluded by committee chairs from further discussions or even fired.41

Because of our inborn tendency to conform to what others think, we cannot assume that agreement leads to truth without knowledge of the manner and conditions under which the agreement was reached. Indeed, the current emphasis on seeking group consensus in decision making may be unreliable. In consensus seeking, the majority in a group is often able to sway the whole group to its view.



In Asch's experiment, the naïve subject (left) shows puzzlement when the other subjects give what is obviously a wrong answer.



What do you think the naïve subject in the picture above is thinking?

Think back to a time when you were in a similar situation where you thought you were correct, but everyone else with you thought something else. How did you respond to the discrepancy between your belief and theirs?


As with other errors in our thinking, we need to develop strategies to recognize and compensate for our human inclination to conform to groupthink. When a group comes to a decision, we need to mentally step back from the group and carefully evaluate the evidence for a particular position rather than assume that the majority must be correct. In competitive ice skating and diving, because of the danger of a judge's scoring being contaminated by what other judges say, scoring is done individually, rather than as a group decision.


Diffusion of Responsibility

Diffusion of responsibility is a social phenomenon that occurs in groups of people above a critical size. If responsibility is not explicitly assigned to us, we tend to regard it as not our problem but as belonging to someone else. We are much more likely to come to someone's aid if we are alone than if we are in a crowd.


This phenomenon is also known as bystander apathy or the Kitty Genovese syndrome. In 1964, twenty-eight-year-old Kitty Genovese was murdered outside her New York City apartment building. Her killer left twice, when people in the building turned on their lights, before he came back a third time and killed her. In the half hour that elapsed during the attack, none of Genovese's thirty-eight neighbors, who had heard her repeated cries for help, called the police. More recently, in May 2008, an elderly man was struck by a hit-and-run driver on a busy street in Hartford, Connecticut. The man lay in the street paralyzed and bleeding from his head while bystanders gawked at or ignored him. Motorists drove around his body without stopping. No one offered any assistance until an ambulance finally turned up. Diffusion of responsibility can also occur in group hazing at fraternities, where no one comes to the rescue of a pledge who is clearly in distress.

As social beings, we are vulnerable to the "one of us/one of them" error, social expectations, and group conformity. When in groups, we also tend to regard something as not our problem unless responsibility is assigned to us. Although these traits may promote group cohesiveness, they can interfere with effective critical thinking. As good critical thinkers we need to be aware of these tendencies, and to cultivate the ability to think independently while still taking into consideration others' perspectives. Errors in our thinking also make us more vulnerable to falling for or using fallacies in arguments. We'll be studying some of these fallacies in the following chapter.


The phenomenon of "diffusion of responsibility" was regrettably illustrated when no one came to the aid of a seriously injured man lying in a busy street in Hartford, Connecticut after being struck by a hit-and-run driver in May 2008. The victim, Angel Torres, later died from the injuries he sustained.





Whom do you define as "us" and whom do you put in the category of "them"? Discuss how you might go about widening the "us" category to include more people who are now in your "them" category.


Humans seem to have inborn biases toward particular types of people. According to a University of Florida study, when it comes to hiring, employers have a more favorable view of tall people. When it comes to earnings, every extra inch of height above the norm is worth $789 a year. In fact, nine of ten top executives are taller than the typical employee.42 Given this cognitive error and its impact on hiring practices, discuss whether or not affirmative action policies should apply to very short people. Relate your answer to the discussion in the text of the effect of this cognitive error on our thinking.


Think of a time when your social expectations led you to misjudge a person or a situation. Discuss strategies for improving your critical-thinking skills so that this is less likely to happen.


Think of a time when the public got caught up in a "witch hunt." Identify the worldviews and social expectations that supported this "witch hunt." Which critical-thinking skills would make you less likely to go along with a "witch hunt"? Discuss what actions you could take to develop or strengthen these skills.


Polls before elections can influence how people vote by swaying undecided voters to vote for the candidate who is in the lead. Analyze whether election polls should be forbidden prior to the election itself.


The democratic process depends on social consensus. Given people's tendency to conform to social expectations and what others think, is democracy the best form of government? If so, what policies might be put in place to lessen the effect of social biases? Be specific.


Think of a time when you failed to speak out against an injustice or failed to come to someone's aid simply because you were in a large group and felt it wasn't your responsibility. Discuss ways in which improving your critical-thinking skills may make you less susceptible to the diffusion of social responsibility error.


Computers (AI) programmed with an inductive logic program can, after sufficient experience working with the ups and downs of the financial market, predict the market with greater accuracy than most experienced financial planners. Given that these computers are not as prone to cognitive errors as are humans, critically evaluate whether we should rely more on AI to make decisions about such issues as college admissions, medical diagnoses, matchmaking, and piloting an airplane.


What are some of the sources of knowledge?



Sources of knowledge include both reason and experience. Experience encompasses direct and indirect experience, expert testimony, and research resources such as printed material and the Internet.


In what ways might experience be misleading?



Experience can be distorted through false memories, confirmation bias, and reliance on hearsay and anecdotal evidence, as well as perceptual, cognitive, and social errors in our thinking.


What are some of the types of cognitive and social errors in our thinking?



Cognitive and social errors are in part the way our brain interprets the world. They include misperception of random data, memorable-events errors, probability errors, self-serving biases, self-fulfilling prophecies, one of us/one of them error, social expectations, group pressure and conformity, and diffusion of responsibility.


Why is it that so many people obey when they feel coerced? Social psychologist Stanley Milgram researched the effect of authority on obedience. He concluded that people obey either out of fear or out of a desire to appear cooperative, even when acting against their own better judgment and desires. Milgram's classic yet controversial experiment illustrates people's reluctance to confront those who abuse power. It is my opinion that Milgram's book should be required reading for anyone in a supervisory or management position.

Milgram recruited subjects for his experiments from various walks of life. Respondents were told the experiment would study the effects of punishment on learning ability. They were offered a token cash award for participating. Although respondents thought they had an equal chance of playing the role of a student or of a teacher, the process was rigged so that all respondents ended up playing the teacher. The learner was an actor working as a confederate of the experimenter.

"Teachers" were asked to administer increasingly severe electric shocks to the "learner" when questions were answered incorrectly. In reality, the only electric shocks delivered in the experiment were single 45-volt shock samples given to each teacher. This was done to give teachers a feeling for the jolts they thought they would be discharging.

Shock levels were labeled from 15 to 450 volts. Besides the numerical scale, verbal anchors added to the frightful appearance of the instrument. Beginning from the lower end, jolt levels were labeled: "slight shock," "moderate shock," "strong shock," "very strong shock," "intense shock," and "extreme intensity shock." The next two anchors were "Danger: Severe Shock," and, past that, a simple but ghastly "XXX."

In response to the supposed jolts, the "learner" (actor) would begin to grunt at 75 volts; complain at 120 volts; ask to be released at 150 volts; plead with increasing vigor thereafter; and let out agonized screams at 285 volts. Eventually, in desperation, the learner was to yell loudly and complain of heart pain.

At some point the actor would refuse to answer any more questions. Finally, at 330 volts the actor would be totally silent, that is, if any of the teacher participants got that far without rebelling first.

Teachers were instructed to treat silence as an incorrect answer and apply the next shock level to the student.

If at any point the innocent teacher hesitated to inflict the shocks, the experimenter would pressure him to proceed. Such demands would take the form of increasingly severe statements, such as "The experiment requires that you continue."

What do you think was the average voltage given by teachers before they refused to administer further shocks? What percentage of teachers, if any, do you think went up to the maximum voltage of 450?

The results of the experiment were striking. Some teachers refused to continue with the shocks early on, despite urging from the experimenter. This is the type of response Milgram expected as the norm. But Milgram was shocked to find that those who questioned authority were in the minority: sixty-five percent (65%) of the teachers were willing to progress to the maximum voltage level.

Participants demonstrated a range of negative emotions about continuing. Some pleaded with the learner, asking the actor to answer questions carefully. Others started to laugh nervously and act strangely in diverse ways. Some subjects appeared cold, hopeless, somber, or arrogant. Some thought they had killed the learner. Nevertheless, participants continued to obey, discharging the full shock to learners. One man who wanted to abandon the experiment was told the experiment must continue. Instead of challenging the decision of the experimenter, he proceeded, repeating to himself, "It's got to go on, it's got to go on."

Milgram's experiment included a number of variations. In one, the learner was not only visible but teachers were asked to force the learner's hand to the shock plate so they could deliver the punishment. Less obedience was extracted from subjects in this case. In another variation, teachers were instructed to apply whatever voltage they desired to incorrect answers. Teachers averaged 83 volts, and only 2.5 percent of participants used the full 450 volts available. This shows most participants were good, average people, not evil individuals. They obeyed only under coercion.

In general, more submission was elicited from "teachers" when (1) the authority figure was in close proximity; (2) teachers felt they could pass on responsibility to others; and (3) experiments took place under the auspices of a respected organization.

Participants were debriefed after the experiment and showed much relief at finding they had not harmed the student. One cried from emotion when he saw the student alive, and explained that he thought he had killed him. But what was different about those who obeyed and those who rebelled? Milgram divided participants into three categories:

Obeyed but justified themselves. Some obedient participants gave up responsibility for their actions, blaming the experimenter. If anything had happened to the learner, they reasoned, it would have been the experimenter's fault. Others had transferred the blame to the learner: "He was so stupid and stubborn he deserved to be shocked."

Obeyed but blamed themselves. Others felt bad about what they had done and were quite harsh on themselves. Members of this group would, perhaps, be more likely to challenge authority if confronted with a similar situation in the future.

Rebelled. Finally, rebellious subjects questioned the authority of the experimenter and argued there was a greater ethical imperative calling for the protection of the learner over the needs of the experimenter. Some of these individuals felt they were accountable to a higher authority.

Why were those who challenged authority in the minority? So entrenched is obedience that it may override personal codes of conduct.