Why Do Good People Become Silent—or Worse—About 9/11?
    Written by Frances T. Shure   
    Tuesday, 21 January 2014 02:40

    Part 3: Obeying and Believing Authority 

    © by Frances T. Shure, 2014

Editor’s Note: Frances Shure, M.A., L.P.C., has performed an in-depth analysis addressing a key issue of our time: “Why Do Good People Become Silent—or Worse—About 9/11?” The resulting essay, to be presented here as a series, is a synthesis of reports on academic research as well as clinical observations.

    In answering the question in the title of this essay, last month’s segment, Part 2, addressed the anthropological study, Diffusion of Innovations, which discusses how change occurs in societies. These anthropologists discovered that, within diverse cultures, there can be found groups that vary in their openness to new ideas and technology—groups that fall within a neat bell curve. The success of the spread of an innovative technology or new idea reliably hinges on one point: whether or not opinion leaders vouch for it. In this context, the mainstream media can rightly be seen as promoting the official myth of 9/11, and therefore aiding and abetting the crimes of September 11, 2001.

    We continue Ms. Shure’s analysis in Part 3 with the authority experiments of Stanley Milgram, Jane Elliott, and Philip Zimbardo.

In his famous 1961 experiment on obedience to authority, Yale University psychologist Stanley Milgram set out to answer the question, “Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?”1

    Three people made up each of Milgram’s experiments: an experimenter (the authority); the subject of the experiment, a volunteer who was told that he or she was a “teacher”; and a confederate (a plant) who was thought by the subject to be a “student” or “learner,” but who was actually an actor.

The “teacher” (subject) was given a sample electrical shock that the “student” (actor) would supposedly receive. Then the teacher read a list of word pairs to the student, and the student would press a button to give his answer. If the response was correct, the teacher would go on to the next list of word pairs, but if the answer was wrong, the teacher would administer an electric shock to the student, with the shocks increasing in 15-volt increments for each succeeding incorrect answer. In reality, no electric shocks were administered, but pre-recorded sounds of pain would play at certain shock levels. At a higher level of the supposed shocks, the actor would bang on the wall separating him from the teacher and complain of his heart condition. At a still higher shock level, all sounds from the student ceased.

    Whenever a teacher would want to stop the experiment, the authority had a predetermined set of verbal prods, given in this order:

    1. Please continue.

    2. The experiment requires that you continue.

    3. It is absolutely essential that you continue.

    4. You have no other choice. You must go on.

    If, after the fourth prod, the subject still indicated a desire to stop, the experiment was halted. Otherwise, it was terminated only after the subject delivered what he or she thought was the maximum 450-volt shock three times in succession.
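For readers who want the escalation rules in one place, here is a minimal sketch of the session logic in Python. It is an illustration only: the function and callback names are hypothetical, and every answer is assumed to be incorrect for simplicity; only the fixed numbers (15-volt steps, a 450-volt maximum, three final shocks) and the four prods come from the account above.

    # Minimal sketch of the Milgram session rules described above.
    # The decision callback is a hypothetical stand-in for a subject.

    PRODS = [
        "Please continue.",
        "The experiment requires that you continue.",
        "It is absolutely essential that you continue.",
        "You have no other choice. You must go on.",
    ]

    STEP_VOLTS = 15    # shocks escalate in 15-volt increments
    MAX_VOLTS = 450    # highest switch on the shock generator

    def run_session(teacher_balks):
        """Simulate one session, treating every answer as incorrect.

        `teacher_balks(volts, prods_given)` is a hypothetical callback
        that returns True while the "teacher" refuses to continue.
        Returns ("halted", volts) or ("completed", MAX_VOLTS).
        """
        volts = STEP_VOLTS
        final_shocks = 0
        while True:
            prods_given = 0
            # Each refusal is met with the next prod, in the fixed order.
            while teacher_balks(volts, prods_given):
                if prods_given == len(PRODS):
                    # Still refusing after the fourth prod: halt the session.
                    return ("halted", volts)
                print(PRODS[prods_given])
                prods_given += 1
            if volts < MAX_VOLTS:
                volts += STEP_VOLTS        # next wrong answer, next level
            else:
                final_shocks += 1
                if final_shocks == 3:      # three 450-volt shocks in a row
                    return ("completed", volts)

    # Example: a fully obedient subject delivers the whole sequence.
    print(run_session(lambda volts, prods: False))   # -> ('completed', 450)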

[Diagram: the setup of the Milgram experiment]

The results surprised the respondents to a poll taken beforehand (Yale senior-year psychology students, Milgram’s colleagues, and some local psychiatrists), all of whom had predicted that only a very small fraction of the subjects would administer the maximum shock. In fact, Milgram found that approximately two-thirds of his subjects would willingly administer what they thought was the maximum, potentially lethal, 450-volt shock to a student, although many were very uncomfortable doing so.

    In his article, “The Perils of Obedience,” Milgram summarized the results of his groundbreaking study:

    Stark authority was pitted against the subjects’ strongest moral imperatives against hurting others, and, with the subjects’ ears ringing with the screams of the victims, authority won more often than not. The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.2

    A modified version of this experiment delivered some good news to those of us confronting the lies and abuses of authorities:

    In one variation, three teachers (two actors and a real subject) administered a test and shocks. When the two actors disobeyed the experimenter and refused to go beyond a certain shock level, thirty-six of forty subjects joined their disobedient peers and refused as well.3

    The lesson for 9/11 skeptics is not difficult to grasp: If we continue pushing through our own taboo barriers and the resistance of others and confidently speak our truth with solid information, our peers throughout the world will gradually join us.

Nevertheless, the data from the original Milgram experiments can still “shock” us, as they did the world in the 1960s. For me, as an undergraduate student, learning that two-thirds of us would deliver a potentially lethal shock to a helpless and ill person was disturbing and life-changing: Having been reared by fairly authoritarian parents, I knew that it was likely I would also have followed those orders! I resolved, therefore, to never blindly follow authority, but instead to listen to and trust my own inner guidance and conscience.

    But do these findings apply to firmly believing what an authority tells us? We might suspect that sometimes we follow the orders of an authority, but we do not always deeply believe what this authority proclaims (e.g., 19 Muslims attacked us because they hate our freedoms). Empirical evidence suggests that, yes, these findings do apply, especially if our fear has intensified and we respect that particular authority (e.g., George W. Bush or Barack Obama).

[Photo: third-grade teacher Jane Elliott]

An astonishing social experiment by third-grade teacher Jane Elliott demonstrates the power of our human proclivity to believe a trusted authority—and even to develop our identity based on what this authority tells us about ourselves. Following the assassination of Martin Luther King, Jr., Elliott wanted to help her all-white third-graders in a small town in Iowa to understand prejudice. One day she told them:

    Today, the blue-eyed people will be on the bottom and the brown-eyed people on the top. What I mean is that brown-eyed people are better than blue-eyed people. They are cleaner than blue-eyed people. They are more civilized than blue-eyed people. And they are smarter than blue-eyed people.

Brown-eyed children were given a longer recess, the use of the bigger playground equipment, and first place in line for lunch and for second helpings. Elliott instructed the blue-eyed children not to play with brown-eyed children unless asked, and to sit in the back of the room. Each brown-eyed child was given a collar to put around the neck of a blue-eyed child. Throughout the day, the teacher reinforced the message that brown-eyed children were superior and blue-eyed children were inferior.

    By lunchtime, the behavior alone of the children revealed whether they had brown or blue eyes:

    The brown-eyed children were happy, alert, having the time of their lives. And they were doing far better work than they had ever done before. The blue-eyed children were miserable. Their posture, their expressions, their entire attitudes were those of defeat. Their classroom work regressed sharply from that of the day before. Inside of an hour or so, they looked and acted as if they were, in fact, inferior. It was shocking.

    But even more frightening was the way the brown-eyed children turned on their friends of the day before….4

    Jane Elliott reversed the experiment the next day, labeling the blue-eyed children as superior, and the same thing happened in reverse.

    At the end of the day, she told her students that this was only an experiment and there was no innate difference between blue-eyed and brown-eyed people. The children took off their collars and hugged one another, looking immensely relieved to be equals and friends again. An interesting aspect of the experiment is how it affected learning…. Once the children realized that their power to learn depended on their belief in themselves, they held on to believing they were smart and didn’t let go of it again.5

    But surely, adults would be immune to such social pressure and manipulation, right? Wouldn’t adults be able to discern and resist what children cannot? Surely, as adults, our very identity would not be affected by such manipulation, would it?

Social psychologist Philip Zimbardo’s famous Stanford Prison Experiment, conducted in the early 1970s and strikingly similar in design to Jane Elliott’s classroom exercise, proves this understandable assumption largely wrong.

Zimbardo and his colleagues used 24 male college students as subjects, dividing them arbitrarily into “guards” and “inmates” within a mock prison. Zimbardo instructed the “guards” to act oppressively toward the “prisoners,” thereby assuming the role of an authority himself.

    All students knew this was an experiment, but surprising even the experimenters, they nevertheless quickly internalized their roles as brutal, sadistic guards or emotionally broken prisoners. The “prison system” set up by the experimenters and the subsequent dynamic that developed had such a deleterious effect on the subjects that the study was terminated on the sixth day. However, this did not happen until graduate psychology student Christina Maslach—whom Philip Zimbardo was dating and who subsequently became his wife—brought to his attention the unethical conditions of the experiment.6

[Photos of subjects in the Stanford Prison Experiment]

This experiment, like the Milgram and Elliott studies, demonstrates the human tendency to believe and follow authority. The Zimbardo and Elliott studies demonstrate that our very identities are affected by what authority tells us and that peer pressure powerfully reinforces these human tendencies. As a result, Milgram’s subjects, Elliott’s third-graders, and Zimbardo’s adult students committed atrocities, even in violation of cherished moral values.

Zimbardo became an expert defense witness at the court-martial of one of the night-shift guards, Ivan “Chip” Frederick, of the infamous “Abu Ghraib Seven.” Because of his experience with the Stanford Prison Experiment, Zimbardo argued that it was the situation that had brought out the aberrant behaviors in otherwise good people. While the military argued that these guards were a few “bad apples” in an otherwise good U.S. Army barrel, Zimbardo argued that they were normal, good apples in a very, very bad barrel.

Chip Frederick pleaded guilty and received a sentence of eight years in prison, Zimbardo’s testimony having had little effect on the outcome. The other guards who were found guilty received sentences ranging from zero to ten years; the discrepancy in sentences seemed to make no sense.

What is the truth? Were these night-shift guards only a few “bad apples” in a good barrel, or was the barrel itself contaminated? The Army itself stated that, since October 2001, there had been more than 600 accusations of abuse of detainees. Many more went unreported, including abuse of “ghost detainees,” those unfortunate souls who, under the control of the CIA, were never identified and were often “rendered” to torture states. Many of these victims were essentially “disappeared.” By extension, there were obviously many “ghost abusers” who were never held accountable.

To support his accusation that the barrel, rather than the apples, was toxic, Zimbardo puts the system itself on trial in The Lucifer Effect. He finds that the orders, the expectations, and the pressure to torture came from the very top of the chain of command, and his analyses find Secretary of Defense Donald Rumsfeld, CIA Director George Tenet, Lieutenant General Ricardo Sanchez, Major General Geoffrey Miller, Vice President Dick Cheney, and President George W. Bush all guilty.

    Zimbardo’s detailed analyses conclude that “this barrel of apples began rotting from the top down.” Yet he also praises the many heroes, the whistle-blowers from the bottom to the top of the military hierarchy, those human beings who risked their lives and careers to stand up and to stand strong against the toxic system.7

Why do some people conform to the expectations of the system while others find the courage to remain true to their principles? Throughout this essay there are pointers toward answers from the perspective of developmental and depth psychology, but to explore this immensely important subject in detail would require a separate work. Zimbardo, however, begins this exploration from a social psychologist’s perspective, declaring that we are all “heroes in waiting” and offering suggestions on how to resist undesirable social influences.8

    It is my firm belief that 9/11 skeptics—and true skeptics of any paradigm-shifting and taboo subject—who publicly expose lies and naked emperors are heroes who have come out of waiting, for we have suffered the ridicule and wrath of those emperors, their minions, and the just plain frightened.

    These three studies—Milgram’s study on obedience to authority, Elliott’s Blue Eyes/Brown Eyes Exercise, and Zimbardo’s Stanford Prison Experiment—demonstrate our human proclivity to trust and obey authority. Another question arises for us: Is this predisposition encoded genetically? Evidence appears to support this.

    To survive as babies and young children, we automatically look to our parents for confirmation of safety or danger.

Chimpanzees, whose genome matches ours by at least 94%,9 generally have one or more alpha male leaders, albeit often chosen by the females of the troop.10 Bonobos, with a genome close to that of the chimpanzees and thus to humans, have a matriarchal system with a female leader.11

    And, of course, human communities have leaders. Thus, the need for a leader, an authority, appears to be genetically hardwired. If we have been reared in an authoritarian family and school system, then this tendency to rely on authority figures for confirmation of reality is likely reinforced. Conversely, if we are reared in a family, school system, and cultural context that rewards critical thinking and respects our feelings and needs, then the tendency to rely on authority figures would likely be weakened.

    In our American society, many of our officials routinely lie to and abuse us, but nonetheless, many citizens continue to look to them for truth and safety—especially when fear is heightened. This strong tendency to believe and obey authority is another obstacle with which skeptics of the official 9/11 account must contend.

    By unquestioningly believing and obeying authority, we develop and perpetuate faulty identities and faulty beliefs, and to top it off, we make very bad decisions, which often negatively affect others. This can be equally true for the next four human proclivities studied by social psychologists: doublethink, cognitive dissonance, conformity, and groupthink.

    Editor’s note: To be continued in our next newsletter with Part 4: George Orwell’s brilliant observation of “Doublethink.”



    1 Stanley Milgram, Obedience to Authority: An Experimental View (Harper & Row Publishers, Inc., 1974).

2 Stanley Milgram, “The Perils of Obedience,” Harper’s Magazine (1974). Can be accessed at http://www.age-of-the-sage.org/psychology/milgram_perils_authority_1974.html.

    3 Ibid.

4 Dennis Linn, Sheila Fabricant Linn, and Matthew Linn, Healing the Future: Personal Recovery from Societal Wounding (Paulist Press, 2012), 56–60. William Peters, A Class Divided: Then and Now, expanded ed. (New Haven: Yale University Press, 1971); this book includes an account of Jane Elliott conducting a similar experiment for adult employees of the Iowa Department of Corrections. Documentary films that also tell this story are The Eye of the Storm, ABC News, 1970, distributed in DVD format by Admire Productions, 2004, http://www.admireentertainment.com, and A Class Divided, by Yale University Films, 1986, presented on Frontline and distributed in DVD format by PBS Home Video, www.pbs.org; both programs include study guides for use with groups.

5 Dennis Linn, Sheila Fabricant Linn, and Matthew Linn, Healing the Future, 57–58.

    6 Philip Zimbardo, The Lucifer Effect: Understanding How Good People Turn Evil (Random House Trade Paperbacks, 2008). Also, see http://www.simplypsychology.org/zimbardo.html.

7 Zimbardo, The Lucifer Effect, 324–443.

8 Ibid., 444–488.

    9 http://www.scientificamerican.com/article.cfm?id=human-chimp-gene-gap-wide

    10 http://en.wikipedia.org/wiki/Chimpanzee

11 http://en.wikipedia.org/wiki/Bonobo
