Do What You’re Told: Why the Milgram experiment still haunts me

George on the street
I have no question that I know myself better now than I did at age 19. But it’s what you don’t know about yourself that gets you in trouble.
If you know what the Milgram experiment was, odds are that you are middle-aged or older, with a BA in the liberal arts, or even, god forbid, psychology. Less likely is that the experiment remains in your memory among your collection of cautionary tales, one you may haul out in conversations where the topic gets round to questions of obedience, authority and moral compasses. (Why don’t we ever hear about ethical compasses? Because ethics are rules and roadmaps created by authorities purportedly to help us find our way if we lose our moral compasses. For the difference see either diffen.com or wisegeek.org.)

In the early 1960s, Stanley Milgram, a Yale psych professor, recruited subjects to take part in a learning experiment. The volunteer was to read strings of words to a person he thought was another volunteer, but who was, in fact, a member of the team. Each time the learner got the words wrong, the volunteer was supposed to push a button that would deliver an electric shock. The lab-coated director told the volunteer to keep raising the voltage if the learner couldn’t get the words right. And a majority of volunteers, even as they protested, followed the instructions of the director, raising the voltage often past the point where the learner was apparently crying out in pain. (The learner’s screams were actually tape recordings, to provide continuity from one session to another.)

“…His face was red. His eyes were bloodshot and staring straight into mine. ‘Don’t you ever ever do that to me again,’ he shouted.”

Late last year the Journal of Social Issues published an issue reexamining Milgram’s work, what it may or may not have proved, and the controversy and questions about human nature still left unanswered in its wake. For more detail see Cari Romm’s recent article in The Atlantic.

I came across the Milgram experiment in 1968 in a psychology course in the fall of my sophomore year at Purdue University. I had flunked out as a freshman physics major, and had decided to try my luck as a psych major, a.k.a. in those days Sex, Drugs and Rock ‘n’ Roll.

The My Lai massacre had happened half a year earlier. My professor was an earnest radical stuck in an egregiously conservative midwestern university and determined to wake his students up to the dark realities of the world. The film he showed us on the Milgram experiment did the job. The message I took away, and which I believe was what the prof intended, was not so much that there are lots of morally bereft people in the world who will follow the orders of evil leaders, but that until each of us is put in that kind of a situation, we don’t really know how we will act.

As a cocksure 19-year-old, I was, of course, outraged and certain that I would have been one of the volunteers who walked out rather than dial the shocks higher and higher. But as the years went on, I began to wonder if that were true. It was the first time I realized that we face, if we are honest, moral choices throughout our lives which tell us not so much who we are, but where we are, and serve to help us calibrate our compasses.

Moral compasses: Most of us have one that is more or less functional, and if we are living anything close to a right life, our moral compass gets better as we get older. And we get better at reading it.

You know you’ve got one if somebody with authority over you tells you to do something and you just don’t feel OK with it. You are a researcher at an investment ratings firm, and your boss tells you to change up the numbers. Or you are a reporter and your editor tells you to cut some quotes that could damage the mayor’s favorite development project. Or you are a young German soldier and your commanding officer tells you to round up all the Jews in an apartment building and take them to the railroad siding.

I’ll confess—and this is the reason the Milgram experiment haunts me—I was born with the most rudimentary of moral compasses. I had two ways of dealing with authority as a kid—obey it obsequiously, or lie and dissemble and run and hide from it.

My mother favored ethical structures over moral guidelines, and so was not able to give me much practice in developing my own vestigial sense of morality. I’m not blaming her. She had her reasons for needing to live her life according to an invariable set of rules, but telling a child what to do at every turn is not the same as helping a child to make choices and to learn by experience from the results of those choices.

I believe that a lucky few are born with a well-developed sense of morality, of right and wrong. My wife Joan and I saw suggestions of it early on in our son Morgan, but I got the proof one evening when he was two years old. Out of frustration at something he was doing, or not doing, in direct conflict with what I wanted him to do, or not do, I decided to try to impress him with the gravity of the situation by giving him a couple of token spanks on his diapered butt. He jumped out of my lap, picked up a short 2×4 left over from a building project and raised it over his head. His face was red. His eyes were bloodshot and staring straight into mine. “Don’t you ever ever do that to me again,” he shouted. I followed his advice.

Implicit in the imperative “Do what I tell you” is the threat, “…or else”. Or else, what? And that’s the bogeyman behind all credible authority. In the dynamics of authoritarian relationships, the threat often doesn’t even have to be stated. Fudge the numbers for your boss or you’ll get fired. But the consequences of opposing authority don’t have to be that explicit. From the beginnings of learning language as babies, we also learn about authority and the consequences of opposing it. We soak it into our little bones.

The “Do what I say” from parents gradually morphs into the “Do what you are told” of society at large, at which point you may, or may not, realize you’ve internalized both the demand and the implied threat.

That’s why Milgram’s lab director didn’t have to say, “Or else”. The majority of volunteers who went beyond the reasonable level on the voltage machine probably made up their own consequences, what the “Or else” was to them, without even realizing it. Imagine, by the way, how you’d feel if you’d been one of the volunteers who ran the needle all the way up, and then had to confront what you’d done after the ruse was revealed to you. (One of the controversial aspects of Milgram’s methodology was that many volunteers apparently were not debriefed. It would pretty much be impossible to duplicate the experiments now, because the ethics, the rules, governing the use of human subjects have changed.)

So the experiment haunts me because I do not know how far I would have gone. I am confident that if I were in the experiment now, at age 65, I’d quit easily when it felt wrong. But I must admit that I do not know my 19-year-old self any better now than I did then. Why does that matter to me?

It matters because late at night, as I lie sleepless sometimes, I wonder what self-knowledge I’m missing even now. What adjustments to my moral compass have I been making, should I be making, or have I failed to make which will come to haunt me 20 years from now? What forces of authority are active in my life now, pressures I may not even be aware of, yet am influenced by in ways that won’t be clear for years? What questions am I not asking, what assumptions am I making based on invisible biases, what corners am I too lazy, too busy, or too deep in a rut to look around for a different perspective? How might I still be using, or abusing, my own authority?

What authority embedded in my brainstem may still be whispering, so quietly I can’t even hear it, “Do what you are told, or else”?

-END-

To comment, email me at geopacko “at” gmail dot com