Recent ethical failings at Cricket Australia, the banks, and the Murray-Darling Basin Authority have left many of us wondering how all these bad people got into positions of power, what must be done to punish them, and how to deter others from doing the same.
But research indicates that only about 4% of the population is ‘bad’, and that most unethical outcomes are unintentionally created by ‘good’ people. Why do good people create bad outcomes, and what can we do to avoid these pitfalls ourselves?
Charging dead people bank fees, tampering with the ball in cricket, excessively sedating old people in aged-care homes – viewed in isolation these things are confronting and cause us to look immediately for the ‘bad’ person to blame. But are all bad things really done by bad people? Social psychology research (Stout, 2005) suggests that only about 4% of the population could be considered ‘bad’ – that is, habitually acting in an amoral and antisocial way. And although other research (Babiak & Hare, 2006) indicates that the corporate world may in fact attract those with sociopathic tendencies, it’s a stretch to blame all bad things on bad people. So what’s going on? How do good people end up unintentionally creating bad outcomes?
The field of moral development has been dominated by Piaget’s (1932) work with children. In translating this work to adults and the business world some key assumptions (Kohlberg, 1976) were made about how people make ethical decisions. First and foremost was the assumption that ethical reasoning is a rational, logical process, and that bad things are done by people lacking in moral development – either because of bad values, bad character, or greed. Hence, we assume that ethical reasoning can be taught and that if a bad thing occurs then we must weed out the bad apples (the greedy ones) and re-educate the others with character and values training.
Ethics training makes no difference
The only problem is that a large body of work stretching over 30 years (Craft, 2013; O’Fallon & Butterfield, 2005) indicates that ethics training makes no difference – as the recent Royal Commission into banking has made plain to anyone who hasn’t had their head in the sand. Other research (Cima, Tonnaer, & Hauser, 2010) shows that psychopaths have the same capacity for ethical reasoning as the rest of us – they just don’t care.
Social psychology research also reveals the counter-intuitive effects of situational factors and circumstance on human behaviour. Darley and Latané (1968), for example, studied the situational forces that freeze the pro-social actions of ordinary people in New York. They found that the more people who witness an emergency, the less likely it is that any one of them will intervene – the so-called bystander effect.
The effects of anonymity
Cultural anthropologist Watson Jr. (1973) studied 23 societies around the world using data from the Human Relations Area Files. In 15 of these societies, warriors changed their appearance before battle – and 80% of those societies (12 of 15) brutalised their enemies. Of the eight societies where warriors did not change their appearance, seven did not engage in such behaviour. Social psychologist Philip Zimbardo (2007, p. 25), the architect of the infamous Stanford Prison Experiment, proposes that, “Any setting that cloaks people in anonymity reduces their sense of personal accountability and civic responsibility for their actions.”
Other examples of this process of de-individuation include military uniforms, such as those of the SS in World War II, and modern police riot equipment that disguises the individual. These factors matter in our modern online world, where anonymity is easy.
We also have issues with obedience to authority
In a well-known experiment – one that would never be allowed today – Stanley Milgram (1974) instructed participants to administer increasingly severe electric shocks to a ‘learner’ each time the learner gave a wrong answer. Despite screams from the learners (who were actors), 65% of participants (26 of 40) in Milgram’s first set of experiments were willing to administer what they believed was a 450-volt shock.
This research shows how situational factors can outweigh character in driving behaviour. Dehumanisation – facilitated by labels, stereotypes, slogans, and propaganda images – also enables the suppression, inhibition, and distortion of the emotions associated with unethical behaviour. Albert Bandura (1999), another social psychologist, calls this process moral disengagement.
New York gangs studied
How does this happen, though? It’s easy to see delinquent youth as anti-social, and to assume they hold anti-social values, especially when confronted with gang-like behaviour. But research carried out in New York back in the 1950s (Sykes & Matza, 1957) found this to be untrue. The researchers found instead that delinquent youth held the same values as mainstream society, but engaged in a series of flawed justifications that allowed them to do bad things without having to reassess themselves as bad. These justifications included:
The denial of responsibility
The key here is that individuals see their actions as ‘unintentional’ and themselves as not responsible, due to forces beyond their control – a poor upbringing, unloving parents, or ‘just following orders’. ‘It’s not my fault’ is the catch-cry.
The denial of injury
The distinction here is that the act is seen as wrong but not immoral, because ‘it’s not hurting anyone’. Graffiti is a common example.
The denial of the victim
This denial neutralises the rights of the victim: the circumstances are seen to justify the act, and the perpetrator may even be cast as an ‘avenger’. Robin Hood robbing the rich to give to the poor is the classic example, where the justification is ‘they deserved it’.
The condemnation of the condemners
Claims of unfairness and hypocrisy are key here, with the condemners’ motives being questioned: police are corrupt, teachers are unfair, parents take their issues out on their kids. The wrongfulness of the act itself is repressed. ‘You think I’m bad but you should see them’ would be the cry.
The appeal to higher loyalties
Societal norms are rejected in favour of higher loyalties – to family, fellow gang members, and so on. The extreme examples are bikie gangs and street gangs with their ‘codes’: ‘live by the code of brotherhood’.
More recent research
Heath (2008) has added two more justifications to the list:
Everyone else is doing it
The key here is that the perpetrator claims they have no choice. This is particularly prevalent in competitive situations, such as doping in elite sport, where the justification would be ‘everyone else was doing it so I had no choice other than to follow suit’.
Claim to entitlement
Entitlement is a justification based on rights or karma: ‘I did this so therefore I deserve that’. An example might be ‘I have worked back for the last five days straight so I deserve to use the company credit card to buy myself and my family dinner’.
Put all this together and you can start to see how good people unintentionally create bad outcomes. These flawed justifications work by neutralising our values and protecting our moral self-identity. For example, most people believe in the value of safe driving, but many of us have broken the speed limit, especially late at night with no one else on the road. Our justification may be that it’s not hurting anyone and it’s a stupid rule anyway. In this way we can do a bad thing without having to reassess our self-identity as bad.
Similarly, we can be caught up in bad behaviour when everyone around us is doing the same thing – this is called ‘groupthink’ – and the justification is that everybody else is doing it. Watch the Banking Royal Commission testimonies if you’d like to see some examples.
I’m smarter than that
You may be thinking, “Yes, but that’s not me – I’m smarter than that.” Well, think again. The research (Aquino & Reed, 2002) shows that the more attached you are to your moral self-identity, the more likely you are to engage in flawed justifications.
In addition, we think we are more ethical than others (Tenbrunsel, Diekmann, Wade-Benzoni, & Bazerman, 2007), we re-write history in our own favour, and perceptual blindness (Chugh & Bazerman, 2007) may prevent us from even seeing the ethical dilemma in the first place. Consider former Australian cricket captain Steve Smith, former NAB Chairman Ken Henry, and former NAB CEO Andrew Thorburn…
So, what now?
How do we deal with all this? Values training won’t work, character is arguably already set, and we are all clearly subject to a range of cognitive biases and blind spots.
A good starting point is intention. What are you actually trying to create, from a moral perspective? This means an outcome bound by intrinsic values such as honesty, honour, justice, and fairness. One of the contributing factors in ethical blindness is using the wrong decision-making criteria – for example, using business decision-making criteria (read: profit maximisation) instead of ethical decision-making criteria.
Lawyers or scientists?
Next, be aware that the rational mind can act more like a lawyer defending a client than an impartial scientist examining the facts. We readily engage our rational mind to justify what we did after the event, using one of the seven flawed justifications above, instead of deciding what we want to do – and how this is bound by our values – before the event.
It is here that our hard wiring also works against us, because from a survival perspective we are cognitive misers. We try to conserve energy by pattern-matching against the past instead of assessing all of the facts in the moment. The result is that unethical outcomes rarely happen suddenly; they build slowly and cumulatively over time. A fix here is to listen for any of the seven flawed justifications and to develop mindfulness and present-moment awareness.
Remember that you are not so smart
If you want something really badly, or if you have been incentivised by organisations dangling carrots such as bonuses, then you may be blind to the ethical aspect of a problem. One way to get around this is to engage friends and trusted advisors to wear the ‘white hat’. Ask them to be the ethics person and ask the ethical questions:
- How does this support your values and principles?
- Is it in the best interests of all concerned?
- Does it uphold the golden rule of doing unto others as you would wish them to do unto you?
We now face an online, interconnected world that encourages anonymity and dehumanisation, where injustices such as wealth inequity and cyber-bullying can prompt even the best of us to do bad things. Now, more than ever, is the time to reconnect with our values, understand the flawed nature of being human, and build community. Remember: a rising tide floats all boats.
Aquino, K., & Reed, A., II. (2002).
The self-importance of moral identity. Journal of Personality and Social Psychology, 83(6), 1423-1440. doi:10.1037/0022-3514.83.6.1423
Babiak, P., & Hare, R. D. (2006).
Snakes in Suits: When Psychopaths go to Work. New York: Regan Books.
Bandura, A. (1999).
Moral disengagement in the perpetration of inhumanities. Personality and Social Psychology Review, 3(3), 193-209.
Chugh, D., & Bazerman, M. H. (2007).
Bounded Awareness: What You Fail to See Can Hurt You. Mind & Society, 6(1), 1-18.
Cima, M., Tonnaer, F., & Hauser, M. (2010).
Psychopaths know right from wrong but don’t care. Social Cognitive and Affective Neuroscience, 5(1), 59-67.
Craft, J. L. (2013).
A Review of the Empirical Ethical Decision-Making Literature: 2004–2011. Journal of Business Ethics, 117(2), 221-259. doi:10.1007/s10551-012-1518-9
Darley, J. M., & Latané, B. (1968).
Bystander Intervention in Emergencies: Diffusion of Responsibility. Journal of Personality and Social Psychology, 8(4, Pt. 1), 377-383.
Heath, J. (2008).
Business Ethics and Moral Motivation: A Criminological Perspective. Journal of Business Ethics, 83(4), 595-614. doi:10.1007/s10551-007-9641-8
Kohlberg, L. (1976).
Moral stages and moralization: The cognitive-developmental approach. In T. Lickona (Ed.), Moral Development and Behavior: Theory, Research and Social Issues. New York: Holt, Rinehart and Winston.
Milgram, S. (1974).
Obedience to Authority: An Experimental View. New York: Harper & Row.
O’Fallon, M. J., & Butterfield, K. D. (2005).
A Review of The Empirical Ethical Decision-Making Literature: 1996–2003. Journal of Business Ethics, 59(4), 375-413. doi:10.1007/s10551-005-2929-7
Piaget, J. (1932).
The moral judgement of the child (M. Gabain, Trans.). New York: Free Press.
Stout, M. (2005).
The Sociopath Next Door. New York: Broadway Books.
Sykes, G., & Matza, D. (1957).
Techniques of Neutralization: A Theory of Delinquency. American Sociological Review, 22(6), 664-670.
Tenbrunsel, A. E., Diekmann, K. A., Wade-Benzoni, K. A., & Bazerman, M. H. (2007).
Why We Aren’t as Ethical as We Think We Are: A Temporal Explanation. Working paper, Harvard University, Boston, MA.
Watson Jr., R. J. (1973).
Investigation into Deindividuation Using a Cross-Cultural Survey Technique. Journal of Personality and Social Psychology, 25(3), 342-345.
Zimbardo, P. (2007).
The Lucifer Effect: How Good People Turn Evil. London: Rider.