Where I intend to be divisive is with respect to the argument that religion, and moral education more generally, represent the only — or perhaps even the ultimate — source of moral reasoning. If anything, moral education is often motivated by self-interest, to do what's best for those within a moral community, preaching singularity, not plurality. Blame nurture, not nature, for our moral atrocities against humanity. And blame educated partiality more generally, as this allows us to lump into one category all those who fail to acknowledge our shared humanity and fail to use secular reasoning to practise compassion.

The second deals with the reasons for moral atrocities in the world:

If religion is not the source of our moral insights — and moral education has the demonstrated potential to teach partiality and, therefore, morally destructive behaviour — then what other sources of inspiration are on offer?
One answer to this question is emerging from an unsuspected corner of academia: the mind sciences. Recent discoveries suggest that all humans, young and old, male and female, conservative and liberal, living in Sydney, San Francisco and Seoul, growing up as atheists, Buddhists, Catholics and Jews, with high school, university or professional degrees, are endowed with a gift from nature, a biological code for living a moral life.
This code, a universal moral grammar, provides us with an unconscious suite of principles for judging what is morally right and wrong. It is an impartial, rational and unemotional capacity. It doesn't dictate who we should help or who we are licensed to harm. Rather, it provides an abstract set of rules for how to intuitively understand when helping another is obligatory and when harming another is forbidden. And it does so dispassionately and impartially. What's the evidence?
To experience what subjects in some of our studies experience, see the moral sense test. It asks for information about gender, age, nationality, education, politics and religion. Once logged in, participants work through a series of scenarios, judging whether a particular action is morally forbidden, permissible or obligatory.
Most of the scenarios involve genuine moral dilemmas. All are unfamiliar, for a reason. Unfamiliar and artificial cases have an advantage over familiar scenarios, such as abortion, euthanasia and charitable donations: no one has a well-rehearsed and explicit moral argument for such cases, and for all the cases we create, neither the law nor religious scripture provides any guidance.
For example, if five people in a hospital each require an organ to survive, is it permissible for a doctor to take the organs of a healthy person who happens to walk by the hospital? Or if a lethal gas has leaked into the vent of a factory and is headed towards a room with seven people, is it permissible to push someone into the vent, preventing the gas from reaching the seven but killing the one? These are true moral dilemmas — challenging problems that push on our intuitions as lay jurists, forcing us to wrestle with the opposing forces of consequences (saving the lives of many) and rules (killing is wrong).
Based on the responses of thousands of participants to more than 100 dilemmas, we find no difference between men and women, young and old, theistic believers and non-believers, liberals and conservatives. When it comes to judging unfamiliar moral scenarios, your cultural background is virtually irrelevant.
If this code is universal and impartial, then why are there so many moral atrocities in the world? The answer comes from thinking about our emotions, the feelings we recruit to fuel in-group favouritism, out-group hatred and, ultimately, dehumanisation.
...Here lies the answer to understanding the dangers of nurture, of education and partiality. When we fuel in-group biases by elevating and praising members of the group, we often unconsciously, and sometimes consciously, denigrate the "other" by feeding the most nefarious of all emotions, the dragon of disgust.
We label the other (the members of the out-group) with a description that makes them sub-human or even inanimate, often parasitic and vile, and thus disgusting. When disgust is recruited, those in the in-group have only one way out: purge the other.
But there is much room for flexibility:
The good news about the psychology of prejudice, of creating distinctive classes of individuals who are in the tribe and outside of it, is that it is flexible, capable of change and — viewed from an evolutionary perspective — as abstract and content-free as the rules that enter into our moral grammar.
All animals, humans included, have evolved the capacity to create a distinction between members of the in-group and those in the out-group. But the features that are selected are not set in the genome. Rather, they are open to experience.
For example, we know from studies of child development that within the first year of life, babies prefer to look at faces of their own race over faces of a different race, prefer to listen to speakers of their native language over foreigners, and even within their native language prefer to listen to their own dialect. But if babies watch someone of another race speaking their native language, they are much more willing to engage with this person than with someone of the same race speaking a different language.
These social categories are created by experience, and some features are more important than others because they are harder to fake and more indicative of a shared cultural background. But, importantly, they are plastic. Racial discrimination is greatly reduced among children of mixed-race parents. And adults who have dated individuals of another race are also much less prejudiced. On this note, moral education can play a more nurturing role by introducing all children, early in life, to the varieties of religions, political systems, languages, social organisations and races. Exposure to diversity is perhaps our best option for reducing, if not eradicating, strong out-group biases.
But he does back away from a purely biological basis for morality, and for good reasons:
Lest there be any confusion about the claims I am making, I am not saying that our evolved capacity to intuitively judge what is right or wrong is sufficient to live a moral life. It most definitely is not, and for two good reasons.
For one, some of our moral instincts evolved during a period of human history that looked nothing like the situation today. In our distant past, we lived in small groups consisting of highly familiar and often familial individuals, with no formal laws. Today we live in a large and diffuse society, where our decisions have little-to-no impact on most people in our community, but with laws to punish those who deviate from expected norms. Further, we are confronted with unfamiliar moral decisions, concerning stem cells, abortion, organ transplants and life support. When we confront these novel situations, our evolved system is ill-equipped.
The second reason is that living a moral life requires us to be restless with our present moral norms, always challenging us to discover what might and ought to be. And here is where nurture can re-enter the conversation. We need education because we need a world in which people listen to the universal voice of their species, while stopping to wonder whether there are alternatives. And if there are alternatives, we need rational and reasonable people who will be vigilant of partiality and champions of plurality.
Read the full article here.
We do have moral intuitions, and they are usually good ones, but nothing beats a moral argument (philosophy). You might also be interested in the writing of Skyrms (great books) and Haidt (doi: 10.1007/s11211-007-0034-z, and TED 2008 Talk).
You're absolutely correct and I'm familiar with Haidt's work and like his reasoning...
Hauser argues that our moral grammar is inbuilt. Now, that's no doubt true but it also occurs to me that our morality is heavily situational. In other words, the innate moral reaction you get will depend on the state of the mind when you test it. So a fearful person will have a different morality than their secure twin.
This is relevant to his concerns that our morality evolved in a very different environment, and so we have to apply a 'learned' layer on top in order to function in the modern world.
I wonder to what extent that is actually true. We know that the hostile response to out-groups is driven by fear. I think that no amount of philosophizing or education will remove that. But you can change this apparent 'programmed' moral response by reducing the perceived threat.