Think of someone you have a minor difference of opinion with. These disagreements might be a matter of personal preference, with no clear right or wrong answer. For example, what’s your favourite book? Which Jonas brother is best?
Other disagreements have an obviously wrong answer, such as 1 + 1 = 3, or the choice to put pineapple on pizza.
Whatever the disagreement, we all have them. Overlooking these minor differences is key to forging a successful relationship. We agree to disagree.
However, some disagreements aren’t so easy to overlook. These might be disagreements over things we hold dear, over our deepest beliefs. Often these are disagreements, even conflicts, over our moral values—and it’s a struggle to just agree to disagree.
The opposing values of the “other side” don’t just upset us. They anger us. They might even disgust us. “How could anyone be so immoral?” we ask. “Isn’t the moral side—my side—so obviously in the right?”
Ironically, this idea—that the “correct” or “moral” position is so blatantly clear that the people who disagree with us must just be bad people—is held by people across the religious, political, and social spectrums. Most people think their morality is the right one, and most people also think that the correct morality is obvious.
But clearly we can’t all be right. How do we know who is being moral? And more to the point, how does each person seem to know intuitively what is right and wrong?
For most people, their religion tells them what’s right and wrong. Whether it comes in the form of a holy book, or the idea that God has written what is right or wrong on our hearts, the connection for many people is both clear and decisive. Morality comes from God, case closed.
We may have some trouble relying on this view for settling disagreements about morality, though, as there are a lot of religions in the world with a lot of holy books, and even more interpretations of those books.
“Well, sure, there are many religions,” some may respond. “But mine happens to be the right one.”
Soon after someone makes that claim, everyone else follows suit and we’re right back where we started. Plus, if a god has imbued humanity with morality, then he/she/it seems to have done so with a surprising amount of variation across many cultures… and also appears to have at least partially done the same for several primate species who show signs of having their own moral codes, including altruism, reciprocity, and prosocial concern.
The last three years in particular have brought our political, religious, and moral differences into stark relief. Small disagreements, especially about right and wrong, seem bigger than ever, and an increasing number of families find themselves divided.
With the winter holidays on the horizon and the prospect of once again sitting with family and friends with whom we might not share the same moral values, it seems prudent to take some time to better understand where our moral intuition comes from, why our moral intuitions seem so different from person to person, and maybe even how to disagree in a healthier way. Before we start throwing Christmas decorations at each other.
Where do our moral intuitions come from? As mentioned earlier, many primate species show signs of rudimentary moral systems. They share, help each other, and care for each other when they suffer. More importantly, they also keep track of members of their group who don’t behave in moral ways, and sometimes punish them for their transgressions. We’ve written before about the origins of altruism as a product of evolution.
Simply put, acting in helpful and caring ways towards members of our group helps us all to survive better. Therefore, those who behave in altruistic ways—in moral ways—are more likely to pass on their genes to future generations.
As for why our morals seem so intuitive to us, one theory in particular offers an explanation.
First proposed by social psychologist Dr. Jonathan Haidt, Moral Foundations Theory is built on research showing that human beings rarely stop and think about what is right and wrong; they simply “feel” it. In Haidt’s words, “Intuitions come first, strategic reasoning second.” This means that we feel these moral intuitions automatically, and only try to explain why we feel that way after the fact.
Haidt’s research presented ordinary people with morally ambiguous stories and tested their reactions. Typically, people had an initial reaction as to the morality of the situation and then attempted to create an explanation for this reaction afterwards. When researchers added information to the story to make the test subjects’ justification irrelevant, the subjects tended to maintain their moral judgement, even when they could no longer offer a rational explanation for doing so.
Most significantly, this was true for everyone they tested, no matter what their religious, political, or cultural background. There was no one moral intuitive framework shared by all “good” people.
So why do our moral intuitions seem to differ so greatly across people and cultures? Moral Foundations Theory can account for this as well, as it’s based on evolutionary theory.
According to Haidt’s research, people tended to base their moral intuitions on five main foundations that provided survival benefits: care, which helps us improve our group wellbeing and reduce suffering; fairness, which helps us cooperate for mutual benefit; loyalty, which helps us form groups that can stick together; authority, which helps us create and maintain a group structure; and purity, which helps us avoid disease or other harms.
These foundations are sometimes described differently, and occasionally a sixth foundation—“liberty” or “freedom”—is included, but in essence these foundations run like programs in the back of our minds, automatically assessing situations and informing us whether they are good or bad. Only after this occurs do we stop and try to justify those feelings, to explain them to others and ourselves.
Haidt’s research also showed that we each assign different levels of importance to different foundations when it comes to making moral determinations, and this is informed by our cultures, religions, or worldviews.
For instance, when it comes to politics, researchers have found that people who are politically left-leaning are more likely to base their moral determinations on the foundations of care or fairness; by contrast, right-leaning individuals are more likely to rely on the foundations of authority, loyalty, and purity.
For each group, then, the other side can seem like they are acting “without morals,” because they’re using different foundations to make their moral judgments.
It’s critical to note that none of these foundations is “more right” than the others. They all serve their purpose in different ways. While an emphasis on the moral foundation of purity is theorized to be a source of bigotry in our world, this foundation also helped our ancestors to scrutinize food or other possible sources of illness to ensure better health. This is supported by a recent study showing that individuals with more conservative religious or political views are less likely to buy “misshapen” vegetables at the grocery store, even when those vegetables are discounted. The other foundations have their advantages and drawbacks as well.
By understanding that we all make moral decisions, even when we might not understand what we’re basing them on, we have a much better chance of seeking understanding without judgment. This can form the basis for healthier disagreements, rational discourse about morality, and maybe even the possibility of changing our minds.
We’ll continue this discussion next month, with an examination of how to navigate disagreements with friends and family.