Why Do People Commit to Behavior That Consistently Lacks Moral or Ethical Principles?
Charlie Anderson, Prevent Disease
Waking Times
Call it what you like… a thirst for control, money, power, greed, lust, or simply a lack of conscience. Does good behavior lead to more good behavior, and, by the same token, does bad behavior lead to more of the bad? The answer depends on our ethical mindset, according to new research published in Psychological Science.
What makes a person feel justified in taking advantage of another? Psychological scientist Gert Cornelissen of the Universitat Pompeu Fabra and colleagues found that people with an “ends justify the means” mindset are more likely to balance their good and bad deeds, while those who believe that right and wrong are a matter of principle are more likely to be consistent in their behavior, even if that behavior is bad.
Previous research from the Stanford Graduate School of Business found that giving someone a sense of power instills a black-and-white sense of right and wrong (especially wrong). Once armed with this moral clarity, powerful people perceive wrongdoing with much less ambiguity than people lacking that power, and punish apparent wrongdoers more severely than people without power would.
Existing research is mixed when it comes to explaining how previous behavior affects our current moral conduct.
Some researchers find evidence for moral balancing, suggesting that we hover around a moral setpoint. Going over that setpoint by doing a good deed gives us license to engage in more self-interested, immoral, or antisocial behavior. When our moral self-image falls below that setpoint, however, we feel ill at ease and try to compensate by engaging in positive behavior.
What decides whether you will sell out your personal values to the highest bidder? Arguably, there would be no corruption in the world if people simply refused to sacrifice their value systems for monetary compensation. So why does it happen?
A neuro-imaging study shows that personal values that people refuse to disavow, even when offered cash to do so, are processed differently in the brain than those values that are willingly sold.
The brain imaging data showed a strong correlation between sacred values and activation of the neural systems associated with evaluating rights and wrongs (the left temporoparietal junction) and semantic rule retrieval (the left ventrolateral prefrontal cortex), but not with systems associated with reward.
Investigators have previously found that particular emotional centers in the brain light up when the dilemmas involve people in clear and present danger. That brain activity was diminished in moral decisions that did not involve “up close and personal” harm to others, such as deciding whether to keep money found in a lost wallet.
Other researchers have argued for behavioral consistency, suggesting that engaging in an ethical or unethical act leads to more of the same behavior.
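To make the contrast between these two accounts concrete, here is a purely illustrative toy sketch in Python (it is not drawn from the study itself): a “balancing” rule that hovers around a hypothetical moral setpoint, and a “consistency” rule that simply repeats the previous act. The setpoint value, step sizes, and function names are all invented for illustration.

```python
# Toy sketch (illustration only, not from the study): two update rules for how
# the next deed might follow from past deeds. "Balancing" hovers around a moral
# setpoint (outcome-based mindset); "consistency" repeats the previous act
# (rule-based mindset). All values and step sizes are arbitrary.

SETPOINT = 0.5  # hypothetical moral self-image a "balancer" tries to stay near


def next_act_balancing(self_image):
    """Above the setpoint we feel licensed to slip; below it we compensate."""
    return "bad" if self_image > SETPOINT else "good"


def next_act_consistency(last_act):
    """A rule-based mindset tends to repeat whatever it did last time."""
    return last_act


def simulate(steps=8):
    self_image, last_act = 0.7, "good"  # start slightly above the setpoint
    for step in range(steps):
        balancer = next_act_balancing(self_image)
        consistent = next_act_consistency(last_act)
        print(f"step {step}: self-image={self_image:.2f}  "
              f"balancer->{balancer}  consistent->{consistent}")
        # a good deed nudges the self-image up, a bad one nudges it down
        self_image += 0.2 if balancer == "good" else -0.2
        last_act = consistent


if __name__ == "__main__":
    simulate()
```

Running the sketch, the “balancer” oscillates between good and bad deeds around its setpoint, while the “consistent” agent keeps repeating its initial behavior, which is the basic contrast the studies set out to explain.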
Cornelissen and colleagues explored what facilitates either phenomenon in a series of three studies.
The results from all three studies showed that participants’ dominant ethical mindset, in combination with their previous behavior, influenced their behavior in the lab.
When given a pot of money to divide, people with an outcome-based mindset allocated fewer coins to their partners after recalling recent ethical behavior. They were also more likely to cheat when given the opportunity to self-report the number of test items they answered correctly. These results suggest that they felt licensed to engage in “bad” behavior after thinking about their good deeds.
People who had a rule-based mindset, on the other hand, gave more coins to their partner and were less likely to cheat after recalling an ethical act, indicating that they were trying to be consistent with their previous behavior.
The relationship seems to be driven, at least in part, by the fact that people with an outcome-based mindset are attending to their moral self-image, or the discrepancy between the self they perceive and the self they aspire to be.
The theoretical framework explored in these studies — integrating ethical mindsets and moral dynamics — helps to reconcile seemingly conflicting strands of research.
Cornelissen and colleagues believe that this research deals with a fundamental mechanism that could help us to understand patterns of moral behavior for people in any kind of role, such as consumers, managers, employees, neighbors, or citizens.
It may also help to explain cases in which individuals are consistently unethical.
Making moral judgments requires at least two processes: a logical assessment of the intention and an emotional reaction to it. Once rules are established, violating them tends to be judged as clearly wrong and to provoke negative emotions.
“In the current studies, we showed that a rule-based mindset can lead to a consistent pattern of unethical behavior, in which violating a rule becomes the norm. Such a pattern resembles the slippery slope of moral decision making,” write Cornelissen and colleagues.
According to the researchers, further research may help us better understand the mechanisms that underlie this behavior and find ways to keep individuals from sliding down that slippery slope.
About the Author
Charlie Anderson holds degrees in Psychology and Public Health Sciences. He currently consults for various institutions and agencies in Canada and the United States.