What Brain Science Reveals about Ethical Decline and Moral Growth

NEWS | 14 October 2025

It started with an innocent mistake. Texas entrepreneur Chris Bentley had founded a company to buy drilling rights for oil and gas. He realized that a batch of letters he’d sent to landowners, offering to lease their rights, had incorrect information, including monetary amounts and other details.
But instead of correcting the errors, Bentley doubled down, not wanting to admit his mistake. When the letters failed to secure enough land leases to generate big profits, Bentley tried to make up the difference by sinking his investors’ money into new, risky deals, some of which faltered and drained the coffers of his company, Bellatorum Resources. Then, as the company’s cash flow dried up, Bentley started putting bogus transactions on the books to keep his employees paid. He didn’t stop until he’d committed $40 million worth of fraud. “I basically did the age-old ‘rob Peter to pay Paul,’” says Bentley, who was recently released from prison into home confinement. “Everything started going downhill.”
Moral death spirals such as Bentley’s happen in every sphere of public life, from business to local government to the highest levels of political leadership. The deterioration often begins with a small dishonest act—such as Bentley’s decision to bluff his way through what had been an honest error—and mounts until it reaches a point of no return. Some escalating crimes are financial; others progress toward human rights violations or worse.
Brain and psychology researchers are delving into how slides down the moral slope begin and what keeps them going. Initially we may be horrified at the thought of lying, cheating or hurting someone. But as we engage in wrongdoing over and over, our brains tend to grow numb to it. It’s harder to embezzle or kill for the first time than it is for the tenth.
Yet moral snowballing can also happen in the opposite direction. Surprisingly, just as neural habituation can drive ethical collapse, it can also drive escalating spirals of virtue, in which one honest or brave action makes the next one easier to carry out. And because our brains adapt to repeated behaviors, movement in a given moral direction can persist—making it all the more critical to pinpoint where and how that movement begins.
Carrying out acts of moral courage, such as dodging roadside bombs to get supplies to Iraqi civilians while in the U.S. Army, helped former Capitol police officer Aquilino Gonell stand strong during the January 6 insurrection. That harrowing experience, which left him with severe injuries, also gave him the resolve to speak out about what rioters had done to him and others, although he knew telling the truth could put him in extremists’ crosshairs. “I couldn’t live with myself remaining silent,” he told me.
It’s easier for people to act morally when they embrace bottom-line values that they’ll uphold no matter what. Gonell’s grandfather would remind him, “Never tell lies”—a principle that stayed with him. And once people choose to follow their conscience, they often find that the emotional rewards outweigh the hazards. Those who listen to their better angels not only escape the self-reproach that comes from failing to do what they feel is right; they may even find deep purpose and joy in aligning their actions with their value system—as Gonell did in speaking out. “The more I did it, I used that as mental health therapy,” he says. “I can live with myself knowing that I have met the moment in time and now history.”
When we first become aware of an ethical violation—say, a co-worker’s embezzlement scheme or a secret inner desire to hurt someone—we’re hardwired to react much as we might to a steaming cow pie. In a 2020 study by researchers in Switzerland, people who’d just thought about an ethically thorny situation reacted more intensely to rank smells than control participants did. Areas of their brains that processed physical disgust, such as the anterior insula, were also more active, hinting that the moral violation hit them like a whiff of manure. “Disgust and moral disgust are uniquely connected,” says neuroscientist Gil Sharvit, the study’s lead author.
Still, if neuroscientists untangling the complex processes that govern moral decisions have reached any overriding conclusion, it is that no single brain circuit dominates such choices. In scans using functional magnetic resonance imaging (fMRI), a wide network of brain areas activate when people reach major decision points, reflecting the broad range of social, emotional and instinctual factors that weigh into each moral choice.
Along with the automatic recoil, the brain’s fear-processing amygdala activates as people consider the risks they run by doing something wrong. Contemplating an ethical stand may also evoke fear—of retribution. As reflection continues, however, moral decision-making evolves into an inner debate in which logic tempers the quick initial responses. Multiple areas of the brain, including the prefrontal cortex, a general decision center, help to regulate instinctive reactions such as fear and disgust, putting them into a larger context. The anterior cingulate cortex, along with the anterior insula and nucleus accumbens, assesses the net reward or penalty a morally fraught decision will incur and manages emotions tied to the decision, making it feel more palatable—or not. This synthesis can make the way forward seem clearer.
Although these basic neural networks are similar from person to person, factors such as a person’s history, what feels rewarding to them and what’s happening around them can profoundly alter their mental processing and subsequent moral response. When Bentley reflects on what kicked off his bogus transaction scheme, he keeps coming back to his ravenous appetite for risk—one he honed during his service with the U.S. Marines in Afghanistan, where Marines followed the creed of “improvise, adapt and overcome” at any cost.
In the field, Bentley was responsible for getting necessary battle supplies to teams in far-flung locations. He once went off script by enlisting a team of Afghan interpreters to drive out in a pickup truck to drop off gear for U.S. soldiers. “If they would’ve stolen it and never come back, which I trusted them not to do, it would’ve been my ass,” Bentley says. “But I saw it as the only option to get the teams what they needed.” The mission’s success cemented Bentley’s belief that audacious risks confer outsize rewards.
When people develop a slot puller’s zest for risk and personal gain, that acquired swagger affects not just what they’re willing to put on the line but what kinds of moral choices they make. In a study published in 2024 by researchers in India, people who’d grown used to risky gambling games proved more willing to make moral choices others might find loathsome, such as (theoretically!) pushing one person in front of a speeding trolley to save others. This result showed that instead of relying on absolute moral rules to guide their behavior, such as “Never actively kill someone,” risk-tolerant gamblers tended to make moral decisions based on more utilitarian cost-benefit calculations. What Bentley hoped to gain through his scheme at Bellatorum—recognition, profits, a chance to give other veterans opportunities—loomed larger in his mind than any absolute moral value.
As he weighed whether to go ethically rogue, Bentley says, he also felt under the gun. However people might describe their highest values in moments of calm, those values are prone to precipitous collapse under pressure. As a scrappy small-business owner, Bentley felt immense pressure to deliver on his clients’ expectations, and he didn’t see any room for error. “My fund didn’t allow for losses,” he says. “We literally had a zero-mistake structure.” He and his team had worked late nights for a week to prepare and deliver 5,000 offer letters to landowners, and when he discovered those letters were defective, he was so horrified at the thought of backpedaling that he scrambled to cover up his mistake. “I was definitely in panic mode,” he says.
“When we feel afraid, our bodies are thinking we’re in a life-death situation,” says ethical consultant Brooke Deterline, founder of the Courageous Leadership consulting firm. In this frenzied state, the body floods with stress hormones such as cortisol, which are known to interfere with higher cognitive functioning. Cognitive shutdown may help explain why people who are told to hurry because they’re running late, for instance, assist those in need less often than those who aren’t feeling pressured. The Socratic axiom “To know the good is to do the good” can break down in the heat of the moment.
At least initially, when people lie, steal or hurt someone, they often seethe with self-disgust. The cow-pie stench is coming from inside the house, and its presence is intolerable. The first time that former WorldCom employee Betty Vinson made a multimillion-dollar accounting adjustment to inflate the company’s profits, she felt such dread that she approached her bosses and told them she was resigning.
But just as we adapt to lingering stenches, we seem to adjust to initial wrongdoing in ways that prompt us to go further. In an Arizona State University experiment, 73 college students solved math problems and could earn a small amount of cash for each correct answer, but they also had chances to take more than they’d earned from an envelope. When the opportunities to steal started off small (just a few cents) and grew ever larger, twice as many people stole from the envelope as in a comparison condition where cheating yielded the same amount on every turn.
Organizational psychologist David Welsh, the paper’s lead author, wasn’t surprised by the results. He’d done the study in part because he couldn’t get Stanley Milgram’s work out of his mind. In that classic experiment, participants dubbed “teachers” were told to give electric shocks to “learners” who answered questions wrong.
Milgram’s most talked-about finding was how often people obeyed corrupt orders. But what struck Welsh was the moral habituation that appeared to be taking place. “They started out instructing the participants to deliver these very small shocks,” he says, “and then the shocks got larger and larger.” If “teachers” expressed doubt about what they were doing, experiment leaders urged them to continue with phrases such as “You have no other choice; you must go on.” With such moral coercion easing their complicity, people who’d never have dreamed of zapping anyone with 450 volts became all too willing to comply when they worked up to that number gradually.
Gradual moral adaptation occurred even in the lower-stakes scenario Welsh set up, where only cash, not people’s health, was at stake. An initial, small transgression seemed to embolden participants to commit a bigger one the next time. As soon as people start telling themselves it’s not a big deal to massage the numbers on their balance sheet or to take credit for someone else’s work, conditions are ripe for a slippery-slope moral descent, Welsh says. “Once they’re in that mindset of rationalizing their bad behavior, it becomes that much easier to do it again and again and again.”
Researchers at University College London have described one biological basis for this habituation. While in an fMRI scanner, study participants played a game in which they could enrich themselves by deceiving others. The more people lied to other players, the more exaggerated their lies were likely to be the next time around. These habitual liars also showed reduced activation in the brain’s amygdala, which is involved in emotional arousal—and the lower their amygdala activation, the more flagrant their lies were in the next round of the game. The researchers believe gradual neural adaptation is at play: the more times people lie, the less emotionally distressing lying feels, which allows for increasing comfort in dangerous moral waters.
Vinson fell prey to this effect as she got drawn into WorldCom’s multibillion-dollar corruption scheme. Although she wanted to resign after her first fraudulent transaction, her boss talked her out of it, telling her she wouldn’t be asked to do anything else untoward. So she stayed on, and when executives asked her to perform another bogus transaction, she debated leaving again but decided not to. Soon, Vinson’s transactions became regular quarterly tasks, as routine as starting the coffee maker, even though they were staggering in size—up to $941 million.
What might have eased Vinson’s adjustment to grand-scale fraud was the number of people around her who seemed to be fine with it. Peer pressure warps reasoning skills in predictable ways. In psychologist Solomon Asch’s classic experiments, some participants consistently reported that two lines on a card were the same length when others in the room insisted this was the case. It didn’t seem to matter that one line was clearly longer than the other.
In some groups, threats from the top amplify members’ willingness to abandon their values. The energy company Enron dismissed employees who were exposing or questioning its suspect financial practices. Once this corrupt conformity takes hold, those who state the truth become outliers, as superfluous as runts of the litter—and as vulnerable to being left behind.
As Bellatorum’s CEO, Bentley never felt anyone was forcing him into an ethical corner. And although his fraudulent transactions became routine, he says he never really grew numb to what he was doing. “I was personally deteriorating,” he says. “I was drinking so much to self-medicate for living a lie.” What stopped Bentley from admitting his crimes—which, on one level, he desperately wanted to do—was that he’d convinced himself his wholesale fraud was the lesser of two evils. The way he saw it, his choices were these: confess and shut Bellatorum down, devastating employees and investors who’d trusted him, or continue his money-funneling scheme so he could write paychecks to his employees, many of whom were retired combat veterans.
“I couldn’t bring myself to just shut down a business and let it fail after I had brought in so many people from around the country,” he says. Lose-lose choices like this can prompt intense distress and inner wrestling. In a 2016 study led by Natalie Claes, then at the University of Leuven, participants deciding between two bad options took longer to choose than those who had at least one good option, and they also reported feeling more fear during the process.
Physician Catherine Caldicott, who runs medical training programs in Florida, often encounters doctors caught in “lesser of two evils” binds. If they’re asked to list past criminal convictions when applying for or renewing a license to practice, they may tell themselves that lying is better than getting their application denied and being unable to help patients. When people reframe immoral or complicit acts as noble, they’re prone to go down the moral slippery slope, in part because they’ve locked onto the narrow idea that they can contribute more by going against broader values and professional principles. “They do not realize that there may be other choices available or more morally defensible ways forward,” Caldicott says. “Their ability to think rationally is impeded.”
Although initial wrongdoing can escalate over time, the converse is also true. When people respond bravely in fraught situations, courage becomes progressively easier as the brain continues to adapt to rising discomfort.
A study by researchers in Israel demonstrated this adaptation in a dramatic way. Members of the study’s experimental group, all of whom were afraid of snakes, entered an MRI scanner room where a five-foot-long corn snake was curled up just outside the scanner on a platform on a conveyor belt. Researchers told them their job was to get as close as possible to the snake and to overcome, as best they could, any fear they might feel.
Participants had access to control buttons in the scanner that they could use to inch the snake on the conveyor belt either closer to them or farther away, and in each round of the experiment, they chose one of these two options. When they opted to bring the snake closer, something remarkable happened: They showed more activity in a prefrontal cortex region called the subgenual anterior cingulate cortex, which is involved in regulating emotions, as well as the right temporal pole, which helps to shape behavioral responses. At the same time, activity in the amygdala, which processes fear and threat, diminished.
In short, it appeared that when people decided to bring the snake closer, their brain kept enough of a lid on the fear response to allow them to carry out their plan. Once they adjusted to the new situation, many felt bold enough to continue approaching the snake.
Well-established neuroplasticity findings suggest that small acts of moral courage can similarly beget acts of greater courage. “We can choose to bring the snake in a little bit closer,” says clinical psychiatrist Christian Heim, who is affiliated with the University of Queensland. “Or we can choose to say, ‘No, that’s it. That’s all I’m capable of. I’m going to push it away.’”
Former Capitol police officer Gonell has gotten comfortable bringing the snake closer in. At age 12, he immigrated to the U.S. from the Dominican Republic, and when he returned to his home country for visits, his grandfather Fillo would remind him to live his life with integrity.
Still, Gonell sometimes hesitated to act on his values. Conscious that his accent marked him as an outsider in his Brooklyn neighborhood, he was wary of making waves. But when the U.S. Army later shipped him to the Middle East for Operation Iraqi Freedom, he put his developing courage to the test, volunteering to drive supplies to Iraqi schools and U.S. troops despite the constant threat of roadside bombs. He received military honors for his bravery, including the National Defense Medal.
Serving as a Capitol police officer on January 6, 2021, brought Gonell to a key decision point. Defending the building in a gas mask and riot gear, Gonell battled dozens of insurgents and sustained multiple injuries, including chemical burns and a smashed foot that required surgery to repair. As he recovered, many people—and even some members of Congress—started spreading misinformation about what had happened at the Capitol that January day. Some said the incursion had been an antifa-led protest, and others insisted the insurrectionists had been peaceful.
Following his grandfather’s dictum, Gonell resolved to set the record straight. “This is something in our history that shouldn’t be kept quiet,” he says. He agreed to talk to CNN about what he had seen and heard on January 6: who he had encountered, what they had done to him and other officers. He was afraid of how people watching on TV, especially riot supporters, would respond, but he went through with the interview anyway.
That first appearance led to a series of other public engagements, including testifying before Congress. Each time Gonell told the truth openly, doing so felt a little bit easier, despite the danger he knew he could face. For the most part, he says, his experience speaking up has been positive: “I could look at myself in the mirror and look at my son and say, ‘Hey, I did the right thing.’”
Compared with Bentley’s actions, Gonell’s might seem to exist in a separate moral universe. Yet from a neural standpoint, moral deterioration and moral escalation are like trains running on parallel tracks in opposite directions. Similar neural structures of reward and habituation underlie them both. And just as similar brain processes evoke moral and physical disgust, related neural pathways evaluate both morality and beauty. The same brain region—the medial orbitofrontal cortex, which processes reward—evaluates both the attractiveness of a face and the virtue of a planned action. It’s no surprise, then, that moral ventures can be gratifying in much the same way as creating a work of art. People who are more moral, as judged by their peers, also have an enhanced sense of well-being, according to a cross-cultural study published earlier this year.
Further, people adapt to the behaviors they carry out frequently, which may make more extreme versions of those behaviors more likely. They also tend to repeat behaviors that they feel benefit them, whether these rewards are external (staving off financial collapse) or internal (the satisfaction of speaking truth to power).
Our brain’s propensity for habituation implies that the early stages of a moral trajectory may be the most crucial. “All the neural networks that we have are changeable,” Heim says. “If we use [them], they become stronger. If we don’t use them, they become weaker.” Once people understand how the brain gets accustomed to repeated behaviors, they can exercise more choice at the outset, asking themselves what kinds of actions they want to get comfortable with, what kind of beauty or integrity they want to strive for. Although the amygdala will almost certainly emit fear signals in situations that call for courage, what’s important is suppressing those signals enough to make virtuous action possible—and appreciating the inherent rewards of doing so.
Heim tries to encourage such habituation in his psychiatric practice. Because integrity can support mental well-being, he sometimes gives clients homework assignments such as telling a work supervisor they feel uncomfortable with a particular task. Heim’s objective is to help clients hold their own moral line, so he’s careful not to make these assignments too difficult. His clinical experience suggests that when people demonstrate to themselves that they can act courageously, they reinforce mental pathways that help them build positive momentum and avoid moral collapse.
Self-reflection can play an important role in shifting the brain’s reward calculus and, by extension, help people make ethical decisions. In a 2023 study of moral judgment carried out in China, participants received eight weeks of mindfulness training, including meditation. Compared with a control group, those who received the training were less motivated to earn money if doing so would harm others. That altered preference showed up in their behavior. Those in the training group were not as open to giving someone an electric shock in exchange for cash, whereas control group members grew more inclined to deliver the shock over time.
Mindfulness practices may affect moral judgment in part because they promote a more objective outlook. It’s often easier for practitioners to take someone else’s view of a situation, which compels them to steer clear of harming others. Through skillful perspective taking, “I think we can always save ourselves,” Sharvit observes: identifying with others helps people guard against moral numbness and the negative spiraling that follows. “You won’t get habituated,” he adds. “You can connect.”
At an institutional level, one way to ward off downward moral slides might be to increase the penalties tied to each stage of moral descent—say, by announcing zero-tolerance antifraud company policies—and to underscore the rewards of holding the moral line. Leaders of organizations can, for instance, swiftly address transgressions and help employees get comfortable with admitting mistakes. In a Maastricht University study, participants whose bosses showed ethical leadership engaged in fewer corrupt acts such as offering bribes. Generally speaking, fraud and cover-ups seem less enticing in ethical workplace cultures, and telling the truth feels like an obligation, not an act of career sabotage.
Once people decide to act with integrity, their resolve is often socially contagious. When researchers told enrollees in the Milgram experiments to shock “learners” for answering questions wrong, people who saw others refusing to administer shocks were more apt to refuse as well. And researchers at Eastern Michigan University and elsewhere report that in work groups where members openly endorse ideals such as honesty and fairness, individual employees are often more likely to speak up about moral violations they see, perhaps as the result of virtuous peer pressure.
Had Bentley sought to modulate his own reward calculus before starting Bellatorum, he most likely never would have gotten in as deep as he did. He now says that, despite his fear, he should have admitted his mistake the moment his incorrect offer letters went out to landowners. That would have dinged the company’s reputation, but Bentley thinks that at that early stage, he could have bounced back. “I could’ve downsized to a very small crew and probably stayed in business,” he says. “Now I’m betting I’ve burned the bridges beyond all repair.”
Bentley also suspects that an unbending set of “flat-ass rules”—a term he borrowed from Operation Iraqi Freedom general James Mattis—could have saved him from becoming a stranger to himself, and research bears out his hunch. The stronger people’s advance intentions are to engage in certain types of behavior, a University of Sheffield meta-analysis shows, the more apt they are to follow through in real life.
Psychologists such as Zeno Franco of the Medical College of Wisconsin suggest cultivating what he calls the “heroic imagination”: our individual capacity to consider ahead of time what we’ll do in situations that call for moral courage, what values we will stand behind even under extreme pressure. In this kind of “What would I do?” scenario, the brain’s frontal cortex helps people anticipate how they will feel when they make certain moral choices, and those predicted feelings can influence their decisions in the long run.
When he started down the moral slope, Bentley did not know that living a lie would end up eating away at him like acid. “I would be driving over one of the high on-ramps that are so common in Houston and just think that I could drive my truck over the side,” he says. Finally, able to bear the guilt no longer, he turned himself in to federal officials in April 2021.
As he nears the end of his five-year sentence, Bentley still hopes to bend his arc toward redemption. He has written a memoir that frames his moral decline as a cautionary tale and shows how turning away from the truth led him to hunger more after that elusive ideal. “Never compromise your integrity for anything,” he now tells others, “not even when you think it’s essential to your survival.”
As for Gonell, he continues to speak and write about what happened at the Capitol on January 6, as well as about what he sees as ongoing threats to the rule of law in the U.S. He still receives threats from the public but remains undaunted. “What else you got? I’ve gone through war, I’ve been back, I’ve been injured, I’ve been ridiculed,” he says. “I’m not concerned about my life, even now, when some people say, ‘Hey, you should be careful.’”
Having considered how far he would go to ensure that truth prevails, Gonell has decided there’s basically no limit because the principle matters more to him than his own safety. Thirteenth-century theologian Thomas Aquinas saw integrity as synonymous with beauty that transcends outward appearances, and striving toward such a moral ideal gives people a profound sense of meaning in life. For Gonell, as for others on a similar path, the inner rewards of integrity more than outweigh the costs.

Authors: Madhusree Mukerjee and Elizabeth Svoboda