Author Topic: How your brain makes moral judgments

Oceander

  • Guest
How your brain makes moral judgments
« on: March 29, 2014, 05:52:12 am »
How your brain makes moral judgments

By Elizabeth Landau, CNN
updated 8:04 AM EDT, Thu March 27, 2014

(CNN) -- Imagine a CEO who backs a venture because it will turn a profit, even though it will also emit pollution that is toxic to the environment. She doesn't care about the harm; her only goal is profit.

Is the CEO intentionally harming the environment? What if, instead, the CEO is pushing a project that happens to help the environment -- is the benefit any more or less intentional than the harm in the other scenario? How do you morally judge each of these situations?

Science is still trying to work out how exactly we reason through moral problems such as these, and how we judge others on the morality of their actions, said Walter Sinnott-Armstrong, professor of practical ethics at Duke University.

Researchers interested in the neuroscience of morality are investigating which brain networks are involved in such decisions, and what might account for people's individual differences in judgments. Studies on the topic often involve small samples of people -- functional magnetic resonance imaging is time-intensive and expensive -- but patterns are emerging as more results come in.

"It's a field that's waiting for a big revolution sometime soon," Sinnott-Armstrong said.

A moral network?

Scientists have shown that there is a specific network of brain regions involved in mediating moral judgment. An influential study on this topic, published in 2001, was led by Joshua D. Greene, associate professor at Harvard University and author of "Moral Tribes: Emotion, Reason, and the Gap Between Us and Them."

Adrian Raine and Yaling Yang, in a 2006 review article, described this study as a breakthrough. It focused "on the specific difference between making judgments (i.e. 'appropriate' or 'inappropriate') on 'moral personal' dilemmas (e.g. throwing a person out of a sinking life-boat to save others), and 'moral impersonal' dilemmas (e.g. keeping money found in a lost wallet)," they wrote.

Greene's study suggested that three brain structures -- the medial prefrontal cortex, the posterior cingulate, and the angular gyrus on the left and right sides -- "play a central role in the emotional processes that influence personal moral decision-making," Raine and Yang wrote.

Other studies have since confirmed that these areas are important in processing information about moral decisions, as well as an area called the ventral prefrontal cortex.

Several researchers have additionally suggested that the brain areas involved in moral judgment overlap with what is called the "default mode network," which is involved in our "baseline" state of being awake but at rest. The network is also active during "internally focused tasks including autobiographical memory retrieval, envisioning the future, and conceiving the perspectives of others," Randy Buckner and colleagues wrote in a 2008 study.

To further understand which brain networks are essential for moral judgment, scientists study people whose behavior suggests that their relevant neural circuitry may be damaged.

What goes wrong in psychopaths


Psychopaths, particularly those who are also convicted criminals, have been the subject of much interest among scientists exploring moral judgment.

"They're not scared of punishment, they don't feel empathy towards other people, they don't respect authorities that told them not to do things, and so there's nothing stopping them from doing what other people would dismiss in a nanosecond," Sinnott-Armstrong said.

Raine and Yang suggest, based on the research they reviewed, that "antisocial groups" such as psychopaths may know what is moral, but they may lack a feeling of what is moral.


Read more ...