One question that is pertinent to politics as well as psychology is the nature of moral progress. By moral progress, I mean the process by which individuals end up updating or modifying their basic moral beliefs (or priors). This process is usually a slow one, and at the micro level it involves how one reacts to new evidence or lines of reasoning.
This typically happens when Person A comes across some data or evidence that is in tension with his moral system. For example, A may value a certain principle and then realize one day that some regular action of his violates this principle. Or maybe A values several principles, and new data (or just new reasoning) seems to suggest that in at least some instances these principles are in conflict.
To give a couple of examples:
Time: 1790. Place: America. A values both individual liberty and a harmonious, prosperous society. The issue at hand is slave ownership. A reluctantly accepts slave ownership for the time being because he believes that Blacks are intellectually inferior and would not be able to live in the same land as Whites. Perhaps A supports emancipation in principle but thinks an actual implementation would result in tremendous disorder and a huge decrease in prosperity, and would also require the eventual deportation of all Blacks back to Africa in a painful, costly and disruptive process. But one day his scientist friend shows him evidence that seems to strongly suggest that the inferiority of Blacks is a myth, and that, given proper education, they would be as likely as Whites to succeed in intellectual endeavors.
Or, to give a second example, A is a young present-day European who has a strong moral opposition to hunting for pleasure. He thinks it is wrong and rights-violating. Yet he eats meat. He justifies this by saying that killing for food or to meet some other basic necessity is okay, but killing for pleasure is morally wrong. But one day, after a conversation with a friend, he starts to wonder if his position is morally sound. He realizes that he can get by without eating animals (gaining the needed protein from other sources, such as lentils, milk and soy, as many Asians do), so the main reason he eats meat is the pleasure he gains from it. How, then, is eating meat different from hunting?
And so on…
The interesting question to me is what A does in such a situation. He has several choices:
1. Simple-minded denial: He can just deny that the evidence exists. For instance, the 18th-century American could refuse to believe his scientist friend. He could claim that the facts and the research are false and move on. We seem to see something similar with some (not all) global warming sceptics today.
2. Tweaking: He can decide that, despite the new evidence or argument, he can resolve the tension with minor tweaks. For instance, he comes up with other evidence or arguments to counter the tension. Or he makes minor changes to his priors that make the tension go away, or at least become less pronounced. There are many ways to tweak one's beliefs: some simple, some highly complex; some honest, some not; some based on reason, some based on emotion.
3. Biting the bullet: He can decide that his values are truly in conflict and modify them significantly. The 18th-century American could either give up his belief in liberty or abandon his support for slave ownership. The present-day European could decide that animals don't have rights (and end his moral opposition to hunting) or decide to become a vegetarian. Any of these outcomes is what I'd call significant moral progress. At the individual level, they can be life-changing.
It seems to me that personality plays a complex role in determining which of the above outcomes occurs. As a rule, people have a strong emotional resistance to any sort of change in their moral priors, for that would mean acknowledging to themselves, and perhaps to others, that they have held beliefs or engaged in actions that are false or evil. Some make a conscious attempt to avoid letting emotions take precedence over reason when dealing with such conflicts, while others go with the flow.
Age probably plays an important role in all this; younger people are more likely to change their belief systems. As Fitzgerald once wrote, “At eighteen our convictions are hills from which we look; at forty-five they are caves in which we hide.”
In any case, I don’t have any deep insights to offer, but I think these are interesting questions, and being able to deal with moral dilemmas in an efficient, unbiased and rational manner would certainly improve political outcomes.