There are two features of human nature that are bewildering. One is the fact that in human reasoning, too often "facts are white noise and emotions rule." That is a quote from an adviser to the leaders promoting Britain's exit from the European Union, and it also happens to be a fundamental truth of human nature. One excellent article on this subject, http://westsidetoastmasters.com/resources/laws_persuasion/chap14.html, points out that motivational writer Dale Carnegie wrote: "when dealing with people, remember you are not dealing with creatures of logic, but with creatures of emotion, creatures bristling with prejudice and motivated by pride and vanity." That's why our message has to focus on emotions while maintaining a balance between logic and feelings. Logic and emotion are the two elements that make for perfect persuasion. We can be persuasive using only logic or only emotion, but the effect will be short-term and unbalanced.
Emotions create movement and action. They generate energy during the presentation and get prospects to act on the proposal being presented. Logic and emotion in balance could be called the twin engines of persuasion and influence. Some people need more logic than emotion; others need more emotion than logic. There is a balance. In most situations, people react based on emotion and then justify their actions with logic. We are persuaded by reason, but we are moved by emotion. Several studies conclude that up to 90% of the decisions we make are based upon emotion. The article rightly notes that we use logic to justify actions that are really driven by emotion. Our heads tell us not to believe everything we hear and that politicians are a bunch of liars, but our hearts are won over by their impassioned speeches. That's the power of emotion.
The second, more perplexing feature of human nature is that we continue to believe objectively false things even after the truth is clear. The New York Times published an article on this phenomenon on March 22 of this year: https://www.nytimes.com/2017/03/22/upshot/why-objectively-false-things-continue-to-be-believed.html?_r=0 It points out the saying that "everyone is entitled to his own opinion, but not his own facts." It turns out, however, that we do cling to "our own facts" in spite of the truth. The political events involving Congress and the president have sharply illustrated that the truth doesn't always count in our minds. Even after Mr. Trump's claims that former president Obama had wiretapped him during the campaign were totally debunked, supporters continued to believe them. Republican senators who blocked President Obama's nominee to the US Supreme Court for many months out of political self-interest are now totally comfortable expressing hypocritical outrage that Democrats are unfair in their opposition to Mr. Trump's nominee. They are self-righteous in their belief that there is no connection between what they did and the Democrats' position.
Mr. Trump, in a unique manner, has no reluctance to continually offer “alternative truths” in an appeal to his supporters. In that regard, the article claims that:
“Mr. Trump, perhaps unconsciously, has grasped a core truth of modern politics: that voters tend to seek out information that fits the story they want to believe, usually one in which members of the other party are the bad guys.”
It appears that even when our false myths are dispelled and debunked, their effects linger. For example, Mr. Trump insisted over a period of months that Mr. Obama had not been born in the United States, but conceded in 2016 that in fact he had been. In a recent poll, 43% of registered voters still believed Mr. Obama had not been born in the United States.
There is an excellent article from the Boston Globe, http://archive.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/?page=1, entitled "How Facts Backfire" by Joe Keohane. What follows consists almost entirely of quotes from his excellent article, presented without quotation marks but with full credit to him. I recommend the article for a fuller understanding. He starts out with the fact that:
"In the end, truth doesn't win out." Studies at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, the researchers found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
One reason is that research has found "that it's absolutely threatening to admit you're wrong." The phenomenon, known as "backfire," is "a natural defense mechanism to avoid that cognitive dissonance," according to the Michigan research. People aren't blank slates. They already have beliefs and a set of facts lodged in their minds. They have existing values and biases that influence their idea of truth and reality. These can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we're right and even less likely to listen to any new information.
There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and to actively dismiss information that doesn't. This is known as "motivated reasoning." Whether or not the confirming information is accurate, we may accept it as fact and as confirmation of our beliefs. This makes us more confident in those beliefs, and even less likely to entertain facts that contradict them.
And if you harbor the notion, popular on both sides of the aisle, that the solution is more education and a higher level of political sophistication in voters overall, well, that's a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they're totally wrong.
Now apply these two concepts about truth and facts to your jurors. Our job is to present our cases in a way that is consistent with jurors' primary existing beliefs, or to show that our cases are an exception to which those beliefs don't apply. We can learn how best to go about this through well-conducted focus studies of various kinds and through our ability to create accurate juror profiles. The more we know about how humans arrive at decisions, and the more willing we are to abandon false ideas about human nature, the better our chance of success.