Ever since I listened to Michael Taft interview David McRaney on his Deconstructing Yourself podcast, I've been eagerly anticipating his new book subtitled "The Surprising Science of Belief, Opinion, and Persuasion". Unfortunately, I think coming to McRaney's book via this route may have given me an exaggerated expectation of its profundity, one it wasn't able to live up to. At its deepest level -- the level I took it at in the context of meditation -- McRaney's question is how we form our beliefs about the world, and thus how those beliefs might change. While it was an interesting read, the book stops far short of posing this question in a philosophical way. Instead, McRaney aims at something much more mundane, captured in the fact that it was nominated for a prize in the marketing/communications & sales/influence category of business books. Essentially, it's about how to get someone to agree with you. Yes, reading between the lines, you can see that some of what McRaney has to say is relevant to questioning one's own beliefs. But the vast majority of examples and explanations throughout the book rely on the premise that you are clearly right and they are obviously wrong. Because these wrong folks are unable to see the error of their ways, you have to use the counter-intuitive persuasive techniques that McRaney describes to get them to see the light. Just which techniques these are, and the science behind why they work, are the main meat of the book. So the deeper question of how we know what's 'right' basically just gets bracketed.
It's not entirely fair, though, to judge a book by what it's not. So let me provide a quick summary of what it actually is, because it lays out quite an interesting story. McRaney opens by asking how the notorious 9/11 conspiracy theorist Charlie Veitch could ever have changed his mind and publicly recanted his belief that the attacks were an inside job. How did he come to hold these beliefs despite all sorts of seemingly compelling arguments to the contrary, and then why did he suddenly change them after many years? He proceeds to organize most of the rest of the book as the story of his own quest to understand how this reversal was possible.
His first stop is to speak with David Fleischer about the power of "deep canvassing" to shift people's opinions on the issue of gay rights. The deep canvassers' approach is to go door-to-door engaging people in open conversations about their attitudes regarding gay people. They discover that people form these attitudes based on all sorts of personal experiences they've had or communities they've been a part of, and almost never on any sort of facts. As a result, bombarding them with facts or browbeating them with what you think they should believe does nothing to change their attitude. Instead, any potential change is catalysed mainly by listening to folks talk about the stories and emotions they have surrounding these issues. Not only does this openness make them feel less attacked, but as they explain themselves they are forced to reflect on their own beliefs and experiences in a new way.
McRaney then spends some time discussing scientific studies demonstrating the efficacy of deep canvassing, and delving into others that propose a model of how it works. Unfortunately, his explanations of the science here don't go very deep. We hear a little about "elaboration" and "perspective taking" and "SURFPAD". But McRaney is writing a marketing/communications & sales/influence book here, so we basically just learn that the brain doesn't work as a bottom-up information processing machine, but is in fact largely a top-down information guessing machine whose guesses are based on our past experiences. The predictive processing model of the brain is a profound idea with far-out consequences and implications, but McRaney only pursues it to the level of optical illusions like the dress. His takeaway is just that when the correct interpretation of the world is uncertain, we can be wrong about something yet still feel complete confidence that our perception is accurate. In that case, it becomes impossible to change a mind with 'the facts' because those facts are too ambiguous to be interpreted without some priors. The only way we can overcome this phenomenon is to force people to articulate their priors and see that they could well differ from other folks'.
Naturally then, McRaney's next question is: what types of situations force us to take a step back and call into question the mental model we use to make sense of the world around us? The answer is fairly obvious: when our model fails. We only think when we are forced to. McRaney references, but doesn't really flesh out, two models of how these paradigm shifts can take place: Kuhn's scientific revolutions, and Piaget's shift from assimilation to accommodation. For him, the point of both of these is that when a few things don't fit into our model of the world, or we make a few mistakes based on it, we don't tend to change anything about how we think. It's only past a certain critical level of cognitive dissonance that we consider undertaking the hard work of examining our views.
However, because the subjects of Kuhn's and Piaget's theories (scientists and children, respectively) are presumed to be open to revising their model should its failures become increasingly apparent, this relatively simple explanation of paradigm shift is not going to explain why someone like Charlie Veitch changed his mind. After all, like all conspiracy theorists, Veitch had spent years examining the exact same facts that led the rest of us to conclude that, obviously, 9/11 was a terrorist attack. Instead of being convinced that his theory needed updating, Veitch became convinced that his conspiracy theory was right, and that what counted as a 'fact' was the problem. There's more than one way to reduce cognitive dissonance. So the question now becomes: what situations can induce this cognitive dissonance and cause someone to reevaluate their priors when they deliberately surround themselves with forces that prevent them from ever encountering a fact that causes their model to fail? In other words, how could someone like, say, a sheltered and indoctrinated member of a cult ever change their mind?
To pursue this next thread, McRaney talks to several people who left the Westboro Baptist Church, a famously anti-gay religious cult. He discovers their stories share a key feature. None of these people left the cult because they objected to its views of gay people. They all left because, on the one hand, the politics of the institution resulted in other members treating them inhumanely, and on the other, they happened into interactions with humane and sympathetic outsiders. It was only after they left the cult for these personal or social reasons that they began to reevaluate, and then abandon, their prior convictions about gay people. In short, we mostly believe what those close to us believe, and we don't bother much about whether these beliefs are based on facts so long as we feel taken care of by our tribe.
This turns out to be the key to Charlie Veitch's conversion. It happens that just before recanting his 9/11 theory, he had gotten involved with a hippie New Age group who prized free-spirited inquiry into 'the truth'. In fact, he had even begun dating someone he met in this group. So it was not so much the facts that finally forced him to change his model, but the availability of a new model that corresponded to a new group perspective. Before that point, changing his mind would have meant abandoning his role as a prominent member of the 9/11 conspiracy theory community and striking out into the social wilderness. Afterwards, changing his mind just amounted to shifting from one group to a different, more welcoming, one. The tipping point is not created by an accumulation of facts that contradict a model, but by the contrast between the ways we feel in two different groups. In fact, given this conclusion, it seems inaccurate to suggest that any mental accommodation analogous to Piaget's theory has taken place. Nor has a Kuhnian revolution based on new facts happened in any individual's head. The truth here is closer to Max Planck's observation that "science advances one funeral at a time". One group theory has been replaced by another. The truth is social.
Okay, but then how do groups come to believe what they believe? Well, they argue about it. And the reasons people use in these arguments are not designed for an individual to get at the truth, but to convince a skeptical group. McRaney paints a picture of reasoning, based on the work of Mercier and Sperber, as a mechanism that only makes sense at the group level because it is ultimately more a mechanism for producing consensus than for logically and empirically evaluating facts. This accounts for the fact that we reason much better in a group setting than by ourselves. As a faculty, reason is geared toward forcefully putting forward our own biased perspective while simultaneously challenging someone else's bias. This picture also accounts for the many demonstrated failures of individual reasoning that have become so familiar from the work of Kahneman and other behavioral economists. Our brains effectively outsource the time-consuming process of considering counterexamples and carefully weighing various perspectives to the group interaction. In short, the tool of rationality wasn't designed with any notion of 'truth' in mind, but simply with the goal of holding a group together in performing some action that does not kill everyone. You would think that after this revelation McRaney would need to overhaul his own theory of truth. I mean, "the truth is whatever everybody else thinks is true, subject only to the constraint that it doesn't lead to our mutual extinction," is not a slogan likely to stir the righteous heart. But McRaney is not out to pursue the philosophical implications of the view he has come to. As I observed at the outset, he's not really interested in how we collectively invent this notion of truth, but only in understanding how we can change the minds of those who stubbornly defy that consensus.
Accordingly, the final chapters of the book focus on research into persuasion. His first stop is something called the Elaboration Likelihood Model of persuasion. The idea is to predict what kinds of messages people will find persuasive under what circumstances. Like its cousin, the Heuristic-Systematic Model, the goal is to predict when folks will think deeply and rationally about the content of a message, and when they will be more influenced by the style or circumstances of its presentation. The basic idea McRaney draws from either is pretty commonsensical. You are more likely to engage with a message if it comes from someone you consider trustworthy and reliable, when it comes paired with its likely counterarguments, when it isn't too complicated for you, and when the medium chosen is suitable for the content of the message. Again, despite everything McRaney has said along the way, the unspoken assumption here is that the truth is something we come to after a careful and rational consideration of the facts. And since the only (legitimate) reason to try and change someone's mind is when it has departed from this truth, the goal of persuasion should be to construct a setting and message that gets people to rationally examine the facts, thus overcoming their tendency to lazily believe what they already believe. In other words, it's clear that McRaney is just going to leave several large cans of worms open on the table. What if you just want to convince people to buy your McProduct instead of someone else's nearly identical one? The truth is that the vast majority of the time, our minds are made up to begin with not by facts and rational arguments but by the dark art of advertising's exploitation of our desire for conformity. What if you take seriously the idea that 'truth' is always a socially constructed human phenomenon (one that nevertheless interacts with 'facts' originating outside the human world)? Wouldn't you then have to admit that the persuasive techniques he's advocating are, on a philosophical level, just another form of advertising?
The finale of the book skips over these difficult questions and simply presumes that if you somehow trick your audience into thinking deeply about something, they will inevitably end up agreeing with you, because ... well, obviously you are right! So the final chapters simply review the concrete steps that various groups have found work best to get a hostile audience to at least consider your message. He compares the techniques used by deep canvassing, street epistemology, smart politics, and motivational interviewing. Essentially, these all work the same way. First, give up the idea that you are going to convince someone with the facts. Instead, establish some rapport with them by asking questions, listening to their story, and opening a non-judgmental space. Then, ask them how strongly they feel about the topic at hand. The key thing seems to be to get them to quantify the strength of their beliefs. After that you can ask them how they arrived at that level of belief rather than another, and what might change that level. In other words, you begin to investigate their reasons for their belief and the quality of those reasons. This forces the other person to step back from what they believe and begin to think about why they believe it. This meta-cognition introduces them (at least implicitly) to the central thesis of the book, namely that there are social and psychological forces behind belief that have nothing to do with the content of the beliefs themselves. To maintain the rapport and the non-judgmental atmosphere, you do not try to argue with their reasoning (just as you did not try to argue with the content of their beliefs). You simply show that you are actively listening by repeating what they told you and confirming that you understood it correctly. And that's it. You don't change their mind -- they do.