If you are a reader of this blog, you are familiar with the many posts I have shared about machine learning, causal inference, and the benefits of an education in economics. I have also discussed the important gaps that sometimes exist between theory and application.
In this post I am going to talk about another important gap, this one related to communication: how do we communicate the value of our work to a non-technical audience?
We can learn a lot from formal coursework, especially in good applied programs with great professors. But if we are not careful, we can also pick up mental models and habits of thinking that weigh us down, particularly those of us who end up working in very applied business or policy settings. How we deal with these issues is important to career professionals and critical to anyone involved in science communication more broadly, whether we are trying to influence business decision makers, policy makers, or consumers and voters.
In this post I want to discuss communicating with intent, paradigm gaps, social harassment costs, and mental accounting.
Communicating to Business and Non-Technical Audiences - or - The Laffer Curve for Science Communication
For those who plan to translate their science backgrounds to business audiences (like many data scientists coming from scientific backgrounds), what are some strategies for becoming better science communicators? In their book Championing Science: Communicating Your Ideas to Decision Makers, Roger and Amy Aines offer lots of advice. You can listen to a discussion of some of it on the BioReport podcast here.
Two important themes they discuss are paradigm gaps and intent. Scientists can be extremely efficient communicators through the lens of the paradigms they work in.
As discussed in the podcast, a paradigm is all the knowledge a scientist or economist may have in their head specific to their field of study and research. Unfortunately, there is a huge gap between this paradigm, with its vocabulary, and what non-technical stakeholders can relate to. Scientists have to meet stakeholders where they are, rather than where the audience at a conference or research seminar would be. From experience, different stakeholders and audiences across different industries present different gaps. If you work for a consultancy with external pharma clients, they might have different expectations about statistical rigor than, say, a product manager in a retail setting. Even within the same business or organization, the tactics used to close the gap for one set of stakeholders might not work at all for a new set of stakeholders if you change departments.

In other words, know your audience. What do they want or need or expect? What are their biases? What is their level of analytic or scientific literacy? How risk averse are they? Answers to these questions are a great place to start in terms of filling the paradigm gaps and addressing the second point made in the podcast - speaking with intent.
As discussed in the podcast: "many scientists don't approach conversations or presentations with a real strategic intent in terms of what they are communicating...they don't think in terms of having a message....they need to elevate and think about the point they are trying to make when speaking to decision makers."
As Bryan Caplan states in his book The Myth of the Rational Voter, when it comes to speaking to non-economists and the general public, economists should apply the Laffer curve of learning: "they will retain less if you try to teach them more."
He goes on to discuss that it's not just what we say, but how we position it, especially when dealing with resistance related to misinformation, disinformation, and systemic biases:
"irrationality is not a barrier to persuasion, but an invitation to alternative rhetorical techniques...if beliefs are in part consumed for their direct psychological benefits then to compete in the marketplace of ideas, you need to bundle them with the right emotional content."
In the Science Facts and Fallacies podcast (May 19, 2021) Kevin Folta and Cameron English discuss:
"We spend so much time trying to convince people with scientific principles....it's so important for us to remember what we learn from psychology and sociology (and economics) matters. These are turning out to be the most important sciences in terms of forming a conduit through which good science communication can flow."
Torsten Slok offers great advice in his 2018 discussion with Barry Ritholtz on the Masters in Business podcast about working in the private sector as a PhD economist:
"there is a different sense of urgency and an emphasis on brevity....we offer a service of having a view on what the economy will do what the markets will do - lots of competition for attention...if you write long winded explanations that say that there is a 50/50 chance that something will happen many customers will not find that very helpful."
So there are a lot of great data science and science communicators out there offering great advice. A big problem is that this advice is often not part of the training that many of those with scientific or technical backgrounds receive, and an even bigger problem is that it is often looked down upon and even punished! I'll explain more below.
The Negative Stigma of Science Communication in the Data Science and Scientific Community
One of the most egregious things I see on social media is someone trying their best to help mentor those new to the analytical space (and improve their own communication skills) by sharing a post that attempts to describe some complicated statistical concept in 'layman's' terms - only to be rewarded with harassing and trolling comments. Usually these are about how they didn't capture every nuance of the theory, failed to include a statement about certain critical assumptions, or oversimplified the complex thing they were trying to explain in simple terms to begin with. This kind of negative social harassment seems to be par for the course when attempting to communicate statistics and data science on social media platforms like LinkedIn and Twitter.
Similarly in science communication, academics can be shunned by their peers when attempting to do popular writing or communication for the general public.
In The Stoic Challenge, author William Irvine discusses Daniel Kahneman's challenges with writing a popular book:
"Kahneman was warned that writing a popular book would cause harm to his professional reputation...professors aren't supposed to write books that normal people can understand."
He describes how, when Kahneman's book Thinking, Fast and Slow made the New York Times best-seller list, Kahneman "sheepishly explained to his colleagues that the book's appearance there was a mistake."
In an EconTalk interview with economist Steven Levitt, Russ Roberts asks Levitt about writing his popular book Freakonomics:
"What was the reaction from your colleagues in the profession...You know, I have a similar route. I'm not as successful as you are, but I've popularized a lot of economics...it was considered somewhat untoward to waste your time speaking to a popular audience."
Levitt responded that the reaction was not so bad, but the fact that Russ had to broach the topic at all is evidence of the toxic culture academics face when doing science communication. The negative stigma associated with good science communication is not limited to economics or the social and behavioral sciences.
In his Talking Biotech podcast episode Debunking the Disinformation Dozen, scientist and science communicator Kevin Folta discusses his efforts to face down these toxic elements:
"I have always said that communication is such an important part of what we do as scientists but I have colleagues who say you are wasting your time doing this...Folta why are you wasting your time doing a podcast or writing scientific stuff for the public."
Some of this is just bad behavior, some of it is gatekeeping done in the name of upholding the scientific integrity of a field, some of it is an attempt to prove competence to oneself or others, and maybe some of it is people genuinely trying to provide peer review to colleagues they think have gone astray. But most of it is unhelpful when it comes to influencing decision makers or improving general scientific literacy. It doesn't matter how great the discovery or how impactful the findings; we have all seen from the pandemic that effective science communication is critical for overcoming the effects of misinformation and disinformation. A culture that is toxic toward effective science communication becomes an impediment to science itself and leaves a void waiting to be filled by science deniers, activists, policy makers, decision makers, and special interests.
This can be even more challenging when you add the Dunning-Kruger effect to the equation. Those who know the least may be the most vocal, while scientists and those with expertise sit on the sidelines. As Bryan Caplan states in The Myth of the Rational Voter:
"There are two kinds of errors to avoid. Hubris is one, self-abasement is the other. The first leads experts to overreach themselves; the second leads experts to stand idly by while error reigns."
How Does Culture and Mental Accounting Impact Science Communication?
So, as I've written above, there is a sort of toxic culture in the scientific community that inhibits good science communication. On the Two Psychologists Four Beers podcast, behavioral scientist Nick Hobson makes an interesting comparison between MBAs and scientists:
"as scientists we need to be humble with regards to our data...one thing we are learning from our current woes of replication (the replication crisis) is we know a lot less than we think. This has conditioned us to be more humble....vs. business school people that are trained to be more assertive and confident."
I'd like to propose an analogy based on mental accounting. It seems like when scientists get their degree, it comes with a mental account called scientific credibility. Speaking and writing to a general audience risks taking a charge against that account, and they are trained to be extremely frugal about managing it. Communication becomes an exercise in risk management. If they say or communicate something with the slightest error, missing the slightest nuance, a colleague may call them out. Gotcha! Psychologically, this would mean a huge charge against their 'account' and reputation. It's not quite a career-ending mistake like making a fraudulent claim or faking data, but it's bad enough to be avoided at great cost. MBAs don't have a mental account called scientific credibility. They aren't long on academic credibility, so they don't need to put on the communication hedges the way scientists often do. They come off as better communicators and more confident, while scientists risk being stereotyped as ineffective communicators.
To protect their balance at all costs and avoid social harassment from their peers, economists and scientists may tend to speak with caveats, hedges, and qualifications. This may also mean a delayed response. In many cases, before even thinking about communicating results, they must do in-depth, rigorous analysis, sensitivity checks, etc. It requires doing science, which is by nature slow, while the public wants answers fast. Faster answers might mean less time for analysis, which calls for more caveats. All of this can be detrimental to effective communication with non-technical audiences. Answers become either too slow or too vague to support decision making (recall Torsten Slok's comments above). It gives the impression of a lack of confidence and relevance, and feeds a stereotype that technical people (economists, scientists, data scientists, etc.) fail to offer definitive or practical conclusions. As Bryan Caplan notes, discussing the role of economists in The Myth of the Rational Voter:
"when the media spotlight gives other experts a few seconds to speak their mind, they usually strive to forcefully communicate one or two simplified conclusions....but economists are reluctant to use this strategy. Though the forum demands it they think it unseemly to express a definitive judgement. This is a recipe for being utterly ignored."
Students graduating from economics and science based graduate programs may inherit these mental accounts and learn these 'hedging strategies' from their professors, from the program, and the seminar culture that comes with it.
Again, Nick Hobson offers great insight about how to deal with this kind of mental accounting in his own work:
"what I've wrestled with as I've grown the business is maintaining scientific integrity and the rigor but knowing you have to sacrifice some of it....you have to find and strike a balance between being data driven and humble while also being confident and strategic and cautious about the shortcuts you take."
In Thinking, Fast and Slow, Kahneman argues that new leaders can sometimes produce better results because fresh thinkers can view problems without the mental accounts holding back incumbents. The solution isn't to abandon scientific training and the value it brings to the table in terms of rigor and statistical and causal reasoning. The solution is to learn how to view problems in a way that avoids the kind of mental accounting I have been discussing. This also calls for cultural change in the educational system. As Kevin Folta stated in the Talking Biotech podcast episode mentioned above:
"Until we have a change in how the universities and how the scientific establishment sees these efforts as positive and helpful and counts toward tenure and promotion I don't think you are going to see people jump in on this."
Given that graduate and PhD training may come with such baggage, one alternative may be to develop programs with more balance, like Professional Science Master's degrees, or at least to create courses or certificates focusing on translational knowledge and communication skills. Another is to seek out graduate study under people like Dr. Folta: great scientists and researchers who can also help you overcome the barriers to communicating science effectively. If that is the case, we are going to need more Dr. Foltas.
The Myth of the Rational Voter: Why Democracies Choose Bad Policies. Bryan Caplan. Princeton University Press. 2007.
The Stoic Challenge: A Philosopher's Guide to Becoming Tougher, Calmer, and More Resilient. William Braxton Irvine. W. W. Norton & Co. 2019.
The Analytics Lifecycle Toolkit: A Practical Guide for an Effective Analytics Capability. Gregory S. Nelson. 2018.