Wednesday, July 7, 2021

R.A. Fisher, Big Data, and Pretended Knowledge

In Thinking, Fast and Slow, Kahneman points out that what often matters more than the quality of the evidence is the coherence of the story. In business and medicine, he notes, this kind of 'pretended' knowledge based on coherence is often sought and preferred. We all know that no matter how great the analysis, if we can't explain and communicate the results with influence, our findings may go unappreciated. But as we have learned from misinformation and disinformation about everything from vaccines to GMOs, Kahneman's insight is a double-edged sword. Coherent stories often win out over solid evidence and lead to the wrong decisions. We see this not only in science and politics, but also in business.

In the book The Lady Tasting Tea by David Salsburg, we learn that R.A. Fisher was all too familiar with the pitfalls of attempting to innovate based on pretended knowledge and big data (excerpts):

"The Rothamsted Agricultural Experiment Station, where Fisher worked during the early years of the 20th century, had been experimenting with different fertilizer components for almost 90 years before he arrived...for 90 years the station ran experiments testing different combinations of mineral salts and different strains of wheat, rye, barley, and potatoes. This had created a huge storehouse of data, exact daily records of rainfall and temperature, weekly records of fertilizer dressings and measures of soil, and annual records of harvests - all of it preserved in leather bound notebooks. Most of the 'experiments' had not produced consistent results, but the notebooks had been carefully stored away in the stations archives....the result of these 90 years of 'experimentation' was a mess of confusion and vast troves of unpublished and useless data...the most that could be said of these [experiments] was that some of them worked sometimes, perhaps, or maybe."

Fisher introduced the world to experimental design and challenged the idea that scientists could make progress by tinkering alone. Instead, he motivated them to think through inferential questions: Is the difference in yield for variety A vs. variety B (signal) due to superior genetics, or is it the difference we would expect to see anyway due to natural variation in crop yields (noise)? In other words, is the difference in yield statistically significant? This is the original intention of the concept of statistical significance that has gotten lost in the many abuses and misinterpretations we often hear about. He also taught us to ask questions about causality: Does variety A actually yield better than variety B because it is genetically superior, or could differences in yield be explained by differences in soil characteristics, weather and rainfall, planting date, or numerous other factors? His methods taught us how to separate the impact of a product or innovation from the impact and influences of other factors.
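Fisher's signal-vs-noise question can be sketched with a toy randomization (permutation) test, the logic Fisher himself pioneered: if variety labels were just shuffled at random among the plots, how often would we see a yield difference at least as large as the one observed? The yield numbers below are purely hypothetical, for illustration only - they are not Rothamsted data.

```python
import random
import statistics

def permutation_test(a, b, n_perm=10_000, seed=42):
    """Fisher-style randomization test.

    Returns the observed difference in mean yields and the share of
    random relabelings of the plots that produce a difference at
    least as extreme (a two-sided permutation p-value).
    """
    rng = random.Random(seed)
    observed = statistics.mean(a) - statistics.mean(b)
    pooled = a + b
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # pretend variety labels were assigned at random
        diff = statistics.mean(pooled[:len(a)]) - statistics.mean(pooled[len(a):])
        if abs(diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_perm

# Hypothetical per-plot yields (e.g., bushels/acre) for two varieties
variety_a = [62, 58, 65, 70, 63, 66]
variety_b = [55, 60, 57, 59, 54, 61]

obs_diff, p_value = permutation_test(variety_a, variety_b)
```

A small p-value says the observed gap between varieties is unlikely to be mere noise from natural yield variation - which is all "statistically significant" was ever meant to convey.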

Fisher did more than provide a set of tools for problem solving. He introduced a structured way of thinking about real world problems and the data we have to solve them. This way of thinking moved the agronomists at Rothamsted from mere observation to useful information. It applies not only to agriculture, but to all of the applied and social sciences, as well as business.

In his book Uncontrolled, Jim Manzi stressed the importance of thinking like Fisher's plant breeders and agronomists (Fisher himself was a geneticist), especially in business settings. Manzi describes the concept of 'high causal density': the idea that the number of causes of variation in outcomes is often enormous, with each having the potential to wash out the cause we are most interested in (whatever treatment or intervention we are studying). In business, which is a social science, this is more challenging than in the physical and life sciences. In physics and biology we can assume relatively uniform physical and biological laws that hold across space and time. But in business the 'long chain of causation between action and outcome' is 'highly dependent for its effects on the social context in which it is executed.' This is another way of saying that what happens in the outside world can often have a much larger impact on our outcomes than a specific business decision, product, or intervention. As a result, the same approach Fisher advocated in agriculture is called for in business settings.
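A minimal simulation (with entirely made-up numbers) can make Manzi's point concrete. Here a single 'context' variable stands in for the many external causes he describes - economy, location, season - and drives outcomes far more strongly than the treatment does. When units self-select into treatment based on context, the naive treated-vs-untreated comparison is wildly inflated; randomized assignment recovers the small true effect.

```python
import random

TRUE_EFFECT = 0.2  # small treatment effect, easily swamped by context

def simulate(n=50_000, randomized=True, seed=0):
    """Return the difference in mean outcomes (treated - control).

    'context' summarizes the many external causes of variation and has
    a 5x larger influence on the outcome than the treatment itself.
    """
    rng = random.Random(seed)
    treated_sum, treated_n, control_sum, control_n = 0.0, 0, 0.0, 0
    for _ in range(n):
        context = rng.gauss(0, 1)
        if randomized:
            treated = rng.random() < 0.5   # coin-flip assignment
        else:
            treated = context > 0          # self-selection into treatment
        outcome = 5 * context + (TRUE_EFFECT if treated else 0) + rng.gauss(0, 1)
        if treated:
            treated_sum += outcome
            treated_n += 1
        else:
            control_sum += outcome
            control_n += 1
    return treated_sum / treated_n - control_sum / control_n

naive_estimate = simulate(randomized=False)  # inflated by confounding
rct_estimate = simulate(randomized=True)     # lands near TRUE_EFFECT
```

The naive estimate attributes the context's influence to the treatment; only the randomized comparison isolates the effect we actually care about.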

List and Gneezy address this in The Why Axis:

"Many businesses experiment and often...businesses always tinker...and try new things...the problem is that businesses rarely conduct experiments that allow a comparison between a treatment and control group...Business experiments are research investigations that give companies the opportunity to get fast and accurate data regarding important decisions."

Fisher's approach soon caught on and revolutionized science and medicine, but its adoption still lags in some business settings, even in the wake of big data, AI, and advances in machine learning. As Jim Manzi and Stefan Thomke note in Harvard Business Review, in the absence of formal randomized testing and good experimental design:

"executives end up misinterpreting statistical noise as causation—and end up making bad decisions"

In The Book of Why, Judea Pearl laments the reluctance to embrace causality: 

"statistics, including many disciplines that looked to it for guidance remained in the prohibition era, falsely believing that the answers to all scientific questions reside in the data, to be unveiled through clever data mining tricks...much of this data centric history still haunts us today. We live in an era that presumes Big Data to be the solution to all of our problems. Courses in data science are proliferating in our universities, and jobs for data scientists are lucrative in companies that participate in the data economy. But I hope with this book to convince you that data are profoundly dumb...over and over again, in science and business we see situations where more data aren't enough. Most big data enthusiasts, while somewhat aware of those limitations, continue to chase after data centric intelligence."

These big data enthusiasts bear a strong resemblance to the researchers at Rothamsted before Fisher. List has a similar take:

"Big data is important, but it also suffers from big problems. The underlying approach relies heavily on correlations, not causality. As David Brooks has noted, 'A zillion things can correlate with each other depending on how you structure of the data and what you compare....because our work focuses on field experiments to infer causal relationships, and because we think hard about these causal relationships of interest before generating the data we go well beyond what big data could ever deliver."

We often want fast iterations and actionable insights from data. While it is true that a great analysis with no story, delivered too late, is as good as no analysis, it is just as true that quick insights with a coherent story based on pretended knowledge from big data can leave you running in circles getting nowhere - no matter how fast you might feel like you are running. In the case of Rothamsted, scientists ran in circles for 90 years before real insights could be uncovered using Fisher's more careful and thoughtful analysis. Even if they had possessed today's tools of AI, ML, and data visualization to cut the data 1,000 different ways, they still would not have been able to get any value for all of their effort. Wow, 90 years! How is that for time to insight? In many ways, despite drowning in data and the advances in AI and machine learning, many areas of business across a number of industries will find themselves in the same place Fisher found himself at Rothamsted almost 100 years ago. We will need a credibility revolution in AI to bring about the kind of culture change that makes the causal and inferential thinking that comes naturally to today's agronomists (thanks to Fisher) - or, more recently, the way Pearl's disciples think about causal graphs - commonplace in business strategy.

Notes: 

1) Randomized tests are not the only way to make causal inferences. In fact, in The Book of Why, Pearl notes in relation to smoking and lung cancer that, outside the context of randomized controlled trials, "millions of lives were lost or shortened because scientists did not have adequate language or methodology for answering causal questions." The credibility revolution in epidemiology and economics, along with Pearl's work, has provided us with this language. As Pearl notes: "Nowadays, thanks to carefully crafted causal models, contemporary scientists can address problems that would have once been considered unsolvable or beyond the pale of scientific inquiry." See also: The Credibility Revolution(s) in Econometrics and Epidemiology.

2) Deaton and Cartwright make strong arguments challenging the supremacy of randomized tests as the gold standard for causality (similar to Pearl), but this only furthers the importance of asking careful causal questions in business and science by broadening the toolset along the same lines as Pearl. Deaton and Cartwright also emphasize the importance of interpreting causal evidence in the context of sound theory. See: Angus Deaton and Nancy Cartwright, Understanding and misunderstanding randomized controlled trials, Social Science & Medicine, Volume 210, 2018.

3) None of this is to say that predictive modeling and machine learning cannot answer questions and solve problems that create great value for business. The explosion of the field of data science is an obvious testament to this fact. Probably the most important thing in this regard is for data scientists and data science managers to become familiar with the key distinctions between models and approaches that explain versus those that predict. See also: To Explain or Predict and Big Data: Don't Throw the Baby Out with the Bathwater.

Additional Reading

Will there be a credibility revolution in data science and AI? 

https://econometricsense.blogspot.com/2018/03/will-there-be-credibility-revolution-in.html 

Statistics is a Way of Thinking, Not a Toolbox

https://econometricsense.blogspot.com/2020/04/statistics-is-way-of-thinking-not-just.html 

The Value of Business Experiments and the Knowledge Problem

https://econometricsense.blogspot.com/2020/04/the-value-of-business-experiments-and.html

The Value of Business Experiments Part 2: A Behavioral Economic Perspective

http://econometricsense.blogspot.com/2020/04/the-value-of-business-experiments-part.html 

The Value of Business Experiments Part 3: Innovation, Strategy, and Alignment 

http://econometricsense.blogspot.com/2020/05/the-value-of-business-experiments-part.html 

Big Data: Don't Throw the Baby Out with the Bathwater

http://econometricsense.blogspot.com/2014/05/big-data-dont-throw-baby-out-with.html 

Big Data: Causality and Local Expertise Are Key in Agronomic Applications

http://econometricsense.blogspot.com/2014/05/big-data-think-global-act-local-when-it.html

The Use of Knowledge in a Big Data Society

https://www.linkedin.com/pulse/use-knowledge-big-data-society-matt-bogard/