In 1975, Richard Holbrooke, a U.S. diplomat best known for brokering peace in Bosnia, wrote a short essay in Harper’s entitled “The Smartest Man in the Room.”
In it, Holbrooke discusses a dilemma he experienced during the Vietnam War: “[The] smartest man in the room is not always right.” Holbrooke associates this dilemma with a belief that data-driven decisions are always the right decisions and that the fastest talker with the most data is the person you should listen to. He laments an organizational culture where “People who had important things to say were cut off in mid-thought because they were not articulate enough to frame their thoughts in the precise, logical, bright way that was desired, if not required.”
This perceived dichotomy between intuition and data — and related notions about the kind of people who use each — is alive and well some 40 years later. It’s one of the themes in a recent report from the MIT Sloan Management Review (sponsored by SAS) entitled "Beyond the Hype: The Hard Work Behind Analytics Success." The report’s authors make a clear call for a “both/and” approach, noting the necessity of integrating, rather than balancing, analytics and intuition to support good decision-making.
Getting our heads straight about data.
So, how do we create a more harmonious marriage between intuition and data in our decision-making practices? A good place to start is building better intuition about data and analytics. Intuition doesn’t come out of nowhere; it’s built on our “past experiences and knowledge.” If we don’t have much experience effectively using data, our intuition or expectations related to it may be off-base.
The word hype in the title of the MIT report reflects this. Before I even cracked open the report, that title got me thinking about the movies based on Michael Lewis’s books. Not the books, but the movies. Let me explain.
I think we can get caught up in a fantasy about data that’s like being an actor in Moneyball or The Big Short. Like the actors, in this fantasy we don’t have to actually understand the data, or be able to do the math, or spend hours on trial and error in order to learn. Somehow, insights just jump right off the paper at us! We deliver our lines convincingly and we are heroes! Our efforts with data are…effortless.
Conversely, but concurrently, we also want to believe that effectively using data is mythically hard. I mean, if this stuff were easy, it wouldn’t require people like us…the smartest people in the room, would it? If it were easy, they wouldn’t make movies about it, right?
Such contradictory thinking may be part of the reason why the MIT researchers found that the share of organizations in their survey that are “analytically challenged” is going up, not down (49% in 2015 vs. 29% in 2012)... while at the same time, leaders continue to report they are optimistic about the promise of analytics. (The authors note that the optimism found in their study was higher amongst CEOs than general managers and that other studies have found a greater degree of disillusionment.)
So, what is the reality?
In my experience, deriving worthwhile, if not stunning and game-changing, value from analytics requires straightforward, sometimes hard and often unsexy effort across the organization. (1)
But don’t take my word for it.
Here are some real-world examples of analytics at work:
Boring, but important: Two of the cases highlighted in the MIT report are about creating data inventories of some kind. Having done it myself, I can tell you it’s tedious. It falls into the category of boring, but important. As such, many organizations may overlook it. It's hard to get people excited about “an inventory.” However, knowing what data you have and don’t have, or having a single, reliable data collection for a particular issue, can lead to better-informed leaders and decisions for the organizations that create, maintain, and use them.
Focus on fundamentals: Harry Markopolos uncovered Bernie Madoff’s Ponzi scheme years before Madoff confessed to it. While Markopolos employed sophisticated analyses to “prove” Madoff’s fraud, he shared in an interview that he knew almost immediately it wasn’t for real: “Oh, in five minutes, I knew. I read his strategy statement and it was so poorly put together. His strategy, as depicted, would have trouble beating a zero return. And then his performance chart went up at a 45-degree line. That line doesn’t exist in finance; it only exists in geometry classes.” Most areas of business have fundamentals: ways that things work. If you understand how these fundamentals are reflected through data, you have a powerful complement to your intuition.
Story Time: It’s more than a numbers game. In a recent article in TechRepublic, one data scientist admitted, “Data scientists mostly just do arithmetic.” He goes on: “Most [businesses] just need good data and an understanding of what it means that is best gained using simple methods.” Rather than sophisticated algorithms, the article notes, a challenge for many in the field is telling a compelling story. As Harry Markopolos discovered when his warnings about Madoff were ignored for nearly a decade, and as Richard Holbrooke lamented 40 years ago, sometimes it’s not what you know, but how you show it, that matters.
Yes, it IS your job: The MIT research indicates that analytically advanced organizations tend to be those where data is used at all levels of the organization, from the C-suite to general managers to line employees. Additionally, a new report from Experian notes the importance of transitioning more responsibility for data management and governance to managers outside of IT, who may be better positioned to accurately articulate needs and regularly use data to improve organizational operations.
Where do we go from here?
I’ve never worked in an analytically advanced organization. As highlighted in the MIT research, few of us have. However, I don’t think that means that effective data use is out of reach.
In my experience leading implementation efforts of all kinds, taking simple steps to integrate data into strategy and project implementation translates into better and clearer understanding among decision-makers and, ultimately, improved outcomes. Truth be told, I’ve had the benefit of working with many sharp people, who have been generally, if not enthusiastically, supportive of efforts to integrate more data into decision-making at both the project and strategic levels.
Even if you find yourself working in a place that is antagonistic towards data, if YOU aren’t, there’s nothing stopping you from trying to improve your work through appropriate use of data.
Everything has to start somewhere, so why not here? Why not now? And, why not with you?
A place to start.
Below are some questions to help you reflect on your personal analytic practices. There are no "right" answers to these questions. Your answers should give you a sense of your tendencies, highlighting both strengths and potential gaps or "gray" areas that you can target for improvement. Remember, the right decisions don't always lie within the data; they don't always lie within your intuition either. It's best to get good at using both.
Do you know what relevant data are currently available to you and your team?
Do you use it?
Do your colleagues use it?
Do you trust it?
How do you see this demonstrated in your/their work?
Do you know the fundamentals of your business area?
Can you identify and/or explain key indicators?
Do you know what signifies normal/desired performance and what doesn’t? How?
Do you read available reports or research about the dynamics of your business area — do you find them helpful?
Is there anyone (else) on your team who knows/does any of the above?
How do you see this knowledge, or lack of it, impacting your work and/or the work of your team?
How do you make decisions?
Think back to a recent decision you made, or made with others. First, how would you rate the quality of your decision? Then, briefly map out the key steps in your process. What proportion of your process involved data? What proportion related to intuition, feeling or anecdotes from past experience?
When you need to make a decision, do you normally seek additional information? (From what sources?) Do you normally go with your gut or first impression?
If you asked your colleagues, staff or loved ones to describe your decision-making habits, what would their answer be? (Or better yet, ask them and listen to the answer!)
What are your expectations about data?
How do you define valuable analysis? For example, does it surprise you? Does it confirm what you already know? Is it accurate?
When is time spent on analysis not a value-add activity?
Identify one colleague or leader in your organization who you admire. What is their decision-making style?
Do you view understanding and use of data as part of your job responsibilities? Do you view it as someone else’s job – if so, whose?
Notes
(1) The MIT report provides a useful definition of analytics: “…the use of data and related business insights developed through applied analytical disciplines (for example, statistical, contextual, quantitative, predictive, cognitive, and other models) to drive fact-based planning, decisions, execution, management, measurement, and learning.”
References
Asay, Matt. "Why Data Science Is Just Grade School Math and Writing." TechRepublic. 4 Mar. 2016. Web. 25 Apr. 2016.
Holbrooke, Richard C. "The Smartest Man in the Room." Harper's Magazine. Web. 25 Apr. 2016.
"Madoff Whistleblower: SEC Failed To Do The Math." NPR. NPR, 2 Mar. 2010. Web. 24 Apr. 2016.
Ransbotham, Sam, David Kiron, and Pamela Kirk Prentice. "Beyond the Hype: The Hard Work Behind Analytics Success." MIT Sloan Management Review. Mar. 2016. Web.
Schutz, Thomas. 2016 Global Data Management Benchmark Report. Tech. Experian Data Quality, 17 Feb. 2016. Web. 24 Apr. 2016.