Confusing data-driven with data-informed: why it's okay to be both.
People talk about being data-driven, but what does it mean? How you can use the "If-then plan" to become action-oriented.
Recently, I attended a conference where an interesting conversation occurred. During a discussion, someone presented an example of how their organization became data-driven. Over the last year, the company had started surveying employees and reporting the results at their annual executive meeting. The executives found the data really interesting and were now incorporating it into their decisions. Exactly how, I'm not sure, because no specifics were given. But the person used this example to showcase how they'd become data-driven.
Quantitative data ≠ data-driven
Collecting quantitative data doesn't automatically make you or your organization data-driven. In fact, I'd argue that quantitative data isn't even necessary to be data-driven. The more important piece, easily forgotten when we talk about being data-driven, is decision-making: specifically, what actions are predetermined based on the data. Too often, we focus our energy on data collection but fail to create a formula for the automated action we'd take once we see the data. We're missing the "If-then plan".
IFTTT is the simplest incarnation of data-driven because an action (then that) is taken based on the data (if this). The beauty of the "If-then plan" is that it can work with qualitative data, quantitative data, anecdotal evidence, or intuition. Below are some hypothetical "If-then plans":
If any of the users we interview express non-verbal interest (body leans in, asks for more), then I’d rank the problem as number 1 to solve.
If I feel nervous about the software quality, then I won’t sign off on the release even if it’s passed my acceptance testing.
If Jasmine tells me she found the feature unhelpful, then I'd redo the analysis showing the feature statistically significantly improves user engagement, to re-confirm the result.
Notice that in these examples, I didn't use any quantitative data for my "this", but I'm still data-driven because my actions are triggered by data. The data may not be statistically significant (e.g., a few user interviews, my intuition, anecdotes). You might disagree with me on the actions or triggers, but the action is predetermined, and that's what really matters about being data-driven.
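The examples above can be sketched in code. This is a minimal illustration (the triggers, actions, and data are all hypothetical, mirroring the first example): the key property is that the action is fixed before the data arrives.

```python
# Minimal sketch of a predetermined "If-then plan" as a trigger -> action
# pair. All names and data here are hypothetical illustrations.

def make_plan(trigger, action):
    """Return an If-then plan: the action fires only when the trigger does."""
    def plan(observation):
        if trigger(observation):
            return action
        return None  # trigger didn't fire; no action taken
    return plan

# Trigger: any interviewed user shows non-verbal interest
# (leans in or asks for more).
interest_plan = make_plan(
    trigger=lambda obs: any(u.get("leaned_in") or u.get("asked_more")
                            for u in obs["interviews"]),
    action="rank problem #1 to solve",
)

observation = {"interviews": [{"leaned_in": True}, {"asked_more": False}]}
print(interest_plan(observation))  # -> rank problem #1 to solve
```

The same `make_plan` shape works for qualitative triggers ("I feel nervous about quality") just as well as quantitative ones, which is the point: the formula, not the data type, is what makes it data-driven.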
Why focus on pre-determined actions?
I have observed several issues with data-driven efforts when the focus is primarily on the data instead of the action. Now, I'm not arguing that collecting quantitative data is unimportant. I would be standing on quicksand to argue that point given the proliferation of data, data collection, and data analytics; it's now easier than ever. However:
Collecting data can still be hard and tedious work. Even if hooking up the actual data collection is easy, gathering data may take a significant amount of time if you don’t have a lot of users or usage.
Collecting the “correct” data can also take some trials. What you thought was the most important data may not be what you settle upon in the end. This eats up more time and effort.
Analyzing and interpreting data takes yet more effort and time. Also, make sure you're doing it correctly, which is also not so easy.
Data collected without a plan can result in interpretation arguments, which can lead to analysis-paralysis or some outright lies.
To illustrate, suppose you have one north star metric, and it's "task completion rate" (calculated as total tasks completed / total tasks started). If this metric drops from 35% to 30%, what action would you take, assuming the cause isn't instrumentation, a technical issue, or seasonality?
Most PMs would respond with an investigation If-then plan: if the metric drops by 5%, I would investigate by the following.
Try to segment the data to see what type of user (e.g., by acquisition channel, by product usage, by new vs returning users) has the largest drop-off.
Formulate a hypothesis that explains the drop (e.g., H1: task completion rate is higher when the task can be completed in fewer steps).
Conduct user research to understand why this user segment had a lower metric.
But this isn't immediately actionable for the product. A better If-then plan would be directly actionable (e.g., if the metric drops by 5%, I'd re-run the last A/B feature experiment). This is where you'd want to think through some hypotheticals in order to formulate several If-then plans.
Be data-driven AND data-informed.
It’s common to think of data-driven as better than data-informed. It probably makes you sound smarter to say “I’m data-driven” rather than “I’m data-informed”. But I think this dichotomy is a false one and part of business-speak.
First, most of us are data-informed, not data-driven. Data-informed means we gather a bunch of data, qualitative and quantitative. Then, we try to make sense of that data with our past experiences, biases, and intuition before making a decision. When facing multiple choices, one common decision method is called satisficing.
Satisficing is a common method in which humans do not attempt to conceive of all the potential choices but instead settle on a choice that is "good enough". In deciding to take an alternate flight route around weather, a pilot will not outline every possible flight plan and then evaluate them all one by one. He or she will likely find a route with minimal deviations from the original flight route that is not impacted by the weather, even though this choice may not end up being the "optimal" flight route.
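Satisficing has a simple computational shape: take options in the order they come, and accept the first one that clears a "good enough" bar instead of scoring them all. This is a toy illustration of that idea, with made-up routes and scores:

```python
# Toy sketch of satisficing: accept the first option that clears a
# "good enough" bar rather than searching for the optimum.
# The routes, scores, and threshold below are hypothetical.

def satisfice(options, score, good_enough):
    """Return the first option whose score clears the bar, else None."""
    for option in options:
        if score(option) >= good_enough:
            return option  # stop searching: this is good enough
    return None

# Alternate flight routes, scored by negated deviation (less is better).
routes = [
    {"name": "route A", "deviation_nm": 120},
    {"name": "route B", "deviation_nm": 40},
    {"name": "route C", "deviation_nm": 10},  # optimal, but never evaluated
]
picked = satisfice(routes, score=lambda r: -r["deviation_nm"], good_enough=-50)
print(picked["name"])  # -> route B
```

Note that route C is the true optimum, but satisficing stops at route B because it already clears the bar, which is exactly the pilot's behavior described above.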
For experts, you might use a method called recognition-primed decision-making.
Recognition-primed decision making is built on experts’ mental models of the systems with which they have extensive experience. These mental models must be simple enough to allow the expert to run mental simulations with different inputs quickly; however, they must be elaborate enough to adequately describe the complexity of their particular system. These models must also be able to be adapted to analogous situations. For example, a pilot who has experience in a Boeing aircraft may be able to use his or her mental model to predict how an Airbus aircraft would react to a control input. (In some cases, this mental model would be more successful in correct prediction than others.) With experience, an expert learns which pieces of the mental model are critical to correct prediction and which can be dropped to reduce cognitive workload.
The point is that data-informed doesn't predetermine the action, while with data-driven, the goal is to automate decision-making. There's a place for both as a product manager. The question you should ask yourself is: "Is this decision worth automating with a formula?"
Like this article? Leave a comment.
Additional Reading
Decision Making Under Uncertainty (computational models); page 291 starts covering "naturalistic decision-making"
How to Make Yourself Work When You Just Don’t Want To and If Then Plan
Lying with Stats: Everything you thought you knew about statistics