This article was originally published on Intellyx.com
We are entering the golden age of data.
From baseball to marketing to personal health, we are collecting more data than ever before, collecting it about nearly everything imaginable, and, most importantly, we are using it in ways that were inconceivable just a few short years ago.
Data promises to be a new form of currency, enabling entirely new business models and driving the evolution toward an intelligent world powered by artificial intelligence (AI) and related technologies.
But beneath this gleaming promise of a data-driven future full of algorithms, predictive technologies, and hyper-personalized experiences, lies a darker underbelly.
Data is not quite as objective as you might think, particularly in the hands of those inexperienced in analyzing, manipulating, and leveraging it effectively.
When coupled with a growing pressure to use data and AI as competitive weapons, the risk of the misuse and misapplication of data leading to catastrophic results grows exponentially.
Every enterprise leader must, therefore, come to recognize not only the great promise of becoming a data-driven enterprise, but also the enormous risks and responsibilities that come with this transformation.
The Danger of the Objectiveness Myth
A generation of executives following Peter Drucker’s exhortation that “you can’t manage what you don’t measure” mastered the art of collecting — and using — data. And they used it to deftly and repeatedly beat the intuition and guile of the previous generation of executives.
It was, in fact, the objectiveness of this data that created the advantage. Rather than relying on the experience and instinct of the rarefied few, data enabled organizations to make better, faster decisions based on what the numbers told them.
As business processes became increasingly digitized, they generated a treasure trove of data that organizations could use to transform how they functioned and made decisions. These original applications are now giving way to more advanced predictive capabilities using AI-powered technologies — all based on the idea that data will lead the way.
Except, this data and how organizations apply it may not be as objective as we think.
A recent Wharton School of Business podcast and article entitled What Marketers Are Doing Wrong in Data Analytics examined this issue.
The article reported on research by Wharton professors who analyzed over 2,100 marketing experiments that tested various elements of web pages and other marketing material using so-called A/B tests, in which some visitors saw one version while others saw another, to determine which led to better outcomes.
The researchers found that “57% of marketers are incorrectly crunching the data and potentially getting the wrong answer — and perhaps costing companies a lot of money.”
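The kind of mistake the researchers describe starts with the significance test itself. As a rough illustration, here is a minimal sketch of the standard two-proportion z-test that typically underlies an A/B comparison (Python; the function name and the conversion counts are invented for this example, not drawn from the Wharton research):

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test (pooled variance, normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical counts: variant B looks ~25% better on the surface...
z, p = ab_test_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, two-sided p = {p:.3f}")  # p > 0.05: not significant
```

Even a difference that looks large in percentage terms (5.0% vs. 6.25% conversion) can fail to reach significance at these sample sizes, which is precisely the kind of result inexperienced analysts tend to misread.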
The data, it would seem, was not quite as objective as the marketers or their companies thought.
The Talent and Understanding Gap
In my recent book entitled Performance-driven IT: How Metrics can Transform IT Services and Operations, I make the case for why IT organizations must, in fact, move beyond the hero culture of the past and adopt data-driven management approaches.
But I also caution IT leaders about the risk of data manipulation — whether intentional or not. As the saying goes, “figures lie, and liars figure.”
The point is that using data to make decisions and take actions is incredibly powerful, but also can be very dangerous — particularly when in novice hands.
Looking deeper into the findings, the researchers determined that the results were often wrong because marketers were stopping their experiments too quickly, and they identified two primary reasons for doing so.
The first reason was just that they lacked the experience and statistical expertise to interpret what the data was telling them.
Relying on the statistical significance reported by a testing system and treating it as a form of confidence rating, these marketers, inexperienced with data, often assumed there was a significant difference in results between two options when the researchers found that there was, in fact, no difference whatsoever.
The researchers pointed out that the impact of these misreadings may have led to significant missed opportunities and, in some cases, significant financial and human resource investments to capture non-existent advantages.
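This early-stopping trap is easy to reproduce in simulation. In the sketch below (Python; the conversion rate, sample sizes, and helper names are invented for illustration), both variants convert at exactly the same true rate, so any declared winner is a false positive. Checking the running result every 100 visitors and stopping at the first significant reading pushes the false-positive rate well above the nominal 5% that a single, properly sized test would deliver:

```python
import math
import random

random.seed(7)

def p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value from a two-proportion z-test (normal approximation)."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_b / n_b - conv_a / n_a) / se
    return math.erfc(abs(z) / math.sqrt(2))

def run_experiment(peek_every, max_n, rate=0.05):
    """Both variants share the same true rate, so any 'winner' is spurious."""
    conv_a = conv_b = 0
    for n in range(1, max_n + 1):
        conv_a += random.random() < rate
        conv_b += random.random() < rate
        # Peek at the running result and stop as soon as it looks significant
        if n % peek_every == 0 and p_value(conv_a, n, conv_b, n) < 0.05:
            return True
    return False

trials = 400
peeking = sum(run_experiment(peek_every=100, max_n=3000) for _ in range(trials)) / trials
honest = sum(run_experiment(peek_every=3000, max_n=3000) for _ in range(trials)) / trials
print(f"false positives with repeated peeking: {peeking:.0%}")
print(f"false positives with a single final test: {honest:.0%}")
```

This is why mature experiment platforms either fix the sample size in advance or apply sequential-testing corrections; stopping an experiment as soon as it produces the desired reading is not a neutral shortcut.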
As organizations look to apply data as an objective decision-making tool ever more widely and deeply within the business, the potential ramifications of this type of misinterpretation grow dramatically.
The big data and AI industry sectors have recognized for some time that the lack of qualified data scientists represents a significant roadblock to widespread adoption of these technologies. As a result, countless technology companies are attempting to use automation and AI-based approaches to enable business and other non-technical users — so-called citizen data scientists — to analyze data, glean insights from it, and take action based upon those insights.
While these technologies may help organizations deal with the mechanics of data preparation and analysis, they may actually make the situation worse if they create a false sense of confidence in the objectivity of the data without closing the understanding gap that would enable organizations to test the veracity of those results.
The Biggest Risk: Bias
The second reason that the marketers stopped their experiments too early, leading to false insights, was more insidious.
The researchers found that many of the marketers were under significant pressure to produce results and demonstrate that the experiments and their resulting data would lead to some form of a breakthrough. After all, it would be difficult to sell a client or an internal customer on the power of A/B testing if the testing showed that it made no difference at all.
This need for the data to reveal some heretofore unknown insight created a pressure on the experimenters to demonstrate a positive outcome. They, therefore, were all too eager to latch on to anything that would do so and would often stop experiments as soon as they got a result they wanted.
While these types of marketing experiments were valuable and had meaningful business impact, they were also relatively insignificant in the grand scheme of the organization’s go-to-market efforts. They were just one small element of a much larger marketing strategy.
This is not the case when it comes to large-scale deployments of big data, analytics, and AI technologies in today’s enterprise. In these cases, the stakes are significantly higher.
As a result, the pressure that exists on data, analytics, and AI teams is exponentially higher than that placed on the marketers — and it will only increase as the investments continue to pile up.
The great question, therefore, will be the impact this pressure will have on how these teams apply and interpret data and the results that their algorithms and AI models produce.
While there has been ample talk about the risks of societal and other forms of bias in AI technologies, the greatest bias of all may be to find insights and predictive outcomes in data via algorithms and AI applications when none, in fact, exist.
The Intellyx Take
As an enterprise leader, you may find the risks of misinterpreting and misapplying data disheartening. The point of this article, however, is not to discourage or dissuade you from seeking out every opportunity to use data to transform your organization.
The point of these cautionary notes is to encourage you to reshape the very nature of your organization by putting data at the core of your business and operational models — but to do so with your eyes wide open.
There is a misconception in the industry that those with the most data will win. It will not be anywhere near that simple.
It will not be enough to merely have data. Instead, competitive advantage will go to those organizations that can use it most effectively — and that will begin by ensuring that they are interpreting it correctly. Just putting data in the hands of your employees will not do the trick — and it’s possible that doing so might spell disaster.
Copyright © Intellyx LLC. Intellyx publishes the Agile Digital Transformation Roadmap poster, advises companies on their digital transformation initiatives, and helps vendors communicate their agility stories. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers.