
    What the Sinking of the Titanic Teaches Us About Data Assets and Reflexivity

    I rarely start sentences with “It’s sort of like the sinking of the Titanic,” … but for this topic, I’ll make an exception. The point I want to make is that from a data quality perspective, what actually sank the famous vessel was not so much the iceberg, but rather Captain Edward Smith’s management preferences and reliance on incomplete data — and the effect of reflexivity.

    In this way, Captain Smith is not entirely unlike the CEOs of many publicly-traded organizations today. The difference is that instead of mishandling data about icebergs, they may be mishandling a variety of key data assets and organizational metrics. The bottom line is that, like the good captain, they’re not effectively managing risk.

    That sinking feeling

    In Smith’s case, his main responsibility was to cross the Atlantic without sinking. The CEOs’ responsibility, in contrast, is to report every 90 days on the state of their organization’s financial performance. If there is any discrepancy between what they report as true and actual reality, they can be held criminally liable — and that is truly a sinking feeling.

    What could cause such discrepancies? After all, the CEO may perceive that the data and operational metrics they use to make decisions are sound, accurate, and trustworthy. But I learned another perspective firsthand over the course of more than ten years of working intimately with data used to make decisions about cash flow, reserve projections, and other key decisions. I learned that perception about data is rarely the exact truth. As with most things, perception about an organization’s data is the reality we gladly accept — but sometimes it is not the same as actual reality.

    In fact, history books are full of times when this dynamic played out on macro scales — creating asset bubbles in everything from the 17th-century Dutch tulip market to the real estate bubble of the previous decade. In each of these cases, disaster was preceded by blissful exuberance. In addition, perceptions became increasingly disconnected from reality, such that the perceived truth became the market value, despite the much lower intrinsic value.

    Many people have written about this phenomenon of diverging perceptions, but none more comprehensively than famed hedge fund manager George Soros. In fact, he built much of his management and investment philosophy on the premise that the influence of reflexivity can and will cause perceptions to deviate from reality — and that when it does, chaos is sure to follow.

    Different views of the same data can’t all be right

    I propose that the same dynamic occurs when data is not managed as professionally as financial assets. Here’s why. Each community of data users naturally sees the data through its own lens. For example, users in the business units focus on the processes and activities that directly support the profit and loss statement, balance sheet, and cash flow. IT staff, meanwhile, tend to focus on the tools, systems, code, and technologies that they either find interesting or must support. The list goes on. As these groups work with each other around shared data, they create feedback loops — perspectives on the data that tend to self-reinforce the prevailing view.

    But there is often also a minority of individuals among the data’s users whose opinion is all but ignored, even though they may be the only ones who understand the real intrinsic value of the data assets in question. No group has a monopoly on this more realistic view of the data — it could include business users, technology staff, or others. The more important point is that people tend not to want to hear about such fundamental problems in an organization’s data, even as reflexivity continues to drive the various views further and further apart. Eventually the minority will be proven right — but perhaps only after the iceberg has been hit.

    All of these slightly different biases being imposed on the organization’s central data assets by each community of data users will ultimately result in multiple opinions, views, and positions on that data. What’s more, it will create a sense of chaos, confusion, and lack of clarity as to what certain data elements even mean across the organization.

    In the case of the Titanic, one of the fundamental flaws in Captain Smith’s approach to decision-making was that he preferred actual sightings from the crow’s nest over any other type of input regarding nautical hazards. As a result, his staff essentially ignored the credible telegraph messages they actually received from other ships that had already encountered ice.

    At the same time, the data that the captain did receive from the crow’s nest was wrong — partly because there was no moon to improve the lookouts’ ability to spot distant objects, and partly because the still sea wasn’t creating visible breakers around the bergs. In any case, the lookouts’ failure to see the iceberg reinforced the captain’s existing view that conditions were fine — and onward the great ship went.

    Which truth is true?

    In the case of an organization, which of the many possible views of the data should a CEO embrace as the accurate one? Fortunately, organizations can use the disciplines of data and metadata management to answer this question, by offering proven ways to drill into one’s data, understand where it may be faulty, and know how to fix the problems.

    Not to sound discouraging, but tackling the issue of reflexivity around data assets is a formidable challenge. If your organization hasn’t done so already, you should probably start by taking a cold, hard look at your data governance plan. Next, you should ensure that you have systems and processes for effectively collecting and managing metadata. There’s also the issue of data lineage — that is, how data moves through your organization from the point it’s created or acquired until it’s consumed.
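
    To make the idea of lineage concrete, here is a minimal sketch of what a lineage record might look like. The LineageRecord structure, the trace_to_origin helper, and the dataset names are purely illustrative assumptions, not a reference to any particular metadata management tool:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical lineage record: each dataset notes the upstream dataset
# it was built from and the transformation that produced it.
@dataclass
class LineageRecord:
    dataset: str                  # name of this dataset
    source: Optional[str]         # upstream dataset (None = created or acquired here)
    transformation: str           # how it was produced
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def trace_to_origin(catalog: dict, dataset: str) -> list:
    """Walk lineage records back to the point the data was created or acquired."""
    path = [dataset]
    record = catalog.get(dataset)
    while record and record.source:
        path.append(record.source)
        record = catalog.get(record.source)
    return path

# Example chain: vendor feed -> cleaned table -> reserve projection
catalog = {
    "raw_policy_feed": LineageRecord("raw_policy_feed", None, "acquired from vendor"),
    "clean_policies": LineageRecord("clean_policies", "raw_policy_feed", "deduplicate and validate"),
    "q3_reserves": LineageRecord("q3_reserves", "clean_policies", "reserve projection model"),
}

print(trace_to_origin(catalog, "q3_reserves"))
# ['q3_reserves', 'clean_policies', 'raw_policy_feed']
```

    Even a simple chain like this lets a decision-maker ask, for any number on a report, where the underlying data entered the organization and what was done to it along the way.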

    Of course, that’s a very high-level summary of a very complex process. Still, taking on the challenge of data and metadata management should be a high priority, in my view. If the very people who on a day-to-day basis must support, manage, create, and leverage the organization’s core data assets have significant confusion about them, it seems unlikely that their CEO will be able, with empirical certainty, to support the various decisions they make. The captain of the enterprise may not literally have to go down with the ship — but he or she could still have to pay a heavy price when the data asset bubble bursts.

