    How to Determine the Business Value of Data

    Every executive has heard countless vendors say that data is a valuable asset. But exactly how can one know the actual business value of data? If you press salespeople on this point, they will likely admit that it is just rhetoric; of course one can’t really put a dollar value on an organization’s data.

    But the question persists. How much is data worth? Most people tasked with overseeing their organization’s data know that data has value, but can that value be expressed in financial terms? If so, what would that formula look like? Accounting firms have time-tested methods for conducting business valuation analyses for various types of corporate assets, including seemingly nebulous brands, so why couldn’t the same be done to find the business value of data?

    Fortunately, the answer is that such a valuation can be done. The process is somewhat involved, but here's how it works at a high level. The first step is organizational: establish an Office of Data, led by a Chief Data Officer tasked with data governance, metadata management, data quality and data architecture, responsibilities that are typically spread across various corporate functions. Because data is a valuable commodity, the organization needs this office to serve as the strategic 'broker' of data between its creators and its users.

    Next, you need to create a comprehensive inventory of the hundreds or thousands of business data elements (BDEs) distributed throughout the organization, and start capturing metadata about how each type of data is created and used. Then you need to select individual data stewards responsible for specific chunks of data, and coordinate their efforts to monitor and improve the quality of the data while tracking progress with key metrics.
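
    To make the inventory concrete, here is a minimal sketch of what one BDE entry might look like in Python; the field names are illustrative assumptions, not a prescribed schema.

        # A minimal sketch of one business data element (BDE) inventory entry.
        from dataclasses import dataclass

        @dataclass
        class BusinessDataElement:
            name: str                  # e.g., "customer_email"
            steward: str               # the person accountable for this data
            source_systems: list[str]  # where physical instances live
            created_by: str            # process or team that creates the data
            consumed_by: list[str]     # downstream processes and reports
            notes: str = ""            # captured 'tribal knowledge'

        inventory = [
            BusinessDataElement(
                name="customer_email",
                steward="jane.doe",
                source_systems=["CRM", "billing"],
                created_by="web signup flow",
                consumed_by=["marketing campaigns", "monthly revenue report"],
                notes="Billing copy lags the CRM by one nightly batch.",
            ),
        ]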

    The critical next step is to develop a standard method for quantifying how each data element exists and is used within the organization. This metric must account for 'tribal knowledge' (that is, people-based metadata), the frequency of use of, or demand on, the data element, and the overall quality of the data element. We call this metric the Data Certification Score: a holistic indicator that measures how the data is used, based on both quantitative and qualitative inputs.
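
    The article does not publish the scoring formula, but a minimal sketch might combine the three inputs as a weighted average, with weights set by the data community; everything below is an illustrative assumption.

        # A sketch of a Data Certification Score as a weighted average of three
        # inputs, each normalized to the range [0, 1]. Weights are illustrative.
        def data_certification_score(tribal_knowledge: float,
                                     usage_frequency: float,
                                     quality: float,
                                     weights=(0.2, 0.3, 0.5)) -> float:
            w_t, w_u, w_q = weights
            return w_t * tribal_knowledge + w_u * usage_frequency + w_q * quality

        # A well-documented, heavily used element with middling quality:
        print(round(data_certification_score(0.9, 0.8, 0.6), 2))  # 0.72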

    By analyzing the Data Certification Score for each of the physical instances of the data element, you create a comprehensive, enterprise-level view of how the BDE lives in the organization. The Data Certification Scores, in turn, can be rolled up into a central view that quantifies the data element’s ‘pulse’ – that is, an indicator of the use of each element by the community of creators, consumers, stewards and custodians across the organization.
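
    As a sketch of that rollup, each physical instance's score could be weighted by how heavily the instance is used, yielding one 'pulse' per BDE; the numbers below are illustrative.

        # A sketch of rolling per-instance scores up into one BDE-level 'pulse'.
        def rollup_pulse(instances: list[tuple[float, float]]) -> float:
            """instances: (score, usage_weight) pairs, one per physical copy."""
            total = sum(weight for _, weight in instances)
            return sum(score * weight for score, weight in instances) / total

        # customer_email lives in three systems with very different usage:
        print(round(rollup_pulse([(0.72, 50), (0.55, 30), (0.90, 5)]), 2))  # 0.67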

    Once Data Certification Scores are established for an organization's most important BDEs, you need monitoring mechanisms that recalculate them at a set frequency. Typically, the data community establishes that frequency, and it is closely tied to the demand placed on the data. For instance, if a data element is a critical input to a monthly financial process, its Data Certification Score might be recalculated at least monthly, and preferably weekly.
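
    A recertification policy like this can be as simple as a demand-based lookup; the thresholds below are illustrative assumptions that a data community would tune.

        # A sketch of choosing a recertification cadence from demand on the data.
        def recertification_interval_days(monthly_reads: int) -> int:
            if monthly_reads >= 10_000:  # critical, heavily used input
                return 7                 # weekly
            if monthly_reads >= 1_000:
                return 30                # monthly
            return 90                    # quarterly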

    Once Data Certification Scores are calculated, you can use statistical methods to correlate them with the organization's key operational or financial KPIs. A significant correlation provides empirical evidence of value. This final calculation is captured as metadata and is used to create the balance sheet of data: a visualization of the relative value of data elements from a revenue, cost or efficiency perspective.
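
    As a minimal sketch of the correlation step, the standard-library statistics module (Python 3.10+) can compute a Pearson coefficient between a score series and a KPI series; both series below are fabricated for illustration.

        # A sketch correlating a BDE's monthly scores with a KPI such as revenue.
        from statistics import correlation  # Python 3.10+

        dcs_by_month = [0.61, 0.64, 0.70, 0.72, 0.75, 0.81]
        revenue_by_month = [1.02, 1.05, 1.11, 1.10, 1.16, 1.21]  # in $M

        r = correlation(dcs_by_month, revenue_by_month)
        print(f"Pearson r = {r:.2f}")  # a strong positive r is evidence of value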

    Finding correlations between data and value

    This is where the concept gets very interesting. By tracking Data Certification Scores on a monthly, bi-monthly or quarterly basis, you can begin to correlate individual data elements directly with your Profit & Loss statement or other key management reports used to run the business. This means you can uncover distinct, reliable relationships between specific data elements and essential indicators such as revenue and costs.

    The advantage of sophisticated statistical analysis is that you can uncover obscure but important relationships: perhaps customers in a certain geographic area call customer service on Tuesdays, or customers who drive blue cars are more likely to want upgrades. And beyond uncovering obscure but valuable relationships, you can put a dollar figure on known ones.

    You know that improving the quality of your email addresses is worth something, but is it worth 3 cents per address or 3 dollars? Our quantitative method can help you find the answer. By conducting some statistical analysis, you can see, in real time, exactly where and how your data is adding to (or, in other cases, hurting) your bottom line, and thus determine the business value of data.
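
    One way to get that per-address figure, sketched here with assumed, fabricated data: fit a least-squares line of campaign revenue against email-address validity, then read the value off the slope.

        # A sketch of pricing a known relationship with a simple linear fit.
        from statistics import linear_regression  # Python 3.10+

        valid_email_pct = [82, 85, 88, 90, 93]                       # quality input
        campaign_revenue = [40_500, 42_100, 43_600, 44_800, 46_300]  # dollars

        fit = linear_regression(valid_email_pct, campaign_revenue)
        # fit.slope estimates revenue gained per extra percentage point of valid
        # addresses; dividing by the number of addresses fixed gives a per-address
        # value, answering the "3 cents or 3 dollars" question for your own data.
        print(f"~${fit.slope:,.0f} per percentage point of valid addresses")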

    How will this insight help you? Imagine that you implement the system described here, and in the process, you determine that five specific data elements have a very strong correlation with revenue. Now you know precisely where to start investments in improving data quality for optimal ROI.

    Attaching a reliable figure to the value of data is an invaluable way to get and maintain executive backing. Naturally, you would start any data initiative with the highest-value data elements in order to get some quick wins that prove the concept and generate momentum.

    Contrast this approach with comprehensive, multi-year data initiatives, which are bound to bog down amid changes in business strategy, personnel, technology, budgeting priorities or any of the usual suspects that hinder slow-moving programs. This approach homes in on the data that is driving your business and optimizes that data for maximum business impact. Furthermore, it works well in hybrid environments that include both legacy and newer data storage technologies.

    Keep in mind that this is a new way of looking at the data you're already using to support your operational model. But this approach allows you to predict that if you improve the Data Certification Scores for specific BDEs, profitability should increase by a defined amount over a given period of time. It takes the traditional accounting process of business valuation and applies it to a new kind of corporate asset: data. Until now, the value of data has essentially been a black box.
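
    As a sketch of that prediction step, a previously fitted relationship can be applied to a planned score improvement; the slope and scores below are assumed, not real results.

        # A sketch of forecasting a KPI lift from a planned DCS improvement.
        slope = 2.1   # $M of profit per point of DCS, from a prior regression
        current_dcs, target_dcs = 0.70, 0.85

        expected_lift = slope * (target_dcs - current_dcs)
        print(f"Expected profit lift: ${expected_lift:.2f}M over the period")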

    Best of all, this is not a theoretical valuation model. It’s practical and it works. It is currently helping organizations gain greater insight into which components of their data actually drive business value and which do not. It is providing reliable forecasts of ROI from data quality and data governance initiatives. The relationship between data quality and business metrics is too important to leave to guesswork. By implementing the process described here, you can begin to harness the power of data for your business.
