Let’s consider one of the most important concepts in creating a holistic data strategy: the data governance plan. You might take it for granted that the data your organization uses is essentially accurate and fit-for-use. If that’s true, congratulations: your company is an exception to the rule.
Our experience in the data management business has shown that almost every company has data issues, sometimes minor, sometimes severe — but in almost all cases, they’re issues that go neglected until catastrophe strikes. The repercussions can range from unforced errors, such as inaccurate sales forecasts resulting from data redundancies, to public relations disasters caused by data breaches. If you know that you are circulating bad data through the system and want to clean it up fast, our Data Remediation & Cleanup services can help immediately.
In the long term, the key is to get out in front of data issues before you’re facing a crisis, by having an effective data governance plan in place. The best way to create and implement such a plan is to establish an Office of Data (led by a Chief Data Officer, or CDO). The Office represents a new type of function within your organization; its purview would include data governance, as well as metadata management, data quality, data architecture, and, of course, advanced analytics.
Here are eight common indicators that there may be trouble with your existing data governance plan.
Lots of investments in technology and staffing, but no tangible benefits. If you’re making investments in data management tools or capabilities (such as data profiling, matching, and “cleansing” tools, to name a few), but you’re neither improving your data nor reducing its redundancy, you’re also missing the opportunity to reduce your firm’s fixed and variable costs. You end up burning through capital with no ROI to show for it; even worse, you may actually create more data redundancy, thus increasing associated costs.
Lots of new roles being assigned, without a clear purpose. Companies often assign individuals to roles such as data steward or data custodian based only on their department, or some other almost random connection to the data they’re expected to manage. As a result, such individuals generally don’t have the foggiest idea what they’re supposed to be doing. At the same time, their managers typically don’t have clear metrics for evaluating their performance. Although assigning roles may create the sense that “things are getting done,” it actually just creates new costs without any measurable return.
No centralized master inventory of business data elements (BDEs). Without an inventory of the BDEs — the hundreds of specific types of data an organization uses and exactly where they live within its information systems — there’s no way to manage the associated metadata (that is, the “data about the data” that helps users find and use it more efficiently). More fundamentally, there’s no way of knowing exactly what data is being governed, nor how to identify and resolve redundant data elements.
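To make the idea concrete, a BDE inventory can be sketched in a few lines of code. The structure below is a minimal illustration, not a prescribed schema: the field names (source_system, steward, and so on) are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BusinessDataElement:
    """One entry in a master BDE inventory (illustrative fields only)."""
    name: str            # business name, e.g. "customer_email"
    definition: str      # agreed business definition
    source_system: str   # system where the element physically lives
    table: str           # physical table (or file) holding it
    column: str          # physical column/field name
    steward: str         # accountable data steward

# The centralized master inventory, keyed by normalized business name
inventory: dict[str, BusinessDataElement] = {}

def register(bde: BusinessDataElement) -> None:
    """Add a BDE to the inventory, flagging potential redundancy."""
    key = bde.name.lower()
    if key in inventory:
        existing = inventory[key]
        raise ValueError(
            f"Possible redundant element {bde.name!r}: "
            f"already registered in {existing.source_system}"
        )
    inventory[key] = bde

register(BusinessDataElement(
    name="customer_email",
    definition="Primary contact email for a customer account",
    source_system="CRM",
    table="customers",
    column="email",
    steward="Jane Doe",
))
```

Even a toy inventory like this makes the governance payoff visible: because every element records exactly where it lives and who is accountable for it, a second registration of the same business name is caught as a redundancy candidate rather than silently duplicated.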
No centralized master inventory of physical and logical data assets. If an organization’s IT function is not managing its databases, files, data stores, warehouses, Hadoop nodes/clusters, etc., it’s simply being negligent, and wasting resources (often at astronomical levels). More active management of these various data repositories and systems can greatly improve data access and quality, not to mention save significant money. When we see this problem, it can be a signal that IT may be protecting domains/kingdoms it established long ago — but which have outlived their usefulness.
No defined and managed metamodel owned by the enterprise data governance team. Without a metamodel, you don’t know what information to extract from the heads of the organization’s most knowledgeable experts about the data they consume, manage, support, or create. This leads to data governance initiatives that simply drag on, with an ever-decreasing likelihood of completion. It also prevents you from realizing the true value of data reusability, access, and leverage. (We should add that if you don’t have a data governance team, you probably have even bigger problems.)
Data standards being written at too high a level. When standards and strategy are developed in terms that are too general, they can be impossible to enforce — and at the same time, create zero value. Standards should be value-based and written at the level of a specific field or column in a database. If not, the organization has wasted its money. Even more dangerous, it may leave the organization’s data vulnerable to external threats if, for example, security standards are not made explicit. Tighter standards, captured in the form of metadata, can often help prevent data breaches or minimize their scope.
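What a field-level, metadata-driven standard might look like in practice can be sketched as follows. The field name, rule keys, and email pattern below are illustrative assumptions, not a real standard; the point is that a rule tied to one specific column is enforceable, while “all data must be high quality” is not.

```python
import re

# Illustrative sketch: a field-level data standard captured as metadata.
# Keys and the example field are assumptions chosen for demonstration.
standards = {
    "crm.customers.email": {
        "required": True,
        "pattern": r"^[^@\s]+@[^@\s]+\.[^@\s]+$",  # basic email shape
        "pii": True,  # flags the column for stricter access controls
    },
}

def validate(field: str, value) -> list[str]:
    """Check one value against its field-level standard; return violations."""
    rule = standards.get(field)
    if rule is None:
        return [f"No standard defined for {field}"]
    violations = []
    if rule.get("required") and value in (None, ""):
        violations.append("missing required value")
    elif rule.get("pattern") and not re.match(rule["pattern"], str(value)):
        violations.append("does not match required pattern")
    return violations

print(validate("crm.customers.email", "alice@example.com"))  # []
print(validate("crm.customers.email", "not-an-email"))
```

Because the standard lives in metadata rather than in prose, it can be enforced automatically at the point of capture — and a flag like pii can drive security controls explicitly instead of leaving them implied.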
Your “data strategy” is a hodgepodge of loosely-related projects. Having abstract data strategies strung together with little more than project descriptions listed in a PowerPoint deck and an expense account is almost worse than having no strategy at all. Such an approach may allow a chief data officer to crow for a few quarters about having a data strategy … but if the projects aren’t part of an overarching, systematic plan, by year two or three it will become clear to the CEO/CFO that nothing of value has been executed or delivered.
Program leaders speaking in dated consulting lingo. Be very worried if the people overseeing your data quality are using buzzwords from the ’80s and ’90s — “managing multiple threads” comes to mind. The reality is that a data governance plan is in no way linear, so older, “traditional” management models not only don’t work, but will very likely lead to incomplete results that fall short of promised benefits and ROI.
Of course, there are other clear signs that you need a better data governance plan — data breaches and major cost overruns are two examples — but as with many challenges in life, the most important step is recognizing that you have a problem in the first place.