Metrics for Data Architecture Effectiveness

By Kelle O'Neal

Does your data architecture practice work? Has it had an impact on your organization? How do you know?

Some people may believe data architecture is “just good hygiene” and that it “has to be done” because it is too conceptual for the impact to be measured. I disagree. It can and should be measured.

Data architecture as a practice is one of the manifestations of data strategy and is key to modern organizations executing digital strategies and being data-driven. Your data architecture can represent and support your “data asset accounting” approach and there is a solid set of metrics at your disposal for measuring effectiveness, value and efficiency.

Data Architecture Practice Areas

Data architecture spans a broad scope of practice areas, which can include:

  • Data Environments
  • Data Layers
  • Data Movement
  • Data Stores
  • Data Integration
  • Data Standards
  • Data Model Standards
  • Conceptual Model Standards
  • Subject Area Models
  • Data Classification
  • Global Hierarchies
  • Business Views and Ontologies
  • Business Definitions and Other Metadata
  • Components and Services (including tools)

Measuring Data Architecture Capability

Implementing metrics to measure your data architecture capability across these practice areas is important because they help you:

  • Establish a baseline
  • Know where you’re going
  • Align expectations
  • Maintain relevance
  • Define success and value
  • Defend change and other requests
  • Create opportunities for feedback and engagement
  • Answer questions such as “why?” and “what have you done for me lately?”

The key is to measure what matters. Focusing on progress metrics (tracking activity and status of the program) alone can prove to be overwhelming and ineffective. It’s important to also leverage impact metrics (measuring the value that has been provided) that are aligned to critical business goals and objectives and link them with progress metrics.

To determine the value you want to provide, start by dissecting the issues to create your metrics. Begin with the business challenge, then create the measurements and metrics that address the business need. Business challenges can be very tactical: for example, closing the books at the end of the month is inefficient and time-consuming because the data models representing the multiple ERP systems don’t match. They can also be more aspirational, like improving the ability to target new customers, which requires customer data that is accessible, consistent and trusted when combined across data stores.

The point is to clarify the issue, what is meant by the issue, why that issue is important and what is the change you’d like to see, i.e., the goal. Many times, just by clarifying “what you mean” and “why you care,” you can come up with a way to track a change over time or measure the result.

Measurement is also iterative. The more you know, the more you can adjust your metrics and measurements to become more precise or to focus on different things that drive value. Instead of asking “How do I measure data lineage?” ask “What is the issue I’m trying to address?” Then you can outline the variables and inputs and ultimately arrive at what can be measured and the metrics you’d like to use.

Aligning Metrics to Standards and Guidance

Once you’ve established progress and impact metrics, use them to drive behavior: guiding principles articulate the desired behavior, and metrics show whether it is happening.

Data architecture often has specific guiding principles, such as:

  • Avoid unnecessary data replication
  • Protect performance of Tier-1 applications
  • Use a standard set of data tools

Ensure that this guidance is reinforced via metrics:

  • Number of data replications retired
  • Reduced data storage cost
  • Percentage of data movement via standard tool (versus point-to-point data movement or other non-standard tooling)
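These principle-aligned metrics can be computed from a simple inventory of data-movement jobs. A minimal sketch, where the job records and field names are hypothetical placeholders rather than a real schema:

```python
# Sketch: compute two of the principle-aligned metrics above from a
# (hypothetical) inventory of data-movement jobs.
jobs = [
    {"id": "j1", "tool": "standard_etl", "retired": False},
    {"id": "j2", "tool": "point_to_point", "retired": False},
    {"id": "j3", "tool": "standard_etl", "retired": True},   # replication retired
    {"id": "j4", "tool": "standard_etl", "retired": False},
]

# Number of data replications retired
replications_retired = sum(1 for j in jobs if j["retired"])

# Percentage of active data movement via the standard tool
active = [j for j in jobs if not j["retired"]]
pct_standard = 100 * sum(1 for j in active
                         if j["tool"] == "standard_etl") / len(active)
```

Tracking these numbers over time, rather than as one-off snapshots, is what ties the metric back to the guiding principle.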

Now let’s walk through how to calculate an efficiency metric or a productivity gain. Information quality and availability have a direct impact on productivity. The “dirtier” the data, the more effort is involved in manual data remediation. The less understood the data (its locations, values and meaning), the more time is spent finding the right data for the job, determining what is needed to get access to it and establishing whether it is in a format your group can consume.

Manual data remediation cost is the amount of time an employee spends remediating the data, plus the opportunity cost of the work they are not doing instead. Data access and retrieval cost is the employee effort, plus the opportunity cost, plus the potential system cost of creating that access.

Project costs are employee costs, opportunity costs and system costs, plus the cost of having your consultants sitting around waiting for data and for decisions about the data. Both remediation and access costs feed into project costs and project initiation costs.
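The cost build-up described above can be expressed as a small model. A minimal sketch, where every rate and hour figure is a hypothetical placeholder, not a benchmark:

```python
# Sketch of the cost build-up above. All figures are hypothetical.

def manual_remediation_cost(hours, hourly_rate, opportunity_rate):
    # Employee time spent fixing data, plus the value of work not done instead.
    return hours * (hourly_rate + opportunity_rate)

def data_access_cost(hours, hourly_rate, opportunity_rate, system_cost):
    # Employee effort + opportunity cost + system cost to create the access.
    return hours * (hourly_rate + opportunity_rate) + system_cost

def project_cost(employee_cost, opportunity_cost, system_cost,
                 consultant_rate, consultant_idle_hours):
    # Adds the cost of consultants waiting on data and data decisions.
    return (employee_cost + opportunity_cost + system_cost
            + consultant_rate * consultant_idle_hours)

# Example: 40 hours of remediation at a $75/hr rate with a $50/hr
# opportunity cost yields a $5,000 remediation cost.
remediation = manual_remediation_cost(40, 75, 50)
access = data_access_cost(10, 75, 50, 2000)
total = project_cost(20000, 8000, 5000, 150, 16)
```

Comparing these figures before and after an architecture improvement (for example, after consolidating the ERP data models) gives you the productivity-gain metric in dollar terms.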

Translating Data Value into Business Value

Metrics have no value if they aren’t aligned to the interests of a stakeholder, so keep these tips in mind:

  • Ensure there is some way of measuring how data-related improvements help stakeholders progress toward their goals.
  • Determine the critical information you need to track and measure to those goals.
  • Translate the value statement into the language of the recipient.

Communication is key to maintaining commitment. By leveraging essential metrics and communicating them broadly, the right metrics can ensure alignment and sustainability of your data architecture practice.
