Incremental Data Maturity: An Evolutionary Framework for Steering Data and AI Investments

Jan 28, 2025

By Ricardo Mendes

Navigating the Data and AI landscape is challenging. With a vast array of tools, vendors, and knowledge areas, long-term investments are rare in today’s rapidly evolving world, leaving companies struggling to become truly data-driven. This is where an incremental approach to leveraging Data and AI makes all the difference. In this article, we share insights into a mindset empirically developed over six years of consulting for US and Brazilian companies.

What if we could start small and improve on demand? Sounds reasonable, doesn't it? The following paragraphs provide insights on how to achieve this, introducing a customizable framework that can be adapted to different enterprise needs.

Initial Assessment

The Cambridge Dictionary defines "incremental" as something that happens gradually, in a series of small amounts. The first step is determining what can be incrementally improved over time—let's call these the pillars. It's important to note that the framework implementation should be led by a group of individuals committed, or at least willing, to improve the company's Data and AI standards.

From experience, Data and AI teams recurrently discuss the following topics: Strategy, Storage, Governance, Analytics, and Engineering. These will serve as the five pillars for this conversation, but you may choose to use more, fewer, or entirely different ones when adopting the framework. Start by identifying the primary Data and AI-related topics discussed at your company, but avoid an extensive list initially — you can expand it later.

The second step is to assess the company's maturity level for each pillar. A simple online survey with an appropriate audience can be effective. By "appropriate audience," we mean individuals connected to Data and AI initiatives. Including a variety of roles and levels is essential to avoid bias and ensure reliable results.

What might such a survey look like? The questions should be directly related to the pillars, with answers that provide measurable insights. Use the following form as a reference:

Image 1. Sample Data Maturity Survey Form — Image by CI&T

Determine thresholds to discard outliers from the responses before considering any item as checked. For example, require that at least 30% of the responses be TRUE for a given item (e.g., Data Catalog) to be considered available at the company.
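The threshold rule can be sketched in a few lines of Python. This is a minimal illustration, assuming the survey data arrives as a mapping from item name to a list of TRUE/FALSE answers; the item names and the 30% cutoff follow the article's example.

```python
# Decide whether each surveyed item counts as "available" at the company.
# Rule from the article: an item is available only if at least 30% of the
# respondents answered TRUE for it; items below the cutoff are discarded.

THRESHOLD = 0.30  # minimum share of TRUE answers


def available_items(responses: dict[str, list[bool]],
                    threshold: float = THRESHOLD) -> set[str]:
    """Return the items whose ratio of TRUE answers meets the threshold."""
    result = set()
    for item, answers in responses.items():
        if answers and sum(answers) / len(answers) >= threshold:
            result.add(item)
    return result


survey = {
    "Data Catalog": [True, True, False, False, False],   # 40% TRUE -> kept
    "Data Lineage": [True, False, False, False, False],  # 20% TRUE -> discarded
}
print(available_items(survey))  # {'Data Catalog'}
```

The threshold is deliberately a parameter: each company can tune how many confirming answers it takes before an item counts toward a pillar's maturity level.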

Next, evaluate the responses and the number of items available for each pillar. The result can be visualized using a radar chart, effectively highlighting the company's strengths and weaknesses.

Image 2. Initial Data Maturity Assessment — Image by CI&T
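Turning checked items into the numbers plotted on the radar chart is a simple aggregation. The sketch below assumes each pillar's level is the count of its items found available; the pillar names follow the article, while the items listed under each pillar are illustrative assumptions, not the article's exact questionnaire.

```python
# Aggregate the checked items into a per-pillar maturity level — the values
# plotted on the radar chart. Item-to-pillar mapping is a hypothetical example.

PILLAR_ITEMS = {
    "Strategy":    ["Data Strategy Document", "Steering Committee"],
    "Storage":     ["Data Lake", "Data Warehouse"],
    "Governance":  ["Data Catalog", "Data Ownership", "Data Quality"],
    "Analytics":   ["Self-service BI", "Dashboards"],
    "Engineering": ["Orchestration", "CI / CD", "Infrastructure as Code"],
}


def maturity_levels(checked: set[str]) -> dict[str, int]:
    """Level per pillar = how many of its items the assessment found available."""
    return {pillar: sum(1 for item in items if item in checked)
            for pillar, items in PILLAR_ITEMS.items()}


checked = {"Data Catalog", "Data Warehouse", "Orchestration", "Dashboards"}
print(maturity_levels(checked))
# {'Strategy': 0, 'Storage': 1, 'Governance': 1, 'Analytics': 1, 'Engineering': 1}
```

Feeding these five values to any polar/radar plotting tool (e.g., matplotlib's polar projection) reproduces a chart like the one above.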

Cycle 1

The group responsible for enhancing the company’s Data and AI standards will then review the results internally and collaborate with other stakeholders. Together, they will prioritize areas for improvement to advance to the next levels.

The image below illustrates the outcome of such a prioritization discussion. It indicates that the company will focus on Data Governance and Data Engineering during Cycle 1 (the duration of the cycle, such as a quarter, can be tailored to fit each company’s needs).

Image 3. Planning the first iteration cycle — Image by CI&T

On the Governance side, the initial assessment showed the company lacks Data Ownership and Quality capabilities. Since the goal is to increase Governance from level 1 to 2, the recommendation is to prioritize and address a single topic during this cycle.

Regarding Data Engineering, the assessment shows no Infrastructure as Code in place. Introducing this capability should be the focus to address this gap.

At the end of the cycle, the radar chart should look like this:

Image 4. Finishing the first iteration cycle — Image by CI&T

Cycle 2

Notice that Data Engineering reached level 3. The main items are in place in our fictitious scenario, but given its importance to the company, the group may decide to break it down further and dive into more details during Cycle 2. A second survey targeting Data Engineers can help clarify how they perceive the adoption of the engineering practices, as illustrated below.

Image 5. Sample Data Engineering Maturity Survey Form — Image by CI&T

The possible answers to this survey will likely differ from those in the initial assessment. This is because, rather than counting how many items are available under each pillar, we aim to understand how specific individuals (Data Engineers, in this case) perceive each item. Instead of True or False, values such as Not at all (0), POC / Pilot stage (1), Yes, but needs improvement (2), and Yes, it's good enough (3) are more appropriate.
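Scoring this follow-up survey is a matter of mapping each answer to its numeric value and averaging per practice. A minimal sketch, using the answer wording from the article; the practice names and sample responses are illustrative.

```python
# Map the Likert-style answers of the follow-up survey to scores (0-3)
# and average them per practice, as perceived by the surveyed engineers.

SCALE = {
    "Not at all": 0,
    "POC / Pilot stage": 1,
    "Yes, but needs improvement": 2,
    "Yes, it's good enough": 3,
}


def average_scores(responses: dict[str, list[str]]) -> dict[str, float]:
    """Average perceived maturity (0-3) for each surveyed practice."""
    return {practice: sum(SCALE[answer] for answer in answers) / len(answers)
            for practice, answers in responses.items()}


engineering_survey = {
    "Coding Standards": ["Yes, but needs improvement", "Yes, it's good enough"],
    "CI / CD": ["POC / Pilot stage", "POC / Pilot stage"],
}
print(average_scores(engineering_survey))
# {'Coding Standards': 2.5, 'CI / CD': 1.0}
```

Rounding or thresholding these averages gives the per-item levels the leadership group can plan the next cycle against.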

Using the survey results, the framework implementation leadership group can start planning the second cycle: improving Coding Standards from Yes, but needs improvement (2) to Yes, it's good enough (3), and CI / CD from POC / Pilot stage (1) to Yes, but needs improvement (2). The same applies to IaC.

Additionally, they recognized it's time to formalize a Data Strategy Steering Committee to sustain the changes over the long term, thereby moving Data Strategy from level 1 to 2.

Image 6. Planning the second iteration cycle — Image by CI&T

By the end of Cycle 2, the maturity levels should be as shown below:

Image 7. Finishing the second iteration cycle — Image by CI&T

Cycle 3

With suitable Strategy and Engineering practices in place, the Steering Committee agrees to improve the Data Governance standards in Cycle 3. As in the previous cycle, they will take the opportunity to split the pillar into finer-grained items, making the intent clearer and better aligned with the team's needs.

Image 8. Planning the third iteration cycle — Image by CI&T

Cycle N

We hope the previous sections were detailed enough to explain the proposed framework's incremental nature. N subsequent cycles will follow this process, gradually improving the maturity level.

Beyond splitting the pillars initially considered, the Steering Committee can add new ones to the radar. Given its current momentum, Artificial Intelligence is a good choice once the company has established fundamental data capabilities. How can data teams leverage AI tools to boost productivity? Should they build their own agents or adopt third-party tools? Is the company's data suitable for training AI models? These questions demand thoughtful discussion, and the incremental approach outlined in this work can help address them effectively.

As new data points are added to the chart, a question will inevitably arise: How can the increasing complexity of visualization be managed? There will be too many items, differing value scales, and other challenges.

To address this, the radar will need to be split at some point. Image 9 illustrates one way to accommodate the new AI pillar while introducing dedicated radars for Governance and Engineering when planning a future iteration. Note that items previously belonging to the split pillars have been removed from the main chart, simplifying its visualization. Their data now appear on the right, forming a dedicated dashboard.

Image 9. The Data Maturity Assessment Dashboard — Image by CI&T

Closing Thoughts

As you can see, complexity grows as the Steering Committee becomes more familiar with the framework and learns to address the needs of each area more effectively. This progression helps companies thrive in their Data and AI journeys. Many other questions will undoubtedly arise during the process, such as:

When should we reassess the pillars and items? After splitting a pillar? When finishing a cycle?

Can metrics other than survey results (e.g., time to insight and cost reduction) be used to evaluate the framework's results?

With experience and strategic support, the committee will be well-equipped to answer these questions and customize the framework to manage changes and improvements effectively. To find out how to further maximize the value of your data, get in touch here.


Ricardo Mendes

Principal Data Consultant