Data management is one of the most important functions of IT. It helps ensure the organization's data is accurate, coherent, secure, and accessible to the users who need it, and it can enhance decision-making, efficiency, and compliance with data privacy regulations. Indeed, without data management, enterprises that strive to be digital businesses might lack a reliable foundation for success.

That said, like most areas of business, data management strategies, techniques, and technologies are in constant flux. Here's a look at the current trends in data management, including what's in and what's out.

In: Real-time data mastering to support ongoing operations

Data provided to users needs to be fresh to be useful. This is especially true in sectors subject to rapid change.

"When we're matching clinicians to open roles, data can't be stale," says Taner Maia, senior product manager at CHG Healthcare, a provider of healthcare staffing services. "Provider licenses change, credentials get updated, and availability shifts constantly. We're seeing a big move toward real-time architectures that surface the most current data to the teams who need it."

Before CHG modernized its approach to data management, different divisions maintained their own systems, and staff often had to ask providers for the same information multiple times. "Records were inconsistent, and identifying duplicates across systems was a tedious, manual process," Maia says.

Today, real-time access to unified provider data, through a solution from Tamr, has reduced duplicate record creation, improved the sourcing of new providers, and made it easier for teams to see the full context of a provider before engaging with them, Maia says. "It's become a requirement for how we operate and helps us match the best provider to each client's needs," he says.

In: Data as a product

Data as a product is an approach in which enterprises treat data as a shareable, valuable product with clear ownership and documentation. Rather than being treated as raw material, data is viewed as a reusable, accessible asset that can solve business problems.

"Treating data as a product puts everybody in the workflow, working within the same ecosystem, and it's one of the key things if you want to scale analytics," says Roman Rylko, CTO at software development and consulting firm Pynest.

"Our teams were used to maintaining separate copies for research and production, but it gets you trapped really soon," Rylko says. "Then you're running around trying to consolidate warehouses and feature stores, hoping you get in time to get the software out. The duplication cost us money, so for most of the projects we just moved to a single, governed lakehouse with a unified catalog."

Now users pull whatever data they need, in whatever format, and can share it across teams. "We have fewer fires to put out; our analysts move faster," Rylko says.

In: Data quality as a strategic asset

Enterprises continue to gather enormous volumes of data, and some of that data is inevitably inaccurate, out of date, duplicated, inconsistent, or irrelevant. Bad data can lead to poor decision-making, inferior customer service and support, and even lost revenue.

While companies have been using tools such as data cleansing for years, they might not have treated data quality as a strategic asset that must be constantly maintained.

"Bad data undermines everything," says William McKnight, president of McKnight Consulting Group. "AI and analytics only work well if the underlying data is accurate, relevant, and well-curated. Targeted efforts on high-impact datasets yield faster, more reliable results."

Enterprises need to focus on cleansing, validating, enriching, and monitoring critical datasets, making small, iterative improvements rather than trying to fix everything at once, McKnight says.
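McKnight's advice about targeted, iterative quality work lends itself to a concrete illustration. The Python sketch below shows what a minimal set of automated checks on one high-impact dataset might look like; the dataset, column names, and thresholds are hypothetical assumptions, and a real deployment would more likely use a dedicated data quality framework than hand-rolled checks.

```python
# A minimal, hypothetical sketch of targeted data quality checks on one
# high-impact dataset (a provider roster), in the spirit of McKnight's
# advice: validate and monitor critical data iteratively, not all at once.
import pandas as pd

def check_provider_roster(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable quality issues found in the dataset."""
    issues = []

    # Completeness: key identity fields must be populated.
    for col in ("provider_id", "license_number", "license_expiry"):
        missing = df[col].isna().sum()
        if missing:
            issues.append(f"{missing} rows missing {col}")

    # Uniqueness: duplicate provider records undermine matching.
    dupes = df.duplicated(subset=["provider_id"]).sum()
    if dupes:
        issues.append(f"{dupes} duplicate provider_id rows")

    # Freshness: stale records are flagged rather than silently served.
    age = pd.Timestamp.now() - pd.to_datetime(df["last_verified"])
    stale = (age > pd.Timedelta(days=30)).sum()
    if stale:
        issues.append(f"{stale} records not verified in the last 30 days")

    return issues

# Usage: run the checks on every load and alert instead of publishing bad data.
# issues = check_provider_roster(pd.read_parquet("provider_roster.parquet"))
```

The point is the scope, not the mechanics: a handful of checks on one critical dataset, expanded iteratively, rather than an attempt to validate everything at once.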
In: Data lakehouses

The data lakehouse, a data architecture that combines the flexibility and efficiency of a data lake with the management and performance of a data warehouse, is on the rise. Enterprises can use data lakehouses to store and analyze various types of data, including structured, semi-structured, and unstructured data.

The global data lakehouse market was estimated at $11.35 billion in 2024 and is projected to reach $74.0 billion by 2033, according to a report from Grand View Research. The market's growth has been driven by rising demand for unified data platforms that combine scalability with structure and performance to support advanced analytics and AI workloads, the report says.

"Data approaches that make the picture simpler and more transparent are currently winning," Pynest's Rylko says. "Instead of multiple disparate data warehouses, companies are moving to a clear lakehouse architecture. Teams have clear data contracts, linear catalogs, and automated quality checks and monitoring."

In: Governance that supports AI use cases

Data governance has been "in" for years. But in addition to ensuring data quality, security, privacy, and other data-related essentials, governance now takes on the role of ensuring AI outputs can be trusted.

"Good governance is no longer just about compliance — it's about enabling AI to generate trustworthy insights," McKnight says. "Structured data, clear ownership, and transparent lineage build confidence in results."

Technology leaders need to focus on metadata management, data stewardship, lineage tracking, and clearly defined roles for managing AI-ready data, McKnight says.

Leading enterprises are implementing structured knowledge frameworks that encode business rules, product relationships, and compliance requirements into a semantic layer, says Sanjeev Mohan, principal at advisory firm SanjMo. "These guardrails enable autonomous operation while preventing costly errors," he says. "CIOs who adopt this will report fewer AI mistakes requiring human intervention."
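To make the ownership, lineage, and guardrail ideas concrete, here is a minimal, hypothetical sketch of the metadata a governed, AI-ready catalog entry might carry. The field names, dataset names, and rules are illustrative assumptions, not the schema of any particular catalog product.

```python
# A hypothetical sketch of governance metadata for one AI-ready dataset:
# explicit ownership, upstream lineage, and encoded business rules that
# downstream AI pipelines can check before consuming the data.
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    name: str
    owner: str                                          # clearly defined steward
    upstream: list[str] = field(default_factory=list)   # lineage: where it came from
    rules: list[str] = field(default_factory=list)      # business/compliance rules

catalog = {
    "gold.provider_roster": DatasetEntry(
        name="gold.provider_roster",
        owner="data-stewardship@company.example",
        upstream=["silver.credentials", "silver.licenses"],
        rules=["license_expiry > today", "one row per provider_id"],
    )
}

def lineage(name: str, depth: int = 0) -> None:
    """Walk and print the upstream lineage of a dataset for audit or debugging."""
    entry = catalog.get(name)
    print("  " * depth + name)
    if entry:
        for parent in entry.upstream:
            lineage(parent, depth + 1)

lineage("gold.provider_roster")
```

Even a skeleton like this captures the essentials McKnight names: every dataset has a named owner, a traceable origin, and rules a machine can verify before an AI workload trusts the data.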
Out: Mass AI deployment without prioritization

There are any number of reasons why it's not a good idea to deploy AI broadly and quickly within an organization.

For one thing, a mass rollout of AI without careful thought can raise significant ethical and practical concerns, including a greater likelihood of unintentional bias and discrimination, the risk of taking humans out of decision-making too early in the process, and operational failures.

There are also data security and privacy risks, as well as the lack of sufficient skills to achieve intended goals with AI. And of course, there's the problem of using AI where it isn't really needed or doesn't belong. All of this can result in lots of projects that don't deliver value or fail completely.

"Flooding AI with all enterprise data wastes resources and reduces trust," McKnight says. "Selective, high-quality datasets produce better outcomes."

Out: Rigid, monolithic platforms

The ability to adapt quickly to change is essential in today's data management environment. The rise of AI has made agility an even more important trait than in the past.

"Stacks that can't adapt quickly to new AI models and frameworks become obsolete fast," McKnight says. "Flexibility is essential. AI models and tools evolve rapidly. Data platforms must be able to plug into multiple AI frameworks without being locked into one vendor or rigid architecture."

Centralized data warehouses driven by the goal to consolidate data "are no longer a dominant trend, being replaced by more hybrid, platform-oriented approaches — such as data fabrics, lakehouses, and edge processing — that have a clear connection to [return on investment] and business use cases," says Orla Daly, CIO at Skillsoft, a provider of technology training services and products.

"At an organizational level, this is driving a move towards hybrid operating models with some key responsibilities, such as governance, remaining centralized," Daly says. "This shift supports real-time analytics and fit-for-use architecture to support AI-driven workloads, while maintaining governance."

Out: Late data cleanups

Rather than handling data quality issues late in the process, enterprises are modernizing their data strategies and shifting toward performing data cleansing earlier to increase efficiency.

"Before we modernized our approach, a lot of our data quality work happened late in the process, after records had already been created and used across different teams," CHG's Maia says. "With fragmented systems and inconsistent provider information, issues often had to be fixed manually and usually by the same people." This created delays and extra steps for teams that were trying to move quickly.

"What we've seen is that this kind of after-the-fact cleanup doesn't scale," Maia says. "You get better outcomes when data issues can be caught earlier and resolved closer to where the data is created or used."

What might help is AI-powered data quality monitoring, which is on the rise. "Poor data quality used to slow projects," says Kelly Raskovich, senior manager and lead within Deloitte's Office of the CTO. "Now it compounds through AI systems, creating cascading errors."

As agents work, they generate data about their decisions and outcomes, Raskovich says. "This 'digital exhaust' from the silicon workforce becomes valuable for continuous improvement," she says. "Instead of quarterly audits, organizations are using AI to monitor data quality and capture these insights in real time."
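Maia's "catch it earlier" principle is easy to show in code. Below is a minimal, hypothetical sketch of validation at the point of record creation, so a malformed record is rejected or routed for review before it ever spreads across systems; the record fields and rules are illustrative assumptions.

```python
# A hypothetical sketch of shifting data cleansing left: validate a record
# where it is created, instead of cleaning it up after other teams use it.
from datetime import date

REQUIRED_FIELDS = ("provider_id", "license_number", "license_expiry")

def validate_at_ingestion(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record may be saved."""
    problems = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    expiry = record.get("license_expiry")
    if isinstance(expiry, date) and expiry < date.today():
        problems.append("license already expired")
    return problems

record = {"provider_id": "P-1042", "license_number": None,
          "license_expiry": date(2026, 6, 30)}
problems = validate_at_ingestion(record)
if problems:
    # Reject or route for review at the source, rather than cleaning up later.
    print("rejected:", problems)
else:
    print("accepted")
```

The same checks could just as well run as an automated gate in a pipeline or API; the design point is where they run, at the source, not how they are implemented.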
Out: DIY master data management

It might be tempting for enterprises to look for savings by going the do-it-yourself (DIY) route for master data management. But this can lead to problems down the road.

"We previously tried building our own data mastering solution at CHG," Maia says. "It seemed reasonable at the time, but it quickly became clear that it wasn't going to scale with the volume and complexity of our provider data. In addition, the skill sets required to maintain and evolve it were too specialized, and the operational cost was high."

For organizations dealing with fast-changing, business-critical data, DIY approaches are becoming harder to justify, Maia says. "Modern data environments evolve too quickly for internal builds to keep pace, both in terms of scale and cost."

Out: Pre-AI systems and practices

Many of the data management systems now in place were likely deployed before AI had a major role in enterprise IT. That means it might be time for an update.

"A number of legacy practices are rapidly becoming obsolete, and enterprises that continue operating with human-first, governance-lite systems will struggle to scale AI beyond prototypes," says Larissa Schneider, COO and co-founder of Unframe AI, which leverages large language models to create software products.

"Batch-only pipelines are simply too slow for AI-driven decisioning, which requires continuous, real-time context," Schneider says. "Rip-and-replace modernization projects are also falling away, as enterprises no longer accept multi-year migrations. Tool sprawl without intelligence is reaching its end, with organizations consolidating around fewer platforms that embed AI directly into operations."
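As a closing illustration of Schneider's batch-versus-real-time point, here is a minimal, hypothetical sketch contrasting a nightly batch refresh with an event-driven update that changes context the moment new information arrives. The event shape and handlers are illustrative assumptions, not any specific streaming platform's API.

```python
# A hypothetical sketch of the shift Schneider describes: from batch-only
# refreshes to continuous, event-driven context updates for AI decisioning.

context: dict[str, dict] = {}  # the "current state" an AI system consults

def nightly_batch_refresh(snapshot: list[dict]) -> None:
    """Old pattern: context is only as fresh as the last batch run."""
    context.clear()
    context.update({row["provider_id"]: row for row in snapshot})

def on_change_event(event: dict) -> None:
    """New pattern: apply each change as it happens, so context never goes stale."""
    context.setdefault(event["provider_id"], {}).update(event["changes"])

# With events, a credential update is visible immediately, not tomorrow morning.
on_change_event({"provider_id": "P-1042",
                 "changes": {"license_status": "renewed"}})
print(context["P-1042"])
```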