Four emerging data integration trends to assess

Data integration is becoming increasingly important to companies' ability to attract, serve, and retain customers. Enterprises face mounting data integration challenges, driven mainly by growing data volumes, compliance pressures, the need for real-time information, increasing data complexity, and data distributed across hybrid and multicloud environments. Business users want fast access to reliable, real-time information that helps them make better business decisions.

A modern data integration strategy is critical to support the new generation of data and analytics needs, including supporting real-time Customer 360, data intelligence and modern edge applications. Enterprise architecture, technical architecture, and delivery leaders should leverage new approaches to data integration—such as data virtualization, data mesh, artificial intelligence-enabled data integration, and data fabric—to make data and analytics even more effective.

Modern data integration technologies focus on advanced automation, connected data intelligence, and interactive, people-centric tools that help organizations accelerate a wide range of use cases and other data integration needs.

Distributed hybrid and multicloud data creates new integration challenges. Data lives everywhere, so centralizing it in data lakes or data hubs to support business insights is no longer practical, especially given the explosion of data at the edge. Forrester expects adoption of data integration systems to become more widespread in the coming years as organizations seek enabling insights across multicloud, hybrid cloud, and edge environments.

Artificial intelligence (AI) is driving the next level of data integration solutions. New and innovative AI capabilities help organizations automate data integration functions, including data ingestion, classification, processing, security, and transformation. While AI capabilities within data integration are still nascent, technology architecture and deployment leaders can already leverage them to discover connected data, classify and categorize sensitive data, identify duplicates, and orchestrate data across silos.

Real-time data integration has become critical to supporting modern demands. As the pace of business accelerates, real-time insights become critical, requiring organizations to focus on platforms that can deliver analytics quickly. Enterprises often cite real-time and near-real-time data support as one of their top data integration requirements, primarily to support modern customer experience initiatives.

Customer demand is shifting toward use-case-oriented data integration solutions. This new and emerging category provides streamlined, comprehensive, end-to-end data integration by automating ingestion, integration, security, and transformation for new and emerging business use cases such as Customer 360 and Internet of Things (IoT) analytics.

In mapping the future of technologies in the data integration ecosystem, Forrester identified Data as a Service, Data Mesh, Knowledge Graph, and Query Accelerator as four technologies that fall under the “experimental” category, meaning they have low maturity and low business value. Most companies should restrict these technologies to limited experimentation and wait for the anticipated business value of these newer categories to improve before investing.

Data as a Service

Data as a Service (DaaS), also known as Data as a Product (DaaP), provides a common data access layer through application programming interfaces (APIs), SQL, ODBC/JDBC, and other protocols, and leverages data platforms such as data virtualization, data mesh, integration platform as a service (iPaaS), and others. It is part of the new generation of advanced data integration technology that focuses on a common data access layer to accelerate different use cases.

DaaS/DaaP provides a data access layer to support queries, reports, data access, and built-in and custom applications. It offers several business benefits including supporting a common view of business and customer data using industry standard protocols.
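
As a minimal sketch of what this common access layer looks like from the consumer side, the snippet below shows one logical customer data product reached two ways: over a REST API and over SQL. The endpoint api.example.com, the customers table, and its columns are hypothetical, and sqlite3 simply stands in for whatever ODBC/JDBC-style driver a DaaS platform would actually provide.

```python
import json
import sqlite3
from urllib.request import urlopen

# Option 1: consume the access layer over a REST API.
# api.example.com is a hypothetical endpoint; real DaaS platforms expose
# their own URLs, authentication, and schemas.
def fetch_customers_via_api(region: str) -> list[dict]:
    with urlopen(f"https://api.example.com/customers?region={region}") as resp:
        return json.loads(resp.read())

# Option 2: consume the same logical data product over SQL.
# sqlite3 stands in for the ODBC/JDBC-style driver a DaaS layer would provide.
def fetch_customers_via_sql(conn: sqlite3.Connection, region: str) -> list[tuple]:
    cur = conn.execute(
        "SELECT customer_id, name, region FROM customers WHERE region = ?",
        (region,),
    )
    return cur.fetchall()
```

Either path returns the same business view of the customer, which is the point of a common access layer.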

Forrester expects DaaS/DaaP to continue to grow in the coming years as the demand for trusted and real-time data across all applications grows. We’ll likely see more innovation in real-time updates, integration, and self-service capabilities.

Data Mesh

A data mesh offers the opportunity to optimize mixed workloads by matching processing engines and data flows with the right use cases. It interfaces with event-driven architectures and enables edge use cases to be supported.

A data mesh provides an architecture that enables a communication layer between applications, machines, and people. It aligns data, queries, and models with the solution so that everyone involved, human and machine, stays in sync and speaks the same language.

It enables developers, data engineers and architects to become more productive and accelerate various business use cases.

Data mesh technology is still in its infancy. A data mesh leverages a service mesh for data, a publish/subscribe (pub/sub) model for the edge, and on-board and local storage and compute to support a cloud-native architecture. We will likely see data mesh evolve into a platform over the long term.
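
As an illustration of the pub/sub pattern mentioned above, here is a minimal in-process sketch; a production data mesh would rely on a distributed event backbone rather than a Python dictionary of callbacks, and the topic name and payload below are invented for the example.

```python
from collections import defaultdict
from typing import Callable

class PubSubBroker:
    """Toy in-process publish/subscribe broker."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan the event out to every consumer registered on the topic.
        for handler in self._subscribers[topic]:
            handler(event)

broker = PubSubBroker()
# An edge device publishes sensor readings; an analytics consumer subscribes
# and reacts to each event as it arrives.
broker.subscribe("edge.sensor.temperature", lambda e: print("analytics received", e))
broker.publish("edge.sensor.temperature", {"device": "sensor-42", "celsius": 21.7})
```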

Knowledge Graph

A knowledge graph uses graph engines to support complex data connections and integrations. It helps build recommendation engines, cleanse data, perform predictive analytics, and connect data quickly. Developers, data engineers, and data architects can quickly work through cluttered, disjointed data to accelerate application development and uncover new business insights.

It uses a graph data model to store, process, and integrate connected data, building a knowledge base that can answer complex questions and deliver modern insights.

A knowledge graph accelerates analytics and insights that need connected data for applications, insights, or analytics. It also improves the productivity of developers, data engineers, architects, and data analysts.

As a data integration technology, knowledge graphs continue to evolve with support for automation, embedded AI/machine learning, and self-service capabilities. A knowledge graph uses a graph model, a data catalog, and domain-specific ontologies to provide a knowledge base.
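
A minimal sketch of the graph data model behind a knowledge graph: facts are stored as subject-predicate-object triples, and a multi-hop question is answered by traversing the connections. The entities and relations here are invented purely for illustration.

```python
# Toy triple store: each fact is a (subject, predicate, object) tuple.
triples = [
    ("acme_corp", "headquartered_in", "berlin"),
    ("jane_doe",  "works_for",        "acme_corp"),
    ("jane_doe",  "purchased",        "product_x"),
    ("product_x", "manufactured_by",  "acme_corp"),
]

def objects(subject: str, predicate: str) -> list[str]:
    """Follow one type of edge out of a node."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Multi-hop question: "Where is the employer of the customer who bought product_x based?"
buyers    = [s for s, p, o in triples if p == "purchased" and o == "product_x"]
employers = [e for b in buyers for e in objects(b, "works_for")]
cities    = [c for e in employers for c in objects(e, "headquartered_in")]
print(cities)  # ['berlin']
```

A real knowledge graph layers a data catalog and domain ontologies on top of this basic model so that the same kind of traversal can be expressed declaratively rather than hand-coded.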

Query Accelerator

The query accelerator market has gained traction by helping developers and data engineers quickly optimize queries and bring computation closer to the data, thereby minimizing data movement. This technology is useful when data is stored in data lakes, object stores, or complex data warehouses, where optimizing queries is often difficult.

Unlike data virtualization systems, query accelerators speed up queries through an enhanced query optimizer, bringing computational power closer to the data and retrieving only selected data from sources such as distributed databases, data warehouses, data lakes, object stores, and files.
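
The sketch below illustrates that pushdown idea in plain Python, with an in-memory sqlite3 database standing in for a remote data lake or warehouse; the events table and its filter are invented for the example. The naive pattern pulls every row to the client before filtering, while the accelerated pattern runs the predicate and aggregation next to the data and moves only the result.

```python
import sqlite3

# sqlite3 stands in for a remote data lake or warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("emea", 10.0), ("amer", 25.0), ("emea", 5.0)] * 1000)

# Naive pattern: pull every row across the wire, then filter in the client.
all_rows  = conn.execute("SELECT region, amount FROM events").fetchall()
emea_rows = [row for row in all_rows if row[0] == "emea"]

# Pushdown pattern (what a query accelerator automates): the predicate and the
# aggregation execute next to the data, so only the final result moves.
emea_total = conn.execute(
    "SELECT SUM(amount) FROM events WHERE region = ?", ("emea",)
).fetchone()[0]

print(len(emea_rows), emea_total)  # 2000 15000.0
```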

A query accelerator helps companies speed up analysis and data searches through simplified queries that can be run by business analysts, business users, and IT organizations.

Forrester anticipates that query accelerators will continue to evolve over the coming years, incorporating improved AI/machine learning and data intelligence, delivering higher query performance and scalability with fewer compute resources, and further automating the integration of distributed data.


This is an excerpt from The Forrester Tech Tide: Enterprise Data Integration, Q4 2021. Noel Yuhanna is a principal analyst and vice president at Forrester.
