IBM: $11 Billion Acquisition Of Confluent Completed

By Amit Chowdhry • Today at 3:55 PM

IBM announced that it has completed its acquisition of Confluent, a data streaming platform used by more than 6,500 enterprises, in a deal valued at approximately $11 billion. The transaction, priced at $31 per share in cash, is aimed at strengthening IBM’s ability to deliver real-time, governed data for enterprise AI, agents, and automated workflows.

The combined platform is designed to address one of the biggest barriers to scaling AI in production environments: access to clean, continuously updated, and trusted data. IBM and Confluent are positioning their joint offering as a unified data foundation that enables AI systems to operate on live data across on-premises and hybrid cloud environments.

Confluent’s technology, built on Apache Kafka, enables organizations to stream data in real time across systems. IBM plans to integrate this capability across its portfolio, including watsonx.data, IBM MQ, webMethods Hybrid Integration, and IBM Z systems. These integrations are intended to support event-driven automation, real-time analytics, and AI-driven decision-making.
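At its core, the streaming model described above is publish/subscribe: producers emit events to named topics the moment they occur, and consumers react to each event continuously instead of waiting for a periodic batch job. A minimal, self-contained sketch of that pattern follows, using only Python's standard library rather than Kafka's actual client APIs; the `topic`, `producer`, and `consumer` names are illustrative, and a real Kafka deployment would involve brokers, partitioned topics, and a client library such as confluent-kafka.

```python
import queue
import threading

# Conceptual sketch of event streaming (data in motion), not Kafka itself.
# A queue stands in for a Kafka topic; a background thread stands in for
# a consumer that reacts to each event as it arrives.

topic = queue.Queue()  # stand-in for a Kafka topic

def producer(events):
    """Publish operational events to the topic as they occur."""
    for event in events:
        topic.put(event)
    topic.put(None)  # sentinel: signal end of stream

def consumer(results):
    """Consume events continuously, handling each one immediately."""
    while True:
        event = topic.get()
        if event is None:
            break
        # A downstream system (analytics, an AI agent, automation)
        # would act on the live event here.
        results.append(f"processed:{event}")

results = []
t = threading.Thread(target=consumer, args=(results,))
t.start()
producer(["order_created", "payment_received", "order_shipped"])
t.join()
print(results)
```

The design point the article makes maps onto this sketch: the consumer sees each event milliseconds after it is produced, which is the property that distinguishes streaming from the delayed, batch-processed pipelines mentioned later in the piece.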

The acquisition reflects a broader shift in enterprise technology, as organizations move from AI experimentation to operational deployment. Industry estimates suggest that more than one billion new applications will emerge by 2028, many of which will rely on real-time data flows rather than static datasets.

Confluent’s existing customer base includes major global enterprises across industries such as financial services, manufacturing, retail, and healthcare. Highlighted use cases include real-time supply chain management, inventory optimization, IoT data streaming, and large-scale machine learning operations.

IBM said the combined offering will enable organizations to break down data silos and create a continuous flow of operational data across systems. This capability is expected to help enterprises support AI agents and workflows that require immediate access to current data rather than delayed or batch-processed information.

The company also emphasized the role of its consulting and partner ecosystem in helping clients implement real-time data architectures. By combining governance for data at rest with streaming capabilities for data in motion, IBM aims to provide a comprehensive platform for enterprise AI deployment at scale.

KEY QUOTES:

“Transactions happen in milliseconds, and AI decisions need to happen just as fast. With Confluent, we are giving clients the ability to move trusted data continuously across their entire operation so their AI models and agents can act on what is happening right now, not on data that is hours old. Together, IBM and Confluent give enterprises the foundation for a new operating model – one where AI runs on live data, drives decisions in real time, and delivers value at scale.”

Rob Thomas, Senior Vice President, IBM Software and Chief Commercial Officer

“Since our founding, Confluent’s mission has been to set the world’s data in motion, making data streaming as foundational to the enterprise as the database. Joining IBM allows us to accelerate that mission at a much greater scale. IBM’s global reach and deep enterprise relationships will help us go further, faster. As enterprises move from experimenting with AI to running their business on it, helping data flow continuously across the business has never mattered more. I’m excited to see what we’ll build together.”

Jay Kreps, CEO and Co-founder of Confluent

“The shift from AI experimentation to production deployment has exposed a critical gap in enterprise data architecture: the inability to deliver trusted, real-time data to the systems that need it most. AI agents and automated workflows don’t operate on historical data; they require live operational signals, continuously flowing across the enterprise as events occur. IBM has made significant progress assembling a portfolio that addresses both sides of this equation: governance and infrastructure for data at rest, and a platform for data in motion. For enterprises whose architecture and priorities align with this approach, it is a compelling stack worth evaluating.”

Sanjeev Mohan, Principal Analyst, SanjMo