
IBM completes $11Bn Confluent acquisition to power AI with live data


NEW YORK and CALIFORNIA, UNITED STATES — IBM, a technology services provider, has finalized its $11 billion acquisition of Confluent, a data streaming platform used by 40% of the Fortune 500. The deal marks a strategic push to overcome a critical enterprise barrier: delivering clean, governed, real-time data to artificial intelligence (AI) models and agents.

The all-cash deal, priced at $31 per share, combines IBM’s hybrid cloud and AI portfolio with Confluent’s event streaming technology to create a unified platform that lets AI operations run on continuously refreshed data rather than on delayed, siloed information.

Addressing the data barrier to AI production

As enterprises move from AI testing to full-scale production, the challenge standing in their way is no longer the complexity of the models but the quality, management, and speed of the data available to those systems.

IDC predicts that over one billion new logical applications will appear by 2028, created by a new generation of AI that will not provide value unless the underlying data is live, trusted, and streaming. 

Meeting this need will require a radically new kind of data platform. By incorporating Confluent’s Apache Kafka-based streaming technology, IBM aims to offer enterprises a single managed environment where AI models and agents can operate with real-time context across all environments.

This would eliminate the manual relaying of data to those systems, a batch process in which information can take hours or even days to arrive and be processed. The sketch below illustrates the underlying event-streaming pattern.
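To ground that pattern, here is a minimal sketch of publish-and-subscribe event streaming written against the open-source confluent-kafka Python client. The broker address, topic name, consumer group, and payload are illustrative assumptions, not details of IBM’s or Confluent’s managed offering.

import json
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"  # assumed local broker for this sketch
TOPIC = "orders"           # hypothetical topic name

# Publish an event the instant the transaction happens...
producer = Producer({"bootstrap.servers": BROKER})
producer.produce(TOPIC, json.dumps({"order_id": 42, "amount": 99.5}).encode())
producer.flush()

# ...and consume it milliseconds later, e.g. to refresh an AI agent's context.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "ai-context-feeder",   # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    event = json.loads(msg.value())
    print(f"fresh context for the model: {event}")
consumer.close()

The same topic can fan out to many consumer groups, which is what lets a single stream of business events feed analytics, automation, and AI agents simultaneously.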

Rob Thomas, Senior Vice President of IBM Software and Chief Commercial Officer, said: “Transactions happen in milliseconds, and AI decisions need to happen just as fast.”

“With Confluent, we are giving clients the ability to move trusted data continuously across their entire operation so their AI models and agents can act on what is happening right now, not on data that is hours old.”

The strategy is intended to create a new operating model for businesses, one in which AI acts on live data to make real-time decisions and deliver value at scale.

The urgency is underscored by existing customer implementations, such as Michelin managing real-time inventory across a 170-country supply chain to achieve 35% cost savings, and BMW Group streaming IoT data from over 30 production sites to connect factory floor systems with cloud applications.

Immediate product synergies across IBM’s portfolio

With the acquisition closed, IBM is moving quickly to integrate Confluent’s event streaming capabilities across its core software offerings, creating direct synergies that improve AI readiness and hybrid cloud functionality.

One of the central integrations streams live operational events into watsonx.data, ensuring that the enterprise’s AI technologies (models, agents, and automated processes) operate on continuously fresh data with essential lineage, policy, and quality controls.

This addresses the fundamental need for AI to access current context rather than relying on yesterday’s static data, effectively turning enterprise data streams into actionable intelligence.
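As one hedged illustration of how streamed events could land in an open lakehouse table of the kind watsonx.data manages (it supports open table formats such as Apache Iceberg), the sketch below micro-batches Kafka events into an Iceberg table using the open-source confluent-kafka and pyiceberg clients. The broker, topic, catalog configuration, table name, and schema are assumptions for illustration, not IBM’s actual integration.

import json
import pyarrow as pa
from confluent_kafka import Consumer
from pyiceberg.catalog import load_catalog

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker
    "group.id": "lakehouse-sink",           # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])              # hypothetical topic

catalog = load_catalog("default")           # assumes a pre-configured catalog
table = catalog.load_table("sales.orders")  # hypothetical namespace.table

# Collect a small micro-batch; event fields are assumed to match the table schema.
batch = []
while len(batch) < 100:
    msg = consumer.poll(timeout=5.0)
    if msg is None:
        break
    if msg.error() is None:
        batch.append(json.loads(msg.value()))

if batch:
    # Append the micro-batch so downstream AI queries see fresh rows.
    table.append(pa.Table.from_pylist(batch))
consumer.close()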

Additionally, the move strengthens IBM’s capabilities in modernizing mainframe operations for the AI era by enabling IBM Z to detect and act on real-time events at the transaction source, streaming transactional data directly into analytics and AI workflows.

The combined platform also extends IBM’s event-driven automation across hybrid environments by integrating IBM MQ and webMethods Hybrid Integration with Confluent’s high-scale event streaming, allowing applications, APIs, and AI agents to sense and respond to business events instantly. 
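To illustrate the sense-and-respond pattern that event-driven automation implies, here is a minimal sketch in which a consumer dispatches each event to a handler the moment it arrives. The broker, topic, and business rule are assumptions, not the behavior of IBM MQ or webMethods.

import json
from confluent_kafka import Consumer

def handle(event: dict) -> None:
    # Hypothetical business rule: react immediately to high-value orders.
    if event.get("amount", 0) > 10_000:
        print(f"alerting on high-value order {event.get('order_id')}")

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker
    "group.id": "automation-worker",        # hypothetical consumer group
    "auto.offset.reset": "latest",          # react only to new events
})
consumer.subscribe(["orders"])              # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        handle(json.loads(msg.value()))     # respond as the event occurs
except KeyboardInterrupt:
    pass
finally:
    consumer.close()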

Sanjeev Mohan, Principal Analyst at SanjMo, notes: “The shift from AI experimentation to production deployment has exposed a critical gap in enterprise data architecture: the inability to deliver trusted, real-time data to the systems that need it most.”

“AI agents and automated workflows don’t operate on historical data; they require live operational signals, continuously flowing across the enterprise as events occur.”

With support from IBM Consulting and partners, the company is positioning this unified infrastructure as the foundational fabric through which AI agents can access the information they need, with the controls, governance, and real-time velocity required to operate safely and at enterprise scale.
