Press Release

Cloudera Unveils AI Inference Service with Embedded NVIDIA NIM Microservices to Accelerate GenAI Development and Deployment 

Cloudera’s AI Inference service boosts LLM performance speeds by 36x using NVIDIA accelerated computing and NVIDIA NIM microservices, providing enhanced performance, robust security, and scalable flexibility for enterprises

Combined capability brings together companies’ differentiators in a single offering: Cloudera’s trusted data as the foundation for trusted AI with NVIDIA accelerated computing and the NVIDIA AI Enterprise software platform to deploy secure and performant AI applications privately on Cloudera

Cloudera, the only true hybrid platform for data, analytics, and AI, today launched Cloudera AI Inference powered by NVIDIA NIM microservices, part of the NVIDIA AI Enterprise platform. As one of the industry’s first AI inference services to provide embedded NIM microservice capability, Cloudera AI Inference uniquely streamlines the deployment and management of large-scale AI models, allowing enterprises to harness their data’s true potential to advance GenAI from pilot phases to full production.

Recent data from Deloitte reveals the biggest barriers to GenAI adoption for enterprises are compliance risks and governance concerns, yet adoption of GenAI is progressing at a rapid pace, with over two-thirds of organizations increasing their GenAI budgets in Q3 this year. To mitigate these concerns, businesses must turn to running AI models and applications privately – whether on premises or in public clouds. This shift requires secure and scalable solutions that avoid complex, do-it-yourself approaches.

Cloudera AI Inference protects sensitive data from leaking to non-private, vendor-hosted AI model services by providing secure development and deployment within enterprise control. Powered by NVIDIA technology, the service helps to build trusted data for trusted AI with high-performance speeds, enabling the efficient development of AI-driven chatbots, virtual assistants, and agentic applications impacting both productivity and new business growth.

The launch of Cloudera AI Inference comes on the heels of the company’s collaboration with NVIDIA, reinforcing Cloudera’s commitment to driving enterprise AI innovation at a critical moment, as industries navigate the complexities of digital transformation and AI integration.

Developers can build, customize, and deploy enterprise-grade LLMs with up to 36x faster performance using NVIDIA Tensor Core GPUs and nearly 4x the throughput of CPUs. The seamless user experience integrates UI and APIs directly with NVIDIA NIM microservice containers, eliminating the need for command-line interfaces (CLI) and separate monitoring systems. The service’s integration with Cloudera’s AI Model Registry also enhances security and governance by managing access controls for both model endpoints and operations. Users benefit from a unified platform where all models, whether LLM deployments or traditional models, are seamlessly managed under a single service.
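
As a rough illustration of what working with such an API-driven service might look like, the sketch below sends a chat request to a hypothetical model endpoint using the OpenAI-compatible protocol that NVIDIA NIM microservices expose; the base URL, model name, and token variable are placeholders for illustration, not actual Cloudera AI Inference values.

```python
# Minimal sketch of querying a deployed LLM endpoint over an OpenAI-compatible
# API (the protocol exposed by NVIDIA NIM microservices). The base URL, model
# name, and token variable are illustrative placeholders, not actual Cloudera
# AI Inference values.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://models.example.com/v1",        # hypothetical model endpoint
    api_key=os.environ["INFERENCE_API_TOKEN"],       # assumed auth token variable
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",              # example NIM-packaged model name
    messages=[{"role": "user", "content": "Summarize yesterday's support tickets."}],
    temperature=0.2,
)
print(response.choices[0].message.content)
```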

Additional key features of Cloudera AI Inference include:

  • Advanced AI Capabilities: Utilize NVIDIA NIM microservices to optimize open-source LLMs, including Llama and Mistral, for cutting-edge advancements in natural language processing (NLP), computer vision, and other AI domains.
  • Hybrid Cloud & Privacy: Run workloads on premises or in the cloud, with VPC deployments for enhanced security and regulatory compliance.
  • Scalability & Monitoring: Rely on auto-scaling, high availability (HA), and real-time performance tracking to detect and correct issues, and deliver efficient resource management.
  • Open APIs & CI/CD Integration: Access standards-compliant APIs for model deployment, management, and monitoring for seamless integration with CI/CD pipelines and MLOps workflows.
  • Enterprise Security: Enforce model access with Service Accounts, Access Control, Lineage, and Auditing features.
  • Risk-Managed Deployment: Conduct A/B testing and canary rollouts for controlled model updates (a conceptual sketch of this traffic-splitting pattern follows this list).
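
As a purely conceptual illustration of the canary pattern referenced above, the sketch below splits a small share of client traffic between a stable endpoint and a candidate one; the press release does not detail how Cloudera AI Inference implements rollouts, so the URLs, payload shape, and 10% split are assumptions made only for illustration.

```python
# Conceptual sketch of canary-style traffic splitting between a stable model
# endpoint and a candidate one. This is client-side logic for illustration only;
# the press release does not describe how Cloudera AI Inference implements
# rollouts. URLs, payload shape, and the 10% split are assumptions.
import random

import requests

STABLE_URL = "https://models.example.com/stable/v1/chat/completions"  # hypothetical
CANARY_URL = "https://models.example.com/canary/v1/chat/completions"  # hypothetical
CANARY_FRACTION = 0.10  # route roughly 10% of requests to the canary


def route_request(payload: dict, headers: dict) -> requests.Response:
    """Send a small, configurable share of traffic to the canary endpoint."""
    url = CANARY_URL if random.random() < CANARY_FRACTION else STABLE_URL
    return requests.post(url, json=payload, headers=headers, timeout=30)
```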

“As enterprises rapidly scale their AI capabilities, the need for trusted data and seamless integration becomes more critical than ever. With GenAI advancing rapidly in India, businesses are increasingly focused on balancing innovation with strong governance and compliance to fully harness its potential. Our partnerships with NVIDIA and integration with Snowflake exemplify Cloudera’s commitment to empowering organizations to innovate at scale in a secure manner,” said Mayank Baid, Regional Vice President, India & South Asia, Cloudera.

“By embedding NVIDIA’s NIM microservices into Cloudera AI Inference, we are offering unprecedented performance and flexibility for AI applications, while ensuring secure and efficient deployment of large-scale AI models to protect sensitive data. At the same time, extending our open data lakehouse interoperability to Snowflake enables enterprises to leverage an open, unified hybrid data lakehouse powered by Apache Iceberg. Together, these innovations accelerate the journey from data to insight, enabling businesses to drive meaningful outcomes with trusted, compliant AI solutions.”

“Enterprises are eager to invest in GenAI, but it requires not only scalable data but also secure, compliant, and well-governed data,” said industry analyst Sanjeev Mohan. “Productionizing AI at scale privately introduces complexity that DIY approaches struggle to address. Cloudera AI Inference bridges this gap by integrating advanced data management with NVIDIA’s AI expertise, unlocking data’s full potential while safeguarding it. With enterprise-grade security features like service accounts, access control, and audit, organizations can confidently protect their data and run workloads on premises or in the cloud, deploying AI models efficiently with the necessary flexibility and governance.”

“We are excited to collaborate with NVIDIA to bring Cloudera AI Inference to market, providing a single AI/ML platform that supports nearly all models and use cases so enterprises can both create powerful AI apps with our software and then run those performant AI apps in Cloudera as well,” said Dipto Chakravarty, Chief Product Officer at Cloudera. “With the integration of NVIDIA AI, which facilitates smarter decision-making through advanced performance, Cloudera is innovating on behalf of its customers by building trusted AI apps with trusted data at scale.”

“Enterprises today need to seamlessly integrate generative AI with their existing data infrastructure to drive business outcomes,” said Kari Briski, vice president of AI software, models and services at NVIDIA. “By incorporating NVIDIA NIM microservices into Cloudera’s AI Inference platform, we’re empowering developers to easily create trustworthy generative AI applications while fostering a self-sustaining AI data flywheel.”

These new capabilities will be unveiled at Cloudera’s premier AI and data conference, Cloudera EVOLVE NY, taking place Oct. 10. Click here to learn more about how these latest updates deepen Cloudera’s commitment to elevating enterprise data from pilot to production with GenAI.

 

Cloudera partners with Snowflake to unleash hybrid data management integration powered by Iceberg

Unveiled at EVOLVE24, the unified platform will reduce total cost of ownership and provide a single source of truth for all enterprise data

Cloudera, the only true hybrid platform for data, analytics, and AI, today announced an integration with Snowflake, the AI Data Cloud company, to bring enterprises an open, unified hybrid data lakehouse, powered by Apache Iceberg. Now, enterprises can leverage the combination of Cloudera and Snowflake—two best-of-breed tools for ingestion, processing and consumption of data—for a single source of truth across all data, analytics, and AI workloads.

Data is a business’s most powerful asset. It drives informed decision-making, provides a competitive advantage, and reveals opportunities for innovation. A 2022 study revealed 80% of businesses report higher revenue due to real-time data analytics, and 98% report an increase in positive customer sentiment due to leveraging data. However, to fully harness the power of data, businesses need a single, unified source of truth for storing, managing, and governing all enterprise data, regardless of where it resides.

Cloudera has extended its Open Data Lakehouse interoperability to Snowflake, allowing joint customers seamless access to Cloudera’s Data Lakehouse via its Apache Iceberg REST Catalog. Customers benefit from an optimized data platform powered by Apache Iceberg, which enables them to ingest, prepare, and process their data with best-in-class tools. Also, Snowflake users can now query data stored on Cloudera’s Ozone, an on-premises AWS S3-compatible object storage solution, directly from Snowflake. Customers now have access to all major form factors from one cohesive collaboration: on-premises, platform-as-a-service (PaaS), and software-as-a-service (SaaS).
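
For a sense of what catalog-level interoperability can look like in practice, the sketch below reads an Iceberg table through a generic REST catalog using the open-source PyIceberg client; the catalog URI, token, S3-compatible endpoint, and table name are hypothetical placeholders, not Snowflake’s or Cloudera’s actual configuration.

```python
# Sketch of reading an Iceberg table through a REST catalog with the open-source
# PyIceberg client, to illustrate catalog-level interoperability in general.
# The catalog URI, token, S3-compatible endpoint, and table name are hypothetical
# placeholders, not actual Cloudera or Snowflake configuration.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "lakehouse",
    **{
        "type": "rest",
        "uri": "https://lakehouse.example.com/iceberg",     # hypothetical REST Catalog URI
        "token": "YOUR_ACCESS_TOKEN",                       # assumed bearer token
        "s3.endpoint": "https://ozone.example.com:9878",    # S3-compatible object store endpoint
    },
)

table = catalog.load_table("sales.orders")   # hypothetical namespace.table
batch = table.scan(limit=100).to_arrow()     # small sample read; data stays in the lakehouse
print(batch.num_rows)
```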

In addition to enabling greater interoperability between the two systems, Cloudera customers will experience the ease of Snowflake’s Business Intelligence engine. The Snowflake engine can access data from Cloudera’s Open Data Lakehouse without requiring data duplication or transfer, reducing complexity, streamlining operations, and maintaining data integrity.

Moreover, this collaboration leads to a reduction in the total cost of ownership of the integrated stack for enterprises. The elimination of data and metadata silos, rationalization of data pipelines, and streamlining of operational efforts are key factors in this cost reduction. These improvements help deliver analytics and AI use cases at scale more efficiently, further enhancing the value proposition for businesses leveraging both Cloudera and Snowflake. This strategic integration not only optimizes analytic workflows but also provides a robust framework for enterprises to drive innovation and gain competitive advantages in their respective markets.

Additional benefits of this integration include:

  • Managed Iceberg Tables: Iceberg tables enhance data performance and reliability, allowing joint customers to unlock the full potential of their data through better organization, faster queries, and simplified data management, regardless of where the data is stored.
  • Best-of-Breed Engines: Joint customers benefit from top-tier engines to ingest, prepare, and manage their data, enabling seamless management of both artificial intelligence (AI) and business intelligence workloads.
  • Unified Security and Governance: This integration consolidates data security and governance across the entire data lifecycle. Joint customers can apply consistent security measures, track data origin and movement, and manage metadata within a single platform, whether on premises or in the cloud.

“By extending our open data lakehouse capabilities through Apache Iceberg to Snowflake, we’re enabling our customers to not only optimize their data workflows but also unlock new opportunities for innovation, efficiency, and growth,” said Abhas Ricky, Chief Strategy Officer of Cloudera. “This will help customers simplify their data architecture, minimize data pipelines, and reduce total cost of ownership of their data estate while reducing security risks. Together, Snowflake and Cloudera are bringing about the next era of data-driven decision making for every modern organization.”

“Apache Iceberg is a leading choice for customers who want open standards for data, and Cloudera has been an integral part of the Iceberg project,” said Tarik Dwiek, Head of Technology Alliances at Snowflake. “Our partnership expands what’s possible for customers who choose to standardize on Iceberg tables. We are excited to break down silos and deliver a unified hybrid data cloud experience with multi-function analytics to all of our customers.”

“Through this collaboration, customers gain access to a unified, robust data management platform that provides a single source of truth for all of their data, whether in the cloud or on-premises,” said Sanjeev Mohan, analyst at SanjMo. “This enables them to streamline and secure their data operations while efficiently analyzing and extracting insights across the entire data lifecycle – from ingestion to AI and analytics. It’s a strategic move from two industry giants to partner in a way that will deliver immediate value to businesses.”

In addition, reaffirming its commitment to advancing Iceberg adoption, Cloudera is excited to announce the technical preview of Cloudera Lakehouse Optimizer. This new service autonomously optimizes Iceberg tables, further reducing costs while significantly enhancing the performance of the Lakehouse. To learn more about this technical preview, click here.
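
To illustrate the kind of housekeeping the Lakehouse Optimizer is described as automating, the sketch below runs two standard Apache Iceberg maintenance procedures manually from Spark; the catalog and table names are placeholders, and this is not the optimizer’s own interface.

```python
# Sketch of the kind of Iceberg table maintenance Cloudera Lakehouse Optimizer
# is described as automating, shown as the standard Apache Iceberg procedures
# run manually from Spark. Assumes a Spark session already configured with the
# Iceberg runtime and a catalog; names below are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-maintenance").getOrCreate()

# Compact small data files into larger ones to speed up table scans.
spark.sql("CALL my_catalog.system.rewrite_data_files(table => 'sales.orders')")

# Expire snapshots older than the retention window to reclaim storage.
spark.sql(
    "CALL my_catalog.system.expire_snapshots("
    "table => 'sales.orders', older_than => TIMESTAMP '2024-01-01 00:00:00')"
)
```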

About Cloudera
Cloudera is the only true hybrid platform for data, analytics, and AI. With 100x more data under management than other cloud-only vendors, Cloudera empowers global enterprises to transform data of all types, on any public or private cloud, into valuable, trusted insights. Our open data lakehouse delivers scalable and secure data management with portable cloud-native analytics, enabling customers to bring GenAI models to their data while maintaining privacy and ensuring responsible, reliable AI deployments. The world’s largest brands in financial services, insurance, media, manufacturing, and government rely on Cloudera to use their data to solve what seemed impossible—today and in the future.