Coredge IO Unveils Affordable Inference Platform Powered By Qualcomm at India Mobile Congress 2024
Coredge IO Pvt Ltd, a cutting-edge sovereign AI and cloud platform company, was recently acquired by Sirius Digitech Limited, a joint venture between the Adani Group and Sirius International Holding (Sirius), a subsidiary of International Holding Company (IHC). The company is set to introduce an artificial intelligence (AI) Inference-as-a-Service platform designed to make advanced AI accessible and affordable for businesses of all sizes.
Coredge’s new platform, powered by Qualcomm® Cloud AI 100 Ultra, a product of Qualcomm Technologies, Inc., will debut at the India Mobile Congress 2024, offering a revolutionary solution that addresses the high costs of AI model inference.
“At Coredge IO, we take an innovative approach to sovereign cloud technology, offering highly secure, scalable, and designed-for-AI cloud solutions tailored for government and enterprise clients. Our AI inference platform will utilize the Qualcomm AI 100 Ultra inference accelerator designed to provide power efficiency and scalability to deliver industry-leading performance per cost. This ensures a lower TCO, making the solution highly economical for users. We’re excited to bring this offering to market where customers only pay for the computational resources consumed during inference, making it a highly attractive option for businesses of all sizes,” said Arif Khan, CEO of Coredge IO.
Affordable and Scalable AI for All
AI inference—the process of running AI models at scale—has long been a challenge for startups, SMEs, and even large enterprises due to the immense computational resources required. Coredge’s new AI Inference-as-a-Service platform with AI 100 Ultra inferencing accelerators aims to solve this by offering a cost-effective pay-as-you-go model designed to deliver one of the lowest total costs of ownership (TCO) for AI inference to developers and businesses alike.
“Working with Coredge IO, we intend to drive significant advancements in India’s Cloud AI ambitions. Through this partnership we aim to open new developer opportunities for advanced cloud applications that align with India’s AI direction, with the cost advantages offered by Qualcomm’s AI 100 Ultra inference accelerator and the AI Inference-as-a-Service platform, while enabling sustainable AI compute with best-in-class TCO,” said Savi Soin, Senior Vice President & President, Qualcomm India.
Expanded Integration for Maximum Efficiency
Coredge IO’s planned work with Qualcomm Technologies is intended to expand the new platform’s capabilities. Powered by Cloud AI 100 Ultra, the Inference-as-a-Service solution is designed to support:
- Best-in-Class TCO for inference services, making this platform highly competitive for both SMEs and large-scale enterprises.
- Co-bidding Opportunities for government and private AI cloud businesses, with a special focus on regional language models like the Bhashini initiative.
- Proof-of-Concept Initiatives, including planned projects with Bhashini and Adani Connex to demonstrate the efficiency of AI inference in the Indian Sovereign Cloud and edge computing.
- Edge AI Deployment to support the orchestration and deployment of AI models on edge devices through seamless integration with the Coredge Platform.
Qualcomm Cloud AI 100 Ultra
The AI 100 Ultra is designed to support large language models (LLMs) and generative AI, enabling up to four times the performance of the previous generation. Its performance and power efficiency make it suitable for high-performance AI inference at large scale, supporting models with over 100 billion parameters. Coredge’s intended work with Qualcomm Technologies includes the utilization of Qualcomm Technologies’ AI cores and software to offer developers a powerful inference platform that optimizes cost and performance, as well as utilizing the platform for high-priority use cases, such as Adani Connex and the Indian Sovereign Cloud.
Targeting a Diverse Audience
The platform is intended for a wide range of users, including:
- Startups and SMEs: Businesses that lack the resources to maintain dedicated AI infrastructure and want to develop applications for local languages.
- AI/ML Developers: The platform allows developers to deploy and test models quickly with an easy-to-use interface and affordable pricing.
- Enterprises: For businesses requiring large-scale AI inference, the platform provides scalable infrastructure at a fraction of traditional costs.
- AI Enthusiasts: Freelancers and independent developers who require an affordable inference platform for personal or project-based use.
Future-Ready with Web 3.0 Integration
While initially hosted on centralized servers, the platform’s roadmap includes transitioning to a decentralized Web 3.0 architecture powered by blockchain. This approach will enable resource sharing across a secure distributed network, democratizing AI inference even further. Users and organizations will be able to contribute computational power to the network, creating a more scalable and robust AI ecosystem.
Strategic Launch at India Mobile Congress 2024
Coredge will demonstrate the platform at India Mobile Congress 2024, showcasing its capabilities to industry leaders, government officials, and AI innovators and providing a glimpse into the future of AI in India. The platform will also feature integrations with government-backed initiatives like the Bhashini Project, emphasizing its focus on regional language AI applications.
About Coredge.io
Coredge.io is a sovereign AI & Cloud technology trailblazer, providing secure, scalable, and compliant cloud solutions to government and enterprise clients. The company aims to ensure data sovereignty while delivering cutting-edge AI platform capabilities that drive efficiency and innovation.
About Qualcomm
Qualcomm branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.