Functionalities that allow companies to analyze and collaborate on combined raw data while maintaining confidentiality
Generative AI and large language models are here to stay. However, amid growing concerns over data security (several companies have banned the use of ChatGPT at the workplace), organizations need technology that allows multiple enterprises to use the same data sets while retaining confidentiality during collaboration.
Enter confidential computing. Opaque Systems, a California-based startup with over $30 million in funding, is now out to tackle this challenge. The company has announced that it is enhancing its platform with zero-trust data clean rooms (DCRs) optimized for Microsoft Azure confidential computing.
Collaborate on datasets without compromising security
The platform allows multiple organizations to easily and securely analyze their combined confidential data without sharing or revealing the underlying raw data. With broader support for confidential AI use cases, the Opaque platform will also provide safeguards for machine learning and AI models to execute on encrypted data inside Trusted Execution Environments (TEEs), preventing exposure to unauthorized parties, Opaque Systems says in a press release.
The company says it will offer a deep dive into these capabilities at the Confidential Computing Summit, scheduled for later this week in San Francisco. Opaque Systems is co-hosting the event with the Confidential Computing Consortium, a Linux Foundation project.
The system allows data sharing without risk
In a chat with SDxCentral, Opaque Systems CEO and co-founder Rishabh Poddar said adoption of large language models would skyrocket once organizations can harness the technology without having to worry about data exposure risk. This is where confidential computing comes to the fore.
With this shift, computation is performed in a hardware-based, attested trusted execution environment that prevents unauthorized access to, or modification of, data while in use. This means enterprises, or even divisions within them, can share data without fear of unauthorized use, even in a collaborative environment.
Such a move closes a gap in existing practice, where data is encrypted at rest in storage and in transit across networks, but not while in use in memory. “It provides a hardware black box within which you can keep data protected, and data remains protected and encrypted at runtime, too,” says Poddar.
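To make the “hardware black box” idea concrete, here is a minimal, purely illustrative sketch of the flow Poddar describes: a client verifies an enclave's attestation measurement before sending it encrypted data, so plaintext exists only inside the enclave and only an aggregate result leaves it. This is not Opaque's actual API; all names are assumptions, and real TEEs (Intel SGX, AMD SEV, and the like) use hardware-signed quotes and a vendor attestation service rather than a simple hash comparison.

```python
# Conceptual simulation of a TEE workflow -- not a real enclave.
import hashlib
import hmac
import os

# The measurement of enclave code the client has decided to trust.
TRUSTED_MEASUREMENT = hashlib.sha256(b"approved-enclave-code-v1").hexdigest()


def xor(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher standing in for real authenticated encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


class SimulatedEnclave:
    """Stands in for a hardware TEE: a code measurement plus a key
    that never leaves the 'enclave' boundary."""

    def __init__(self, code: bytes):
        self.measurement = hashlib.sha256(code).hexdigest()
        self._key = os.urandom(32)  # sealed inside; never exported raw

    def attestation_report(self) -> str:
        # Real TEEs sign this measurement in hardware.
        return self.measurement

    def shared_key(self) -> bytes:
        # Real flow: a key exchange cryptographically bound to the report.
        return self._key

    def run_on_encrypted(self, ciphertext: bytes) -> str:
        # Decryption happens only inside the enclave boundary.
        plaintext = xor(ciphertext, self._key)
        return f"rows processed: {len(plaintext.splitlines())}"


enclave = SimulatedEnclave(b"approved-enclave-code-v1")

# Client side: refuse to send data unless the measurement matches.
assert hmac.compare_digest(enclave.attestation_report(), TRUSTED_MEASUREMENT)

ciphertext = xor(b"alice,42\nbob,17\n", enclave.shared_key())
print(enclave.run_on_encrypted(ciphertext))  # only an aggregate leaves
```

The point of the attestation check is that the client's data is released only to code it has verified, which is what distinguishes this model from ordinary server-side processing.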
There are several use cases already
Data clean rooms, meanwhile, allow various entities to share data for joint analysis under pre-defined rules, with personally identifiable information kept anonymous. This lets organizations collaborate on combined datasets while remaining within the confines of compliance and data-protection regulations.
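A rough sketch of the clean-room pattern just described: two parties contribute records, join keys are tokenized so raw PII is never exchanged, and only an aggregate result is released under a pre-defined rule (here, a minimum cohort size). This is an illustration of the general technique, not the Opaque platform itself; the party names, data, and threshold are invented for the example, and a one-way hash stands in for a production tokenization scheme.

```python
# Illustrative clean-room join: hashed keys in, aggregates out.
import hashlib


def anonymize(email: str) -> str:
    """One-way hash standing in for the clean room's tokenization step."""
    return hashlib.sha256(email.lower().encode()).hexdigest()


# Each party prepares its own data locally, keyed by anonymized tokens.
advertiser_spend = {anonymize(e): spend for e, spend in
                    [("a@x.com", 120.0), ("b@x.com", 80.0), ("c@x.com", 50.0)]}
publisher_clicks = {anonymize(e): clicks for e, clicks in
                    [("a@x.com", 3), ("b@x.com", 1), ("d@x.com", 7)]}

MIN_COHORT = 2  # pre-defined rule: suppress results over too-few users


def campaign_report():
    """Join on tokens and release only aggregates over the overlap."""
    matched = advertiser_spend.keys() & publisher_clicks.keys()
    if len(matched) < MIN_COHORT:
        return None  # suppress small cohorts to protect individuals
    return {
        "matched_users": len(matched),
        "total_spend": sum(advertiser_spend[k] for k in matched),
        "total_clicks": sum(publisher_clicks[k] for k in matched),
    }


print(campaign_report())
# Neither party ever sees the other's raw emails or full records.
```

The cohort-size rule is one example of the “pre-defined rules” mentioned above; in practice clean rooms enforce richer policies over which queries may run and who may see their outputs.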
Opaque details several use cases for such a solution: multiple marketers and advertisers can collaborate on sensitive company data to measure ad campaign performance or personalize customer targeting; financial institutions can collaborate on fraud detection; and insurers can join forces to identify duplicate claims.
Opaque Systems’ data clean rooms enable secure, multi-party analytics on encrypted confidential data stored in the system. Users can create clean rooms and perform multi-party analytics and AI on the data without compromising its confidentiality, meaning the outcomes are available only to authorized parties.
The company also noted that its confidential LLM interface allows users to run their own models within the Opaque platform, knowing that their queries and data remain private and protected at all times: they are not exposed to the model or the service provider, nor can they be used by unauthorized parties. “This allows organizations to start putting confidential data to use,” said Poddar.