Adaptive Privacy-Preserving Framework for Secure Data Collaboration


ORDINARY APPLICATION

Published

Filed on 28 October 2024

Abstract

The invention discloses a privacy-preserving framework for decentralized data collaboration, utilizing techniques such as encryption, federated learning, and dynamic privacy management to safeguard sensitive data. The system includes modules for data pre-processing, model training, privacy management, and evaluation, ensuring compliance with privacy regulations while enabling effective collaboration. The framework adapts to varying privacy requirements, ensuring the secure and efficient use of data across multiple entities. (Fig. 2)

Patent Information

Application ID: 202441082307
Invention Field: COMPUTER SCIENCE
Date of Application: 28/10/2024
Publication Number: 44/2024

Inventors

Name | Address | Country | Nationality
Dr. L. Shakkeera | Itgalpur, Rajanakunte, Bengaluru, Karnataka – 560 064, India | India | India
Sakthivel E | Itgalpur, Rajanakunte, Bengaluru, Karnataka – 560 064, India | India | India
Dr. Sharmasth Vali. Y | Itgalpur, Rajanakunte, Bengaluru, Karnataka – 560 064, India | India | India
Dr. Blessed Prince P | Itgalpur, Rajanakunte, Bengaluru, Karnataka – 560 064, India | India | India
Rohini A | Itgalpur, Rajanakunte, Bengaluru, Karnataka – 560 064, India | India | India

Applicants

Name | Address | Country | Nationality
Presidency University | Itgalpur, Rajanakunte, Bengaluru, Karnataka – 560 064, India | India | India

Specification

Description:

FIELD OF THE INVENTION
The present invention relates generally to data privacy and security in decentralized environments, and more particularly to the protection of sensitive information during collaborative data processing.

BACKGROUND OF THE INVENTION
The rapid expansion of data-driven technologies across industries such as healthcare, finance, telecommunications, and autonomous systems has highlighted the critical need for secure and efficient data collaboration. With growing amounts of sensitive data being processed, organizations are increasingly required to collaborate without exposing private information. Traditionally, data-sharing practices involve direct transfer of raw data, which poses serious privacy risks and leaves organizations vulnerable to data breaches and unauthorized access. In particular, industries that handle personal and sensitive data, like healthcare, face stringent regulatory requirements under laws such as the Health Insurance Portability and Accountability Act (HIPAA) and the General Data Protection Regulation (GDPR). These regulations impose significant constraints on how data can be shared and processed, creating challenges for institutions seeking to collaborate while maintaining compliance.

Several methods have been explored in the past to address the need for secure data sharing, including anonymization techniques, data encryption, and centralized data-sharing platforms. Prior art in this space typically relies on anonymizing datasets to remove personally identifiable information before sharing or applying encryption to data transfers. However, these approaches have inherent limitations. Anonymization methods are often prone to re-identification attacks, where supposedly anonymous data can be reverse-engineered to identify individuals. Encryption techniques, while effective for securing data at rest or in transit, are computationally expensive and often not sufficient to ensure privacy when data is actively processed or analyzed. Additionally, centralized platforms pose single points of failure, where a breach of the central server can compromise the security of all shared data.

One of the significant problems in the prior art is the challenge of balancing data utility with privacy. Existing methods often degrade the quality of data to ensure privacy, making it less useful for collaborative purposes such as machine learning and data analysis. Furthermore, traditional encryption techniques, while effective for securing data, tend to introduce inefficiencies, particularly when dealing with large-scale datasets that require real-time processing. Another major concern with prior art methods is their inability to adapt to the varying privacy requirements of different sectors and regions. This lack of flexibility often leads to non-compliance with region-specific regulations, exposing organizations to legal risks and penalties.

There is, therefore, an urgent need for an improved system that allows multiple entities to collaborate on data-driven initiatives without sharing raw data and without sacrificing data utility or privacy. Such a system must not only ensure the privacy of sensitive information during the collaboration process but also comply with diverse and evolving regulatory standards. The invention must address the inefficiencies and vulnerabilities of existing solutions, providing a more secure, adaptable, and privacy-conscious method for decentralized data collaboration. The current invention is directed towards fulfilling these unmet needs, offering a new approach to secure collaborative data processing that overcomes the limitations of the prior art.

OBJECTS OF THE INVENTION
It is the primary object of the invention to provide a secure and privacy-preserving framework for decentralized data collaboration.

It is another object of the invention to ensure compliance with regulatory standards without direct data sharing.

It is another object of the invention to protect sensitive information through advanced encryption and privacy techniques.

It is another object of the invention to enable decentralized model training across different entities while safeguarding sensitive data.

It is yet another object of the invention to dynamically adapt the system to varying privacy requirements and security risks.

SUMMARY OF THE INVENTION
To meet the objects of the invention, disclosed herein is a system for privacy-preserving decentralized data collaboration, comprising: a data pre-processing module; a model training module; a privacy management module; and an evaluation module; wherein the data pre-processing module is configured to anonymize and encrypt data from multiple sources, the model training module conducts decentralized model training using encrypted data and aggregates model updates from multiple entities, the privacy management module dynamically adjusts privacy settings based on data sensitivity and regulatory requirements, the evaluation module assesses the performance and privacy preservation of the global model, and wherein the said modules are operationally connected to each other.

Further disclosed herein is a method for privacy-preserving decentralized data collaboration, comprising the steps of: anonymizing and encrypting data using homomorphic encryption and differential privacy techniques; training local models on encrypted data and aggregating the model updates without sharing sensitive data using federated learning; dynamically adjusting privacy settings based on data sensitivity and regulatory mandates; and evaluating the global model's performance and ensuring privacy preservation through encrypted validation data and privacy assurance tests.

BRIEF DESCRIPTION OF THE FIGURES
Fig. 1 illustrates the privacy-preserving secure mechanism utilized by the framework for decentralized data collaboration.
Fig. 2 depicts the modular architecture of the adaptive privacy-preserving framework, highlighting its core components for data pre-processing, model training, privacy management, and evaluation.

DETAILED DESCRIPTION OF THE INVENTION
The invention presents a privacy-preserving system for secure decentralized data collaboration. It ensures that sensitive data remains protected during collaborative processes, such as model training, through the use of encryption and privacy-enhancing technologies. The system is modular, comprising data pre-processing, model training, privacy management, and evaluation modules. These modules work in unison to facilitate secure data collaboration, while complying with regulatory requirements. Key techniques, such as homomorphic encryption, differential privacy, and secure multi-party computation, are employed to ensure that data privacy is maintained throughout the collaboration process. The framework adapts its privacy settings dynamically based on changing requirements, ensuring continuous protection of data.

The present invention discloses a new federated learning framework that ensures secure collaborative AI model training in the healthcare sector. Given the sensitive nature of healthcare data and stringent regulations such as HIPAA and GDPR, traditional data-sharing methods often fall short and expose institutions to regulatory risks. This framework utilizes cutting-edge privacy-preserving techniques, including differential privacy, homomorphic encryption, and secure multi-party computation. By sharing only encrypted model updates instead of raw patient data, the framework safeguards individual privacy while maintaining the integrity and effectiveness of AI models. Its adaptable design accommodates varying privacy requirements and regulatory standards, facilitating secure and compliant data collaboration. This innovative solution supports the creation of more robust AI models while addressing both security and compliance challenges in healthcare.

The disclosed framework for secure healthcare data collaboration employs a four-module architecture: Data Pre-processing, Model Training, Privacy Management, and Evaluation, each crucial for privacy-preserving federated learning and adherence to healthcare regulations. Data pre-processing focuses on anonymizing and pseudonymizing sensitive healthcare information to safeguard patient privacy, though these methods must overcome challenges such as re-identification risks and varying regulatory standards. Federated learning allows for decentralized model training across data-holding sites, ensuring data remains with its owner, but requires additional security measures like encryption to mitigate risks of data breaches and tampering. Differential privacy (DP) adds noise to data or model outputs to obscure individual identities, though it can affect data quality and needs careful application, particularly in medical imaging contexts. Homomorphic encryption (HE) facilitates computations on encrypted data while preserving privacy, though it faces efficiency constraints, especially with large datasets. Secure Multi-Party Computation (SMPC) supports collaborative data analysis without exposing individual data, but contends with communication overhead and scalability issues. Collectively, these techniques offer a robust solution for secure data collaboration, addressing privacy concerns while enabling effective use of healthcare data in compliance with regulations, as explained in Fig. 1.
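The SMPC idea mentioned above can be illustrated with additive secret sharing, one common building block for secure aggregation. The sketch below is a minimal illustration only, not the patented protocol: each client splits its value into random shares that sum to the value modulo a prime, so an aggregator sees only random-looking shares yet can still recover the exact total. The function names and the choice of modulus are assumptions for illustration.

```python
import random

MODULUS = 2**31 - 1  # illustrative prime modulus, an assumption for this sketch

def share(value: int, n_parties: int) -> list[int]:
    """Split an integer into n additive shares that sum to value (mod MODULUS)."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares (mod MODULUS)."""
    return sum(shares) % MODULUS

def secure_sum(client_values: list[int]) -> int:
    """Aggregate client values without any party seeing another's raw value.

    Each client secret-shares its value across all parties; each party sums
    the shares it holds; summing those partial sums yields the exact total.
    """
    n = len(client_values)
    all_shares = [share(v, n) for v in client_values]
    partial_sums = [sum(s[i] for s in all_shares) % MODULUS for i in range(n)]
    return reconstruct(partial_sums)
```

Note that no single partial sum leaks an individual client's value; only their combination reveals the aggregate, which is the property secure aggregation relies on.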

The detailed framework for secure healthcare data collaboration, shown in Fig. 2, is implemented through four core modules: Data Pre-processing, Model Training, Privacy Management, and Evaluation. Each module is designed with specific inputs and outputs to ensure privacy-preserving federated learning while adhering to stringent healthcare data regulations.

Data Pre-processing Module
The Data Pre-processing Module is the first step in the framework, focusing on preparing raw healthcare data for secure federated learning. This module processes data from various sources, including Electronic Health Records (EHRs), diagnostic images, patient demographics, and lab results, while ensuring compliance with institutional privacy policies and regulatory guidelines such as HIPAA and GDPR. The module performs encryption using homomorphic encryption techniques, like Paillier or BGV, to maintain data confidentiality during computations. Additionally, it applies anonymization methods such as k-anonymity and differential privacy to obscure patient identities, ensuring individual records remain untraceable. The result is encrypted and anonymized data, ready for use in federated learning without compromising patient privacy.
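The two anonymization techniques named above can be sketched briefly. The snippet below is a minimal illustration under stated assumptions, not the patented implementation: it adds Laplace noise calibrated to sensitivity/ε for differential privacy, and checks k-anonymity over a set of quasi-identifier columns. The function names and the parameter choices are hypothetical.

```python
import math
import random
from collections import Counter

def laplace_noise(sensitivity: float, epsilon: float) -> float:
    """Zero-mean Laplace sample with scale = sensitivity / epsilon (inverse-CDF method)."""
    u = random.random() - 0.5
    return -(sensitivity / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_release(value: float, sensitivity: float, epsilon: float) -> float:
    """Release a numeric statistic under epsilon-differential privacy."""
    return value + laplace_noise(sensitivity, epsilon)

def is_k_anonymous(records: list[dict], quasi_identifiers: list[str], k: int) -> bool:
    """True if every combination of quasi-identifier values occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())
```

Smaller ε means stronger privacy but noisier output; a real deployment would tune ε per data-sensitivity class, as the privacy management module described below is responsible for doing.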

Model Training Module
Following data pre-processing, the Model Training Module takes over to conduct federated learning on the secured data. This module uses encrypted and anonymized data alongside initial model parameters, which may be pre-trained or randomly initialized. Federated Averaging (FedAvg) is employed to aggregate local model updates from various institutions into a global model, ensuring that sensitive data is not shared directly. Differential privacy techniques, such as Gaussian or Laplace mechanisms, are applied to model updates to prevent data leakage. Secure Multi-Party Computation (SMPC) protocols, including Yao's Garbled Circuits and Shamir Secret Sharing, are used to aggregate updates securely. The outputs of this module are encrypted model updates sent to the central server and a globally aggregated model reflecting combined insights from participating institutions.
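Federated Averaging itself reduces to a data-size-weighted mean of client parameter vectors. The sketch below operates on plaintext lists for readability; in the framework described here the updates would additionally be encrypted or noised before aggregation. The function name and list-based parameter representation are assumptions for illustration.

```python
def fed_avg(client_updates: list[list[float]], client_sizes: list[int]) -> list[float]:
    """Weighted average of client parameter vectors (FedAvg).

    client_updates: one parameter vector per client.
    client_sizes:   local training-sample counts, used as aggregation weights.
    """
    total = sum(client_sizes)
    dim = len(client_updates[0])
    return [
        sum(w * update[i] for update, w in zip(client_updates, client_sizes)) / total
        for i in range(dim)
    ]
```

For example, a client holding three times as much data as its peer pulls the global parameters three times as strongly toward its local update.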

Privacy Management Module
The Privacy Management Module is responsible for adapting privacy settings to ensure compliance with regulatory and institutional requirements. It takes inputs including institutional policies and regulatory mandates (e.g., HIPAA, GDPR) and feedback from the Model Training Module regarding data sensitivity and encryption levels. The module uses adaptive algorithms, such as Reinforcement Learning, to dynamically adjust privacy settings based on evolving requirements. Role-Based Access Control (RBAC) is implemented to manage data access permissions effectively. The module outputs updated privacy settings and refined access control policies, which are applied in subsequent training rounds to maintain compliance with privacy regulations and institutional standards.
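The RBAC enforcement described above amounts to checking a requested action against a role's permission set. The sketch below is a minimal illustration; the role names, actions, and policy table are hypothetical placeholders, and the reinforcement-learning adaptation of privacy settings is beyond this sketch.

```python
# Hypothetical role -> permitted-action table; a real deployment would load
# this from institutional policy configuration.
ROLE_PERMISSIONS: dict[str, set[str]] = {
    "data_steward": {"read_encrypted", "configure_privacy"},
    "model_trainer": {"read_encrypted", "submit_update"},
    "auditor": {"read_report"},
}

def is_authorized(role: str, action: str) -> bool:
    """Role-Based Access Control check: allow only explicitly granted actions."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Unknown roles default to an empty permission set, so access is denied unless a policy explicitly grants it (fail-closed), which is the usual RBAC design choice for sensitive data.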

Evaluation Module
The Evaluation Module is tasked with assessing both the performance and privacy of the federated AI models. It receives the global model from the Model Training Module and validation data that has been encrypted and anonymized. Performance is evaluated using metrics such as accuracy, precision, recall, and AUC-ROC to gauge model effectiveness. Privacy assurance tests, including checks for privacy leakage and membership inference attacks, are conducted to verify the effectiveness of privacy-preserving measures. Secure Aggregation techniques are employed to monitor the federated learning process and detect any potential privacy issues. The outputs of this module include detailed performance metrics and a privacy assurance report, which confirms the framework's adherence to data protection standards and validates the secure and effective performance of the model.
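The performance metrics named above derive directly from the binary confusion matrix. A minimal sketch (function name and dict layout are assumptions; AUC-ROC, which needs ranked scores rather than hard labels, is omitted):

```python
def classification_metrics(y_true: list[int], y_pred: list[int]) -> dict[str, float]:
    """Accuracy, precision, and recall from binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "precision": tp / (tp + fp) if tp + fp else 0.0,  # guard: no positive predictions
        "recall": tp / (tp + fn) if tp + fn else 0.0,     # guard: no positive labels
    }
```

In the framework these metrics would be computed over encrypted validation data, alongside the separate privacy assurance tests (e.g. membership-inference checks) described above.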

It finds application in various industries, including healthcare, smart cities, finance, telecommunications, autonomous vehicles, and education.

Claims:

We Claim:

1. A system for privacy-preserving decentralized data collaboration, comprising:
a data pre-processing module;
a model training module;
a privacy management module; and
an evaluation module;
wherein the data pre-processing module is configured to anonymize and encrypt data from multiple sources, the model training module conducts decentralized model training using encrypted data and aggregates model updates from multiple entities, the privacy management module dynamically adjusts privacy settings based on data sensitivity and regulatory requirements, the evaluation module assesses the performance and privacy preservation of the global model, and wherein the said modules are operationally connected to each other.

2. The system as claimed in claim 1, wherein the data pre-processing module applies homomorphic encryption techniques to maintain data confidentiality during computations.

3. The system as claimed in claim 1, wherein the data pre-processing module uses differential privacy to anonymize the data and prevent re-identification.

4. The system as claimed in claim 1, wherein the model training module aggregates local model updates using federated learning, without requiring the sharing of raw data.

5. The system as claimed in claim 4, wherein the model training module utilizes secure multi-party computation protocols to ensure that no sensitive data is shared during the aggregation process.

6. The system as claimed in claim 1, wherein the privacy management module adapts the system's privacy protocols using adaptive algorithms based on evolving regulatory and institutional requirements.

7. The system as claimed in claim 1, wherein the privacy management module enforces role-based access control to restrict access to sensitive data.

8. The system as claimed in claim 1, wherein the evaluation module assesses the model's performance using encrypted validation data.

9. The system as claimed in claim 1, wherein the evaluation module performs privacy assurance tests to ensure compliance with privacy regulations.

10. A method for privacy-preserving decentralized data collaboration, comprising the steps of:
anonymizing and encrypting data using homomorphic encryption and differential privacy techniques;
training local models on encrypted data and aggregating the model updates without sharing sensitive data using federated learning;
dynamically adjusting privacy settings based on data sensitivity and regulatory mandates; and
evaluating the global model's performance and ensuring privacy preservation through encrypted validation data and privacy assurance tests.

Documents

Name | Date
202441082307-Proof of Right [08-11-2024(online)].pdf | 08/11/2024
202441082307-EDUCATIONAL INSTITUTION(S) [29-10-2024(online)].pdf | 29/10/2024
202441082307-FORM-8 [29-10-2024(online)].pdf | 29/10/2024
202441082307-FORM-9 [29-10-2024(online)].pdf | 29/10/2024
202441082307-COMPLETE SPECIFICATION [28-10-2024(online)].pdf | 28/10/2024
202441082307-DECLARATION OF INVENTORSHIP (FORM 5) [28-10-2024(online)].pdf | 28/10/2024
202441082307-DRAWINGS [28-10-2024(online)].pdf | 28/10/2024
202441082307-EDUCATIONAL INSTITUTION(S) [28-10-2024(online)].pdf | 28/10/2024
202441082307-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-10-2024(online)].pdf | 28/10/2024
202441082307-FORM 1 [28-10-2024(online)].pdf | 28/10/2024
202441082307-FORM 18 [28-10-2024(online)].pdf | 28/10/2024
202441082307-FORM FOR SMALL ENTITY(FORM-28) [28-10-2024(online)].pdf | 28/10/2024
202441082307-POWER OF AUTHORITY [28-10-2024(online)].pdf | 28/10/2024
202441082307-REQUEST FOR EXAMINATION (FORM-18) [28-10-2024(online)].pdf | 28/10/2024
