
NEURAL NETWORK ARCHITECTURE FOR DEEP LEARNING IN HEALTHCARE DIAGNOSTICS

ORDINARY APPLICATION

Status: Published

Filed on 14 November 2024

Abstract

The present invention relates to a novel neural network architecture designed for deep learning applications in healthcare diagnostics. The architecture integrates multi-modal healthcare data, including medical imaging, structured clinical records, and unstructured data such as physician notes, to improve diagnostic accuracy and predict patient outcomes. By employing specialized layers for each data modality, an attention mechanism for feature prioritization, and an explainability component for transparent decision-making, the system enhances both the accuracy and interpretability of predictions. The architecture is optimized for computational efficiency, enabling deployment in diverse healthcare environments, from large hospitals to resource-constrained clinics.

Patent Information

Application ID: 202441088216
Invention Field: BIO-MEDICAL ENGINEERING
Date of Application: 14/11/2024
Publication Number: 47/2024

Inventors

Name: N. Subramanyam | Address: Assistant Professor, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Jeetheshwar | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Mounika | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Sirisha | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Siva Kumar | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Sravani | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Usman | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Vijay Kumar | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Kavya | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India
Name: A. Abhishek | Address: Final Year B.Tech Student, Audisankara College of Engineering & Technology (AUTONOMOUS), NH-16, By-Pass Road, Gudur, Tirupati Dist., Andhra Pradesh, India-524101, India | Country: India | Nationality: India

Applicants

Name: Audisankara College of Engineering & Technology | Address: Audisankara College of Engineering & Technology, NH-16, By-Pass Road, Gudur, Tirupati Dist, Andhra Pradesh, India-524101, India | Country: India | Nationality: India

Specification

Description:

In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.

The ensuing description provides exemplary embodiments only and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the disclosure as set forth.

Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail to avoid obscuring the embodiments.

Also, it is noted that individual embodiments may be described as a process that is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.

The word "exemplary" and/or "demonstrative" is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as "exemplary" and/or "demonstrative" is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms "includes," "has," "contains," and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising" as an open transition word without precluding any additional or other elements.

Reference throughout this specification to "one embodiment" or "an embodiment" or "an instance" or "one instance" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.

The present invention relates to a novel neural network architecture specifically designed to optimize deep learning models for healthcare diagnostics. It effectively handles multi-modal healthcare data, including medical imaging (e.g., X-rays, MRIs, CT scans), structured clinical data (e.g., patient demographics, medical history, test results), and unstructured data (e.g., clinical notes, physicians' reports). The architecture is capable of learning and identifying patterns across these diverse data sources, providing more comprehensive and accurate diagnostic insights.

The neural network architecture includes a flexible input layer capable of receiving and processing different data modalities. For medical imaging, convolutional neural network (CNN) layers are employed to extract spatial features. For structured clinical data, fully connected layers are used to process the numerical and categorical information. Unstructured data, such as free-text clinical notes, are processed using natural language processing (NLP) techniques, including recurrent neural networks (RNNs) or transformer models. These specialized layers ensure that each data modality is processed appropriately before being integrated into a unified feature vector.
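
As a concrete illustration of such modality-specific input branches, the following is a minimal sketch assuming PyTorch; the layer sizes, module names, and the choice of a GRU for the text branch are illustrative assumptions rather than details taken from the filing.

```python
import torch
import torch.nn as nn

class ImageBranch(nn.Module):
    """Convolutional branch for medical images (e.g., chest X-rays)."""
    def __init__(self, out_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.fc = nn.Linear(32, out_dim)

    def forward(self, x):                    # x: (batch, 1, H, W)
        return self.fc(self.conv(x).flatten(1))

class StructuredBranch(nn.Module):
    """Fully connected branch for numeric/categorical clinical data."""
    def __init__(self, in_dim, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, out_dim))

    def forward(self, x):                    # x: (batch, in_dim)
        return self.net(x)

class TextBranch(nn.Module):
    """Recurrent branch for tokenized free-text clinical notes."""
    def __init__(self, vocab_size, out_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 64)
        self.rnn = nn.GRU(64, out_dim, batch_first=True)

    def forward(self, tokens):               # tokens: (batch, seq_len) of token ids
        _, h = self.rnn(self.embed(tokens))  # h: (1, batch, out_dim)
        return h.squeeze(0)                  # (batch, out_dim)
```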

Once each data modality has been processed through its respective layers, the outputs are passed to a fusion layer. The fusion layer combines the extracted features from each modality into a single, unified feature vector. The fusion process ensures that all relevant information, regardless of data type, contributes to the diagnostic prediction. This layer can employ strategies ranging from simple concatenation to more sophisticated attention-based fusion mechanisms to create a holistic representation of the patient's health profile.
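
A concatenation-based fusion layer, continuing the sketch above with assumed dimensions, could look like this; an attention-based weighting of the modalities is sketched further below.

```python
import torch
import torch.nn as nn

class FusionLayer(nn.Module):
    """Concatenates the per-modality feature vectors and projects the
    result into a single unified feature vector."""
    def __init__(self, modal_dims=(128, 128, 128), fused_dim=256):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(sum(modal_dims), fused_dim),
            nn.ReLU())

    def forward(self, image_feat, struct_feat, text_feat):
        # Each input: (batch, modal_dim); output: (batch, fused_dim)
        fused = torch.cat([image_feat, struct_feat, text_feat], dim=1)
        return self.proj(fused)
```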

After feature fusion, the unified vector is passed through the core processing network, which is composed of a series of fully connected layers and specialized deep learning modules. Depending on the application, this network can include CNN layers for image-related tasks, RNN layers for time-series data (e.g., ECG or longitudinal patient data), or transformer layers for sequential and context-based data analysis. The network is designed to learn complex patterns and dependencies between the different features, improving the model's diagnostic capabilities.
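
On top of the fused vector, the core processing network can be sketched, under the same assumptions, as a small stack of fully connected layers producing diagnostic class logits; convolutional, recurrent, or transformer blocks would be slotted in here for the task-specific variants described above.

```python
import torch.nn as nn

class CoreNetwork(nn.Module):
    """Fully connected head mapping the unified feature vector to
    diagnostic class logits (e.g., disease present vs. absent)."""
    def __init__(self, fused_dim=256, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(fused_dim, 128), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(128, num_classes))

    def forward(self, fused):                # fused: (batch, fused_dim)
        return self.net(fused)               # logits: (batch, num_classes)
```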

An attention mechanism is integrated into the model to dynamically assign weights to different features based on their importance to the specific diagnostic task. For example, in medical imaging, attention can highlight regions of an image that are indicative of disease, while in structured clinical data it can prioritize particular test results or patient demographics. This ensures that the model places greater emphasis on the most relevant data, improving both prediction accuracy and model interpretability.
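
One simple way to realize such feature weighting, shown purely as an assumed illustration, is a learned attention score per modality that rescales each feature vector before fusion; the weights themselves can also serve as a coarse importance signal for the explainability component.

```python
import torch
import torch.nn as nn

class ModalityAttention(nn.Module):
    """Scores each modality's feature vector and softmax-normalizes the
    scores, so more informative modalities receive larger weights."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)

    def forward(self, feats):
        # feats: list of per-modality tensors, each (batch, feat_dim)
        stacked = torch.stack(feats, dim=1)                  # (batch, M, feat_dim)
        weights = torch.softmax(self.score(stacked), dim=1)  # (batch, M, 1)
        weighted = stacked * weights                         # rescaled features
        return weighted.unbind(dim=1), weights.squeeze(-1)
```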

One of the key features of the invention is the explainability component. The explainability layer provides visual and textual explanations for the model's predictions. This is crucial for healthcare applications, where clinicians need to understand why a model made a particular decision. The attention mechanism also contributes to this by generating heatmaps or feature importance graphs that visually demonstrate which parts of the input data influenced the decision-making process.
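
A gradient-based saliency map is one standard technique that matches this description of visual explanations; the following is a minimal sketch, assuming a model that maps an image batch to class logits (the filing itself does not prescribe this particular method).

```python
import torch

def saliency_heatmap(model, x_img, target_class):
    """Gradient of the target-class score with respect to each input
    pixel; large magnitudes mark pixels that most influenced the score."""
    x = x_img.clone().requires_grad_(True)
    logits = model(x)                        # (batch, num_classes)
    logits[:, target_class].sum().backward()
    return x.grad.abs().amax(dim=1)          # (batch, H, W) heatmap
```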

The neural network architecture is optimized to ensure computational efficiency, allowing it to run on devices with varying computational capabilities. Techniques such as model pruning, quantization, and the use of transfer learning from pre-trained models are employed to reduce the model's size and computation time, making it suitable for deployment in resource-constrained healthcare environments.
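
These optimizations map onto standard framework utilities; for example, assuming the PyTorch sketch above, magnitude pruning of the linear layers followed by dynamic int8 quantization could be applied roughly as follows (the pruning ratio and choice of layers are illustrative).

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def compress(model: nn.Module, prune_amount: float = 0.3) -> nn.Module:
    """Prunes the smallest 30% of weights in each Linear layer, then
    applies dynamic int8 quantization to cut model size and latency."""
    for module in model.modules():
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=prune_amount)
            prune.remove(module, "weight")   # make the pruning permanent
    return torch.quantization.quantize_dynamic(
        model, {nn.Linear}, dtype=torch.qint8)
```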

In one embodiment, the invention is applied to the detection of lung cancer. The system receives multi-modal data from various sources: chest X-ray images, structured data such as patient age, smoking history, and lab test results, and unstructured data such as physician's notes. The chest X-ray images are processed through convolutional layers to extract spatial features, while structured data are processed through fully connected layers. Unstructured text from clinical notes is processed using an RNN model to extract relevant information regarding patient symptoms and medical history.

The features from each data modality are fused into a unified vector and passed through the core processing network, where a diagnostic prediction is made, indicating whether the patient shows signs of lung cancer. The attention mechanism highlights areas of the X-ray image that contributed most to the diagnosis, and the explainability layer generates a visual explanation of the model's decision-making process, showing the regions in the image that were most relevant.
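
Putting the sketched components together, an end-to-end forward pass for this embodiment might look as follows; the batch size, input shapes, vocabulary size, and number of structured fields are all assumed for illustration, and the classes are those sketched earlier.

```python
import torch

# Illustrative shapes: 1-channel 224x224 X-ray, 10 structured fields,
# tokenized clinical note of length 64 over a 5000-word vocabulary.
img_branch    = ImageBranch()
struct_branch = StructuredBranch(in_dim=10)
text_branch   = TextBranch(vocab_size=5000)
attention     = ModalityAttention()
fusion        = FusionLayer()
core          = CoreNetwork(num_classes=2)   # lung cancer: present / absent

xray   = torch.randn(4, 1, 224, 224)
fields = torch.randn(4, 10)
notes  = torch.randint(0, 5000, (4, 64))

feats, weights = attention([img_branch(xray),
                            struct_branch(fields),
                            text_branch(notes)])
logits = core(fusion(*feats))
probs  = torch.softmax(logits, dim=1)        # per-patient prediction probabilities
```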

In another embodiment, the neural network architecture is used for heart disease risk prediction. The system integrates data from electrocardiogram (ECG) signals, patient demographics (age, weight, cholesterol levels), and patient medical history (previous heart attacks, family history). The ECG data is processed through CNN layers to capture temporal and spatial features, while structured data like cholesterol levels and weight are processed through fully connected layers. The unstructured data in the form of medical history is processed using transformer models. The feature vectors from each modality are combined in a fusion layer, and the core network performs risk prediction to assess the likelihood of heart disease. An attention mechanism is used to prioritize certain patient characteristics (e.g., high cholesterol, age), while the explainability layer provides clinicians with a rationale for the model's predictions, ensuring transparency and trust in the system's output.
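
For the ECG signals in this embodiment, the image branch would be swapped for a one-dimensional convolutional branch; a minimal sketch, again with assumed dimensions:

```python
import torch
import torch.nn as nn

class ECGBranch(nn.Module):
    """1-D convolutional branch for raw ECG signals, capturing local
    temporal patterns such as waveform morphology."""
    def __init__(self, in_channels=1, out_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1))
        self.fc = nn.Linear(32, out_dim)

    def forward(self, x):                    # x: (batch, channels, samples)
        return self.fc(self.conv(x).flatten(1))
```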

While considerable emphasis has been placed herein on the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the invention. These and other changes in the preferred embodiments of the invention will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.

Claims:

1. A neural network architecture for healthcare diagnostics, comprising:
An input layer configured to receive multi-modal data, including medical imaging, structured clinical data, and unstructured data;
A preprocessing module comprising one or more feature extraction components for extracting features from said multi-modal data;
A fusion layer configured to combine the extracted features into a unified feature vector;
A core processing network configured to process the unified feature vector and generate diagnostic predictions; and
An attention mechanism that selectively emphasizes relevant features of the data based on their importance to the diagnostic task.

2. The neural network architecture of claim 1, wherein said preprocessing module includes convolutional layers for processing medical imaging data.

3. The neural network architecture of claim 1, wherein said preprocessing module includes recurrent neural network layers for processing sequential patient data, such as time-series signals.

4. The neural network architecture of claim 1, wherein said fusion layer employs a multi-channel fusion strategy to integrate data from different sources.

5. The neural network architecture of claim 1, wherein said core processing network includes a combination of fully connected layers and one or more of convolutional, recurrent, or transformer layers.

6. The neural network architecture of claim 1, further comprising an explainability layer that generates visual or textual explanations of the diagnostic predictions.

7. The neural network architecture of claim 1, wherein the diagnostic predictions comprise disease classification, risk scoring, or prognosis prediction.

Documents

Name: 202441088216-COMPLETE SPECIFICATION [14-11-2024(online)].pdf | Date: 14/11/2024
Name: 202441088216-DECLARATION OF INVENTORSHIP (FORM 5) [14-11-2024(online)].pdf | Date: 14/11/2024
Name: 202441088216-DRAWINGS [14-11-2024(online)].pdf | Date: 14/11/2024
Name: 202441088216-FORM 1 [14-11-2024(online)].pdf | Date: 14/11/2024
Name: 202441088216-FORM-9 [14-11-2024(online)].pdf | Date: 14/11/2024
Name: 202441088216-REQUEST FOR EARLY PUBLICATION(FORM-9) [14-11-2024(online)].pdf | Date: 14/11/2024
