INTELLIGENT GESTURE CLASSIFICATION SYSTEM USING TEMPORAL CONVOLUTIONAL NETWORKS (TCN) AND BIDIRECTIONAL LSTM FOR ENHANCED SPATIOTEMPORAL FEATURE EXTRACTION
ORDINARY APPLICATION
Published
Filed on 28 October 2024
Abstract
The present invention relates to a comprehensive Gesture Recognition system designed to enhance human-computer interaction through the accurate identification of hand gestures. The system encompasses a robust data collection module that gathers raw gesture data from various sensors, including motion capture devices, accelerometers, and video-based sources. Following data collection, the system implements a preprocessing unit that performs essential tasks such as data cleaning, normalization, and augmentation, ensuring high-quality input for further analysis. The feature extraction process employs advanced dimensionality reduction techniques, including Principal Component Analysis (PCA), to refine the dataset and improve classification accuracy. The gesture classification model is equipped with two parallel pathways: a Bidirectional Long Short-Term Memory (Bi-LSTM) network and a Temporal Convolutional Network (TCN). These pathways are designed to extract spatiotemporal features from the gesture data effectively. The outputs from both networks are fused to provide a holistic representation of the gesture, enabling the system to accurately predict predefined gesture classes. This innovative approach not only captures the intricate details of human gestures but also facilitates real-time interaction with various digital platforms.
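The two-pathway fusion architecture described in the abstract can be illustrated with a minimal sketch. The abstract does not name a framework, so this uses PyTorch, and all layer sizes (16 input features, 32 hidden units, 5 gesture classes) are hypothetical placeholders, not values from the filing:

```python
import torch
import torch.nn as nn

class GestureClassifier(nn.Module):
    """Hypothetical sketch of the described model: a Bi-LSTM branch and a
    dilated-Conv1d (TCN-style) branch run in parallel, and their outputs
    are concatenated (fused) before a final classification layer."""

    def __init__(self, n_features=16, n_classes=5, hidden=32):
        super().__init__()
        self.bilstm = nn.LSTM(n_features, hidden, batch_first=True,
                              bidirectional=True)
        self.tcn = nn.Sequential(
            # Dilated causal-style convolutions widen the temporal receptive field
            nn.Conv1d(n_features, hidden, kernel_size=3, padding=2, dilation=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=4, dilation=4),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),        # pool over time -> one vector
        )
        self.head = nn.Linear(2 * hidden + hidden, n_classes)

    def forward(self, x):                    # x: (batch, time, features)
        lstm_out, _ = self.bilstm(x)
        lstm_feat = lstm_out[:, -1, :]       # last step, both directions
        tcn_feat = self.tcn(x.transpose(1, 2)).squeeze(-1)
        fused = torch.cat([lstm_feat, tcn_feat], dim=1)   # pathway fusion
        return self.head(fused)              # class logits

model = GestureClassifier()
logits = model(torch.randn(8, 40, 16))       # 8 sequences, 40 frames, 16 features
print(tuple(logits.shape))                   # (8, 5)
```

Concatenation is only one plausible fusion strategy; the abstract does not specify how the pathway outputs are combined.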
Patent Information
Field | Value
---|---
Application ID | 202441082240
Invention Field | COMPUTER SCIENCE
Date of Application | 28/10/2024
Publication Number | 46/2024
Inventors
Name | Address | Country | Nationality
---|---|---|---
Chempavathy B | Senior Assistant Professor, Department of Computer Science and Engineering, New Horizon College of Engineering, New Horizon Knowledge Park, Outer Ring Road, Near Marathalli, Bellandur (P), Bangalore-560103 | India | India
D. Roja Ramani | Associate Professor, Department of Computer Science and Engineering, New Horizon College of Engineering, New Horizon Knowledge Park, Outer Ring Road, Near Marathalli, Bellandur (P), Bangalore-560103 | India | India
Senthil Anandhi A | Senior Assistant Professor, Department of Computer Science and Engineering, New Horizon College of Engineering, New Horizon Knowledge Park, Outer Ring Road, Near Marathalli, Bellandur (P), Bangalore-560103 | India | India
Ms. Srividhya Ganesan | Senior Assistant Professor, Department of Computer Science and Engineering, New Horizon College of Engineering, New Horizon Knowledge Park, Outer Ring Road, Near Marathalli, Bellandur (P), Bangalore-560103 | India | India
Ms. Subhashree Rath | Senior Assistant Professor, Department of Computer Science and Engineering, New Horizon College of Engineering, New Horizon Knowledge Park, Outer Ring Road, Near Marathalli, Bellandur (P), Bangalore-560103 | India | India
Dr. B. Rajalakshmi | Professor, Department of Computer Science and Engineering, New Horizon College of Engineering, New Horizon Knowledge Park, Outer Ring Road, Near Marathalli, Bellandur (P), Bangalore-560103 | India | India
Applicants
Name | Address | Country | Nationality
---|---|---|---
NEW HORIZON COLLEGE OF ENGINEERING | New Horizon College of Engineering, New Horizon Knowledge Park, Outer Ring Road, Near Marathalli, Bellandur (P), Bangalore-560103, Karnataka | India | India
Specification
Description: The present invention relates to a comprehensive system for Gesture Recognition (100), designed to automate the identification and classification of gestures through advanced machine learning techniques. The architecture, illustrated in Figure 1 (200), details the key components of the gesture recognition process. It begins with Data Collection (101), where gesture data is acquired from diverse sources such as motion capture systems, accelerometers, and video-based sensors. This phase covers various types of gestures, including hand movements, facial expressions, and body postures, providing a rich dataset for analysis. Following data collection, the system performs Data Cleaning (102), which improves data quality by removing noise and handling missing values so that only valid gesture information is processed.

Claims:

1. A Gesture Recognition system (100) for enhancing human-computer interaction, comprising:
i. a data collection module (101) that gathers raw gesture data from sensors, including motion capture devices, accelerometers, and video-based sources;
ii. a preprocessing unit (102) configured to clean and normalize the collected data, including steps for data augmentation and segmentation; and
iii. a feature extraction mechanism (103) that utilizes dimensionality reduction techniques to enhance gesture recognition accuracy.
2. The Gesture Recognition system (100) as claimed in claim 1, wherein the preprocessing unit employs Principal Component Analysis (PCA) (201) to standardize the data, ensuring a mean of 0 and a variance of 1 (202), and calculates the covariance matrix (203) of the features.
3. The Gesture Recognition system (100) as claimed in claim 2, wherein the feature extraction mechanism selects principal components (204) based on their eigenvalues, sorting them by the variance they capture and retaining the top k components for further analysis.
4. The Gesture Recognition system (100) as claimed in claim 3, wherein the gesture classification model (105) employs two parallel pathways: one utilizing a Bidirectional Long Short-Term Memory (Bi-LSTM) network (301) and the other utilizing a Temporal Convolutional Network (TCN) (302) for spatiotemporal feature extraction.
5. The Gesture Recognition system (100) as claimed in claim 4, wherein the outputs from both the Bi-LSTM and TCN pathways are fused (303) to produce a final classification (304) that predicts the gesture class, incorporating both spatial (306) and temporal (307) analysis dimensions.
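The PCA steps recited in the claims above (standardize to mean 0 and variance 1, compute the covariance matrix, sort components by eigenvalue, retain the top k) can be sketched in NumPy. The helper name `pca_reduce` and the toy data are illustrative, not from the specification:

```python
import numpy as np

def pca_reduce(X, k=2):
    """Reduce a gesture feature matrix X (samples x features) to k
    principal components, following the claimed steps."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # mean 0, variance 1
    cov = np.cov(Z, rowvar=False)              # covariance matrix of features
    eigvals, eigvecs = np.linalg.eigh(cov)     # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]          # sort by variance captured
    top_k = eigvecs[:, order[:k]]              # retain top-k components
    return Z @ top_k                           # project onto components

X = np.random.default_rng(0).normal(size=(100, 6))   # toy gesture features
reduced = pca_reduce(X, k=2)
print(reduced.shape)                                 # (100, 2)
```

In practice the same steps are available as `sklearn.decomposition.PCA`; the explicit version above mirrors the claim language.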
Documents
Name | Date
---|---
202441082240-FORM-9 [07-11-2024(online)].pdf | 07/11/2024 |
202441082240-COMPLETE SPECIFICATION [28-10-2024(online)].pdf | 28/10/2024 |
202441082240-DRAWINGS [28-10-2024(online)].pdf | 28/10/2024 |