MODEL FOR DRIVER EMOTION AND HEALTH METRICS ASSESSMENT IN ADVANCED DRIVER ASSISTANCE SYSTEM
Application Type: Ordinary Application
Status: Published
Filed on: 12 November 2024
Abstract
The present invention relates to a multimodal driver emotion and health metrics assessment system for advanced driver assistance systems (ADAS). The system integrates data from various sensors, including facial expression cameras, speech emotion detection microphones, heart rate sensors, and blood pressure sensors, to assess the driver's emotional and physiological state in real-time. A deep neural network model processes this data to accurately classify the driver's emotions, enabling dynamic adaptation of driving parameters to enhance safety and personalization. The system provides timely responses to emotional states such as stress or fatigue, improving driver safety. It can be implemented in autonomous vehicles, ride-sharing services, fleet management, and urban planning applications. The system offers a comprehensive approach to emotion detection and health monitoring, contributing to safer and more personalized driving experiences, with potential for broader applications in vehicle technology and smart transportation solutions. (Figure 1)
Patent Information
Field | Value |
---|---|
Application ID | 202431087397 |
Invention Field | COMPUTER SCIENCE |
Date of Application | 12/11/2024 |
Publication Number | 47/2024 |
Inventors
Name | Address | Country | Nationality |
---|---|---|---|
Sushruta Mishra | School of Computer Engineering, Kalinga Institute of Industrial Technology (Deemed to be University), Patia Bhubaneswar Odisha India 751024 | India | India |
Hrudaya Kumar Tripathy | School of Computer Engineering, Kalinga Institute of Industrial Technology (Deemed to be University), Patia Bhubaneswar Odisha India 751024 | India | India |
Srijan Saha | School of Computer Engineering, Kalinga Institute of Industrial Technology (Deemed to be University), Patia Bhubaneswar Odisha India 751024 | India | India |
Applicants
Name | Address | Country | Nationality |
---|---|---|---|
Kalinga Institute of Industrial Technology (Deemed to be University) | Patia Bhubaneswar Odisha India 751024 | India | India |
Specification
Description: TECHNICAL FIELD
[0001] The present invention relates to the field of artificial intelligence and automated systems, and more particularly to a model for driver emotion and health metrics assessment in an advanced driver assistance system.
BACKGROUND ART
[0002] The following discussion of the background of the invention is intended to facilitate an understanding of the present invention. However, it should be appreciated that the discussion is not an acknowledgment or admission that any of the material referred to was published, known, or part of the common general knowledge in any jurisdiction as of the application's priority date. Any publication details provided in this background are referenced only to describe the problems, general terminology, or principles of science and technology in the associated prior art.
[0003] The present invention addresses the following needs:
- Real-time Emotion Detection: Accurately recognizing the driver's emotional state in real time is crucial for advanced driver assistance systems (ADAS).
- Enhancing Driver Safety: Understanding and responding to driver emotions can help mitigate risks and improve safety in autonomous vehicles.
- Personalized Driving Experience: Adapting driving parameters based on individual emotional states can create a more comfortable and personalized driving experience.
- Robust Assessment: Addressing challenges in real-time emotion detection under varying environmental conditions by using multiple modalities (facial expressions, voice patterns, physiological signals).
[0004] Problems in the existing art include:
- Lack of Real-time Emotion Detection: Many existing ADAS lack robust real-time emotion detection capabilities, limiting their ability to respond to driver emotions effectively.
- Insufficient Integration of Multimodal Data: Current systems may not effectively integrate data from various sources, such as facial expressions, voice, and physiological signals, for comprehensive emotion assessment.
- Limited Personalization: Existing ADAS may not offer personalized driving experiences tailored to individual drivers' emotional states.
- Vulnerability to Varying Conditions: Emotion detection accuracy can be affected by varying environmental conditions, such as lighting and noise levels.
[0005] Known prior art includes:
[0006] Limited Single-Modality Systems: Some ADAS may use single-modality emotion detection, such as facial recognition or voice analysis, but these lack the comprehensiveness of a multimodal approach.
[0007] Basic Emotion Detection: Existing systems may offer basic emotion detection but lack the robustness and accuracy of the proposed invention, especially in varying conditions.
[0008] Limited Personalization: Current ADAS may not offer personalized driving experiences based on individual emotional states.
[0009] In light of the foregoing, there is a need for a model for driver emotion and health metrics assessment in an advanced driver assistance system that overcomes the problems prevalent in the prior art and in traditionally available methods and systems, and that can be used with the presently disclosed technique with or without modification.
[0010] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies, and the definition of that term in the reference does not apply.
OBJECTS OF THE INVENTION
[0011] The principal object of the present invention is to overcome the disadvantages of the prior art by providing a model for driver emotion and health metrics assessment in an advanced driver assistance system.
[0012] Another object of the present invention is to provide a model for driver emotion and health metrics assessment in an advanced driver assistance system that uses a multimodal approach for more accurate emotion detection.
[0013] Another object of the present invention is to provide a model for driver emotion and health metrics assessment in an advanced driver assistance system that offers greater robustness and accuracy compared to basic emotion detection systems, especially in varying conditions.
[0014] Another object of the present invention is to provide a model for driver emotion and health metrics assessment in an advanced driver assistance system that provides a more personalized driving experience compared to existing ADAS.
[0015] The foregoing and other objects of the present invention will become readily apparent upon further review of the following detailed description of the embodiments as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0016] The present invention relates to a model for driver emotion and health metrics assessment in an advanced driver assistance system.
[0017] Unique features of our solution include the following:
- Comprehensive Multimodal Approach: The invention integrates data from multiple modalities, including facial expressions, voice, heart rate, and blood pressure, for a more accurate and robust emotion assessment.
- Deep Neural Network Model: Employs a deep neural network model for advanced data analysis and emotion classification.
- Dynamic Adaptation: Offers dynamic adaptation of driving parameters based on individual emotional states, enhancing safety and personalization.
- Real-time Capability: Designed for real-time emotion detection, enabling timely responses to driver emotions.
[0018] While the invention has been described and shown with reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF DRAWINGS
[0019] So that the manner in which the above-recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
[0020] These and other features, benefits, and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
[0021] Figure 1 shows a program flow for the model for driver emotion and health metrics assessment in an advanced driver assistance system, in accordance with an exemplary embodiment of the present invention;
[0022] Figure 2 shows the proposed Sentient model for driver health assessment.
DETAILED DESCRIPTION OF THE INVENTION
[0023] While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, and the drawings are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and the detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims.
[0024] As used throughout this description, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein are solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers, or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles, and the like is included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
[0025] In this disclosure, whenever a composition or an element or a group of elements is preceded by the transitional phrase "comprising", it is understood that we also contemplate the same composition, element, or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element, or group of elements, and vice versa.
[0026] The present invention is described hereinafter by various embodiments with reference to the accompanying drawing, wherein reference numerals used in the accompanying drawing correspond to the like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only and are not intended to limit the scope of the claims. In addition, several materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary and are not intended to limit the scope of the invention.
[0027] The present invention relates to a model for driver emotion and health metrics assessment in an advanced driver assistance system. Unique features of our solution include the following; a minimal illustrative sketch of the multimodal classifier follows the list:
- Comprehensive Multimodal Approach: The invention integrates data from multiple modalities, including facial expressions, voice, heart rate, and blood pressure, for a more accurate and robust emotion assessment.
- Deep Neural Network Model: Employs a deep neural network model for advanced data analysis and emotion classification.
- Dynamic Adaptation: Offers dynamic adaptation of driving parameters based on individual emotional states, enhancing safety and personalization.
- Real-time Capability: Designed for real-time emotion detection, enabling timely responses to driver emotions.
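As a purely illustrative aid (not part of the filed specification), the following minimal PyTorch sketch shows one way the multimodal fusion and deep-neural-network classification described above could be organized. The feature dimensions, layer widths, and emotion label set are assumptions chosen for the example only.

```python
# Minimal sketch of a multimodal emotion classifier (illustrative only).
# Feature sizes, layer widths, and the label set are assumptions, not
# values taken from the specification.
import torch
import torch.nn as nn

EMOTIONS = ["neutral", "happy", "stressed", "fatigued", "angry"]  # assumed labels

class MultimodalEmotionNet(nn.Module):
    def __init__(self, face_dim=128, voice_dim=64, physio_dim=4):
        super().__init__()
        # One small encoder per modality: face embedding, voice embedding,
        # and a physiological vector (e.g., heart rate, blood pressure).
        self.face_enc = nn.Sequential(nn.Linear(face_dim, 64), nn.ReLU())
        self.voice_enc = nn.Sequential(nn.Linear(voice_dim, 32), nn.ReLU())
        self.physio_enc = nn.Sequential(nn.Linear(physio_dim, 16), nn.ReLU())
        # Fusion head: concatenate modality embeddings and classify.
        self.classifier = nn.Sequential(
            nn.Linear(64 + 32 + 16, 64),
            nn.ReLU(),
            nn.Linear(64, len(EMOTIONS)),
        )

    def forward(self, face, voice, physio):
        fused = torch.cat(
            [self.face_enc(face), self.voice_enc(voice), self.physio_enc(physio)],
            dim=1,
        )
        return self.classifier(fused)  # logits; apply softmax for probabilities

# Example forward pass on one synthetic sample.
model = MultimodalEmotionNet()
logits = model(torch.randn(1, 128), torch.randn(1, 64), torch.randn(1, 4))
print(EMOTIONS[int(logits.argmax(dim=1))])
```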
[0028] The present invention relates to a model for driver emotion and health metrics assessment in an advanced driver assistance system. Multimodal Emotion Detection System: the invention proposes "Sentient," a comprehensive multimodal system that integrates real-time data from various sensors:
- Face emotion detection cameras
- Speech emotion detection microphones
- Heart rate sensors
- Blood pressure sensors
- Deep Neural Network Model: Employs a deep neural network model to analyze multimodal data and accurately assess driver's emotions.
- Dynamic Adaptation: The system dynamically adjusts driving parameters based on individual emotional states, creating a personalized driving experience.
- Robust Emotion Detection: Addresses challenges in real-time emotion detection by utilizing facial expressions, voice patterns, and physiological signals for robust assessments (a sketch of the monitoring-and-adaptation loop follows this list).
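To make the interplay of continuous monitoring, emotion classification, and dynamic adaptation concrete, the sketch below shows one possible shape for such a loop. The sensor reader, the rule-based classifier stand-in, and the parameter adjustments are hypothetical placeholders, not the system as filed.

```python
# Illustrative monitoring-and-adaptation loop (hypothetical placeholders
# throughout; not the filed implementation).
import random
import time
from dataclasses import dataclass

@dataclass
class SensorFrame:
    face_features: list        # from the cabin camera
    voice_features: list       # from the microphone
    heart_rate_bpm: float
    blood_pressure: tuple      # (systolic, diastolic)

def read_sensors() -> SensorFrame:
    """Stand-in for synchronized capture from all four sensors."""
    return SensorFrame(
        face_features=[random.random() for _ in range(8)],
        voice_features=[random.random() for _ in range(8)],
        heart_rate_bpm=random.uniform(55, 120),
        blood_pressure=(random.uniform(100, 150), random.uniform(65, 95)),
    )

def classify_emotion(frame: SensorFrame) -> str:
    """Stand-in for the deep neural network classifier."""
    if frame.heart_rate_bpm > 100 or frame.blood_pressure[0] > 140:
        return "stressed"
    return "neutral"

def adapt_driving(emotion: str) -> dict:
    """Map the assessed state to example driving-parameter adjustments."""
    if emotion == "stressed":
        return {"speed_offset_kmh": -10, "following_distance_s": +0.5}
    if emotion == "fatigued":
        return {"suggest_break": True}
    return {}

if __name__ == "__main__":
    for _ in range(3):  # a few iterations in place of a continuous loop
        frame = read_sensors()
        emotion = classify_emotion(frame)
        print(emotion, adapt_driving(emotion))
        time.sleep(0.1)
```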
[0029] The present invention can be used in the following applications:
- Autonomous Vehicle Manufacturers: Integration into autonomous vehicles to enhance safety and user experience.
- ADAS Developers: Development of advanced driver assistance systems with emotion detection capabilities.
- Ride-Sharing Companies: Implementation in ride-sharing vehicles to improve passenger safety and comfort.
- Fleet Management: Use in fleet management systems to monitor driver emotions and prevent accidents.
[0030] Various modifications to these embodiments will be apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to embrace all such alternatives, modifications, and variations that fall within the scope of the present invention and the appended claims.
Claims: CLAIMS
We Claim:
1) A multimodal driver emotion and health metrics assessment system for an advanced driver assistance system (ADAS), the system comprising:
- a facial expression detection camera configured to capture and analyze the driver's facial expressions in real-time;
- a speech emotion detection microphone for detecting voice patterns and vocal characteristics indicative of the driver's emotional state;
- a heart rate sensor for monitoring the driver's heart rate;
- a blood pressure sensor for measuring the driver's blood pressure;
- a deep neural network model that processes and analyzes data from the facial expression, speech, heart rate, and blood pressure sensors to assess the driver's emotional state and health metrics.
2) The system as claimed in claim 1, wherein the deep neural network model is trained to classify the driver's emotions based on the combined input from the facial expression, voice, heart rate, and blood pressure data, thereby providing an accurate and robust emotion assessment.
3) The system as claimed in claim 1, wherein the system further comprises a dynamic adaptation module that adjusts driving parameters, such as speed, navigation settings, or vehicle behavior, based on the driver's emotional state, to enhance safety and personalization of the driving experience.
4) The system as claimed in claim 1, wherein the facial expression detection camera, speech emotion detection microphone, heart rate sensor, and blood pressure sensor are synchronized and operate in real-time to provide continuous monitoring of the driver's emotional and physiological states.
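Claim 4 recites that the four sensors are synchronized and operate in real time for continuous monitoring. The sketch below illustrates one assumed way to organize such sampling: each sensor is polled on its own thread at its own rate, and timestamped readings are collected in a shared queue for downstream alignment into multimodal frames. Sensor names, rates, and values are illustrative assumptions, not part of the claims.

```python
# Illustrative sketch of synchronized, continuous multi-sensor sampling.
# Sensor rates and readings are assumptions; real hardware drivers would
# replace the random placeholder values.
import queue
import random
import threading
import time

samples = queue.Queue()  # shared queue of (sensor_name, timestamp, value)

def poll(sensor_name, period_s, stop):
    """Poll one sensor at its own rate and enqueue timestamped readings."""
    while not stop.is_set():
        samples.put((sensor_name, time.monotonic(), random.random()))
        time.sleep(period_s)

stop = threading.Event()
threads = [
    threading.Thread(target=poll, args=("camera", 0.033, stop)),      # ~30 Hz
    threading.Thread(target=poll, args=("microphone", 0.010, stop)),  # ~100 Hz
    threading.Thread(target=poll, args=("heart_rate", 0.5, stop)),    # 2 Hz
    threading.Thread(target=poll, args=("blood_pressure", 1.0, stop)),
]
for t in threads:
    t.start()

time.sleep(1.0)  # let the sensors run briefly
stop.set()
for t in threads:
    t.join()

# Downstream, readings closest in time would be grouped into one multimodal
# frame for the classifier; here we simply count what was captured.
counts = {}
while not samples.empty():
    name, _, _ = samples.get()
    counts[name] = counts.get(name, 0) + 1
print(counts)
```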
Documents
Name | Date |
---|---|
202431087397-COMPLETE SPECIFICATION [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-DECLARATION OF INVENTORSHIP (FORM 5) [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-DRAWINGS [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-EDUCATIONAL INSTITUTION(S) [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-EVIDENCE FOR REGISTRATION UNDER SSI [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-FORM 1 [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-FORM FOR SMALL ENTITY(FORM-28) [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-FORM-9 [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-POWER OF AUTHORITY [12-11-2024(online)].pdf | 12/11/2024 |
202431087397-REQUEST FOR EARLY PUBLICATION(FORM-9) [12-11-2024(online)].pdf | 12/11/2024 |