
A MACHINE LEARNING-DRIVEN PROCESS FOR AUTOMATION IN ADAPTIVE AND IMMERSIVE VIRTUAL REALITY ENVIRONMENTS

ORDINARY APPLICATION

Published

Filed on 23 November 2024

Abstract

The present invention relates to a system and method for enhancing virtual reality (VR) interactions through intelligent automation powered by machine learning (ML). The system analyzes real-time user interaction data, including hand gestures, body movements, gaze direction, and voice inputs, to predict user preferences and behaviors. Using this analysis, the system dynamically adjusts virtual objects, user interfaces, and environmental attributes, such as lighting and sound, to optimize the immersive experience. Key features include automated object manipulation, personalized assistance via virtual agents, and reinforcement learning to improve adaptability over time. The invention is compatible with a range of VR hardware and facilitates a seamless, intuitive, and personalized user experience.

Patent Information

Application ID: 202431091326
Invention Field: COMPUTER SCIENCE
Date of Application: 23/11/2024
Publication Number: 48/2024

Inventors

Name | Address | Country | Nationality
Dr. Ranjit Barua | S/o. Mr. Tushar Kanti Barua, Assistant Professor, Department of Mechanical Engineering, Om Dayal Group of Institutions, Uluberia, Howrah - 711316, West Bengal, India. | India | India
Susmita Biswas | D/o. Mr. Samaranjan Biswas, Associate Professor, Department of Cyber Science & Technology, Brainware University, 398, Ramkrishnapur Road, Near Jagadighata Market, Barasat, Kolkata - 700125, West Bengal, India. | India | India
Dr. Sunil Kumar Tiwari | S/o. Mr. Gulab Prasad Tiwari, Associate Professor, Department of Industrial & Production Engineering, Institute of Engineering and Rural Technology, Engineering Degree Division, 26, Chaitham Lines, Prayagraj - 211002, Uttar Pradesh, India. | India | India
Dr. Himanshu Mishra | S/o. Mr. Satya Narayan Mishra, Assistant Professor, Department of Industrial & Production Engineering, Institute of Engineering and Rural Technology, Engineering Degree Division, 26, Chaitham Lines, Prayagraj - 211002, Uttar Pradesh, India. | India | India
Dr. Krishna Chandra Mishra | S/o. Mr. Daya Ram Mishra, Associate Professor, Department of Applied Sciences and Humanities, United College of Engineering and Research, Naini, Prayagraj - 211010, Uttar Pradesh, India. | India | India
Dr. Dhruv Kant Rahi | S/o. Mr. Rajju Ram, Assistant Professor, Department of Industrial & Production Engineering, Institute of Engineering and Rural Technology, Engineering Degree Division, 26, Chaitham Lines, Prayagraj - 211002, Uttar Pradesh, India. | India | India
Subhajit Roy | S/o. Late B. K. Roy, Research Scholar, Department of Electrical Engineering, National Institute of Technology, Silchar, and Ex-Head of the Department, BSC, at ISOAH Kolkata, Fakiratilla, Silchar - 788010, Assam, India. | India | India
Anju Neelam Bhagat | D/o. Dr. Ram Kishor Bhagat, Lecturer, Department of Computer Science and Engineering, Government Women's Polytechnic Ranchi, Tharpakhna, Ranchi - 834001, Jharkhand, India. | India | India
Nishant Kumar | S/o. Mr. Krishna Kumar Baitha, Research Scholar, Department of Mechanical Engineering, Birsa Institute of Technology, Sindri, Dhanbad - 828122, Jharkhand, India. | India | India
Gurpreet Singh | S/o. Mr. Jasbir Singh, Student, Department of Information Technology, Haldia Institute of Technology, Haldia, Paschim Bardhaman - 713325, West Bengal, India. | India | India
Saqlain Zarjis Ansari | S/o. Mr. M.D Hadis, Student, Department of Information Technology, Haldia Institute of Technology, Haldia, Paschim Bardhaman - 713325, West Bengal, India. | India | India

Applicants

Same as the inventors listed above.

Specification

Description:

[0021]. The following description provides specific details of certain aspects of the disclosure illustrated in the drawings, to give a thorough understanding of those aspects. It should be recognized, however, that the present disclosure can be embodied in additional aspects and may be practiced without some of the details set out below.
[0022]. The various aspects, including the example aspects, are now described more fully with reference to the accompanying drawings, in which the various aspects of the disclosure are shown. The disclosure may, however, be embodied in different forms and should not be construed as limited to the aspects set forth herein. Rather, these aspects are provided so that this disclosure is thorough and complete, and fully conveys the scope of the disclosure to those skilled in the art. In the drawings, the sizes of components may be exaggerated for clarity.
[0023]. It is understood that when an element or layer is referred to as being "on," "connected to," or "coupled to" another element or layer, it can be directly on, connected to, or coupled to the other element or layer, or intervening elements or layers may be present. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
[0024]. The subject matter of the example aspects, as disclosed herein, is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different features or combinations of features similar to the ones described in this document, in conjunction with other technologies.
[0025]. The present invention relates to a system and method for enhancing virtual reality (VR) interactions through intelligent automation powered by machine learning (ML). The system analyzes real-time user interaction data, including hand gestures, body movements, gaze direction, and voice inputs, to predict user preferences and behaviors. Using this analysis, the system dynamically adjusts virtual objects, user interfaces, and environmental attributes, such as lighting and sound, to optimize the immersive experience. Key features include automated object manipulation, personalized assistance via virtual agents, and reinforcement learning to improve adaptability over time. The invention is compatible with a range of VR hardware and facilitates a seamless, intuitive, and personalized user experience.
[0026]. Virtual reality (VR) technology has significantly advanced, enabling users to engage in immersive environments for entertainment, education, training, and other applications. However, current VR systems often fall short in delivering fully personalized and adaptive experiences. These systems typically rely on manual adjustments or pre-programmed settings, limiting their ability to respond dynamically to individual user behaviors and preferences. This lack of adaptability detracts from user engagement and immersion, making interactions less intuitive and efficient.
[0027]. Machine learning (ML) has emerged as a powerful tool capable of addressing these limitations. By analyzing large datasets and learning from user interactions in real time, ML algorithms can provide predictive insights and automate decision-making processes. When integrated into VR systems, ML can transform static virtual environments into adaptive, intelligent spaces that cater to the unique needs of each user.
[0028]. The integration of ML in VR enables automated customization of user interfaces, object interactions, and environmental settings. For instance, by analyzing gaze direction, hand gestures, or voice commands, an ML-powered VR system can predict user intentions and dynamically adjust the virtual environment to enhance usability and immersion.
[0029]. Despite its potential, the application of ML in VR environments remains underexplored, particularly in terms of real-time adaptability and personalization. The present invention addresses this gap by providing a system and method that uses ML to optimize VR interactions. This approach not only enhances user engagement but also enables more intuitive, seamless, and immersive experiences.
[0030]. The present invention provides a system and method for integrating machine learning (ML) into virtual reality (VR) environments to enhance user interactions through real-time automation and personalization. The system captures and analyzes user input data, such as hand gestures, body movements, gaze direction, and voice commands, using a combination of sensors embedded in VR hardware. This data is processed by a machine learning module, which uses advanced algorithms to predict user preferences, behavior, and intentions. Based on these predictions, the system dynamically adjusts virtual objects, environmental factors, and user interfaces, creating a seamless and intuitive VR experience.
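As a purely illustrative aid (the specification prescribes no data schema), the multimodal input stream described in paragraph [0030] could be represented as time-stamped samples along the following lines; the Python field names and types are assumptions.

    from dataclasses import dataclass, field
    from typing import Optional, Tuple
    import time

    @dataclass
    class InteractionSample:
        """One frame of multimodal VR input, per paragraph [0030]."""
        timestamp: float = field(default_factory=time.time)
        hand_pose: Tuple[float, float, float] = (0.0, 0.0, 0.0)      # tracked hand position
        body_velocity: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # movement estimate
        gaze_direction: Tuple[float, float, float] = (0.0, 0.0, 1.0) # unit view vector
        voice_command: Optional[str] = None                          # transcribed utterance, if any

    sample = InteractionSample(voice_command="open inventory")
    print(sample)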
[0031]. The invention comprises several key components: a data collection layer, a machine learning module, an automation engine, and a VR interaction layer. The data collection layer captures real-time user interaction data via motion tracking devices, gaze sensors, haptic feedback systems, and other input peripherals. This information is transmitted to the machine learning module, which employs supervised learning, reinforcement learning, and transfer learning techniques to interpret the data. Supervised learning trains the system to recognize standard interaction patterns, reinforcement learning refines its adaptability through trial and error, and transfer learning accelerates customization for new users by applying generalizable insights from prior interactions.
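The four components named in paragraph [0031] suggest a straight pipeline. The following is a minimal structural sketch under assumed class and method names (the patent prescribes no API); the learning module is reduced to a fixed stand-in prediction so the data flow stays visible.

    # Data collection -> ML module -> automation engine -> interaction layer.
    class DataCollectionLayer:
        def read(self):
            """Poll motion trackers, gaze sensors and microphones for one sample."""
            return {"gesture": "grab", "gaze_target": "door_handle", "voice": None}

    class MachineLearningModule:
        def predict(self, sample):
            """Map a raw sample to a predicted intention with a confidence score.
            A real module would combine the supervised, reinforcement and
            transfer learning described above; this is a stand-in."""
            return {"intent": "open_door", "confidence": 0.9}

    class AutomationEngine:
        def decide(self, prediction):
            """Turn a prediction into concrete environment adjustments."""
            if prediction["confidence"] > 0.8:
                return [("highlight_object", "door_handle")]
            return []

    class VRInteractionLayer:
        def apply(self, adjustments):
            for action, target in adjustments:
                print(f"render: {action} -> {target}")

    collector = DataCollectionLayer()
    ml = MachineLearningModule()
    engine = AutomationEngine()
    renderer = VRInteractionLayer()
    renderer.apply(engine.decide(ml.predict(collector.read())))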
[0032]. The processed data is fed into the automation engine, which serves as the decision-making core of the system. This engine determines how to adjust the VR environment to align with the user's preferences and enhance their experience. For instance, if a user frequently interacts with objects in a particular way, the system can anticipate these actions and streamline object manipulation by repositioning items or automating common tasks. Additionally, the engine can optimize environmental settings, such as adjusting lighting or ambient sound, to match user preferences without requiring manual intervention.
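One simple reading of the anticipation behaviour in paragraph [0032] is a frequency-based policy: once the user has handled an object often enough, the engine pre-positions it. The threshold and action names in this sketch are invented for illustration.

    from collections import Counter

    REPEAT_THRESHOLD = 3          # assumed cut-off, not specified in the patent
    interaction_counts = Counter()

    def on_interaction(object_id: str) -> list:
        """Record one interaction; return automation actions, if any."""
        interaction_counts[object_id] += 1
        if interaction_counts[object_id] >= REPEAT_THRESHOLD:
            # Anticipate the next use: move the object within easy reach.
            return [("reposition_near_user", object_id)]
        return []

    for _ in range(REPEAT_THRESHOLD):
        actions = on_interaction("paintbrush")
    print(actions)   # [('reposition_near_user', 'paintbrush')] after the third use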
[0033]. The VR interaction layer is where these adjustments are rendered, creating an adaptive and personalized virtual environment. This layer ensures that changes to the environment occur in real time, maintaining a seamless experience for the user. For example, if the system detects that the user is struggling to complete a task, it may provide contextual assistance through a virtual guide or adjust the difficulty level of the interaction. Similarly, if a user consistently prefers certain environmental settings, such as brighter lighting or quieter soundscapes, the system will automatically configure them to enhance comfort and immersion.
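The assistance trigger in paragraph [0033] could, under one plausible reading, be a sliding-window failure counter: repeated failed attempts at the same task within a short interval summon the virtual guide. The window length and attempt limit below are assumptions.

    from typing import Dict, List, Optional
    import time

    FAIL_LIMIT = 3      # assumed number of recent failures that triggers help
    WINDOW_S = 30.0     # assumed look-back window in seconds
    _failures: Dict[str, List[float]] = {}

    def report_failure(task_id: str, now: Optional[float] = None) -> bool:
        """Record a failed attempt; return True once help should be offered."""
        now = time.time() if now is None else now
        recent = [t for t in _failures.get(task_id, []) if now - t < WINDOW_S]
        recent.append(now)
        _failures[task_id] = recent
        return len(recent) >= FAIL_LIMIT

    assert not report_failure("untie_knot", now=0.0)
    assert not report_failure("untie_knot", now=5.0)
    assert report_failure("untie_knot", now=10.0)   # third recent failure -> show guide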
[0034]. One notable feature of the invention is its ability to provide context-aware assistance through virtual agents. These agents leverage the machine learning module to detect when the user encounters challenges and offer guidance or perform actions on their behalf. This reduces user frustration and improves task efficiency, particularly in complex or unfamiliar VR environments. Another significant feature is adaptive user interface management, which prioritizes frequently used tools or options, bringing them closer to the user for easier access.
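The adaptive interface management in paragraph [0034] amounts, in the simplest case, to usage-frequency ordering. In this sketch the "tool belt" is a flat list whose most-used entries surface first; the list model and tool names are illustrative assumptions.

    from collections import Counter
    from typing import List

    usage = Counter()

    def record_use(tool: str) -> None:
        usage[tool] += 1

    def arranged_toolbelt(tools: List[str]) -> List[str]:
        """Order tools by descending use; ties keep their original order."""
        return sorted(tools, key=lambda t: -usage[t])

    for tool in ["brush", "eraser", "brush", "picker", "brush", "eraser"]:
        record_use(tool)
    print(arranged_toolbelt(["picker", "eraser", "brush"]))
    # ['brush', 'eraser', 'picker'] -- frequent tools move within easy reach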
[0035]. The invention is designed to be compatible with a wide range of VR platforms, including tethered systems like HTC Vive and Oculus Rift, standalone systems such as Meta Quest, and peripherals like motion controllers and haptic gloves. This broad compatibility ensures that the system can be seamlessly integrated into existing VR ecosystems, making it versatile and accessible for various applications.
[0036]. The overall process flow of the invention begins with real-time data collection, followed by behavioral analysis performed by the machine learning module. The automation engine uses the insights gained to adjust the VR environment dynamically, with continuous feedback loops enabling ongoing refinement and learning. Over time, the system becomes more accurate in predicting user preferences and automating interactions, thereby enhancing the user experience.
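The collect-analyse-adjust-refine loop of paragraph [0036] compresses to a few lines. Here the learned "model" is a single brightness preference nudged toward noisy user feedback by an exponential moving average; a production system would use the learning machinery described above, and every number below is invented.

    import random

    preferred_brightness = 0.5   # current estimate of the user's preference
    LEARNING_RATE = 0.2          # assumed step size for the feedback update

    def sense_reaction() -> float:
        """Stand-in for sensed user feedback; the user's true taste here is 0.8."""
        return 0.8 + random.uniform(-0.05, 0.05)

    for step in range(20):
        applied = preferred_brightness                    # adjust the environment
        observed = sense_reaction()                       # continuous feedback loop
        preferred_brightness += LEARNING_RATE * (observed - applied)   # refine

    print(f"learned brightness ~ {preferred_brightness:.2f}")   # converges near 0.8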
[0037]. This invention transforms VR from a static, manual interface into a dynamic, user-centric system that adapts to individual needs in real time. By integrating ML into VR, the invention enables more immersive, efficient, and intuitive virtual experiences, paving the way for applications in gaming, education, healthcare, industrial training, and beyond.
[0038]. The invention described herein represents a significant advancement in the integration of machine learning (ML) and virtual reality (VR) technologies. By leveraging real-time user interaction data and employing advanced ML techniques, the system provides an adaptive, intelligent, and personalized VR experience. It dynamically adjusts virtual objects, environments, and user interfaces, offering seamless automation and enhanced usability.
[0039]. The ability of the system to predict user behavior, learn from feedback, and adapt over time ensures a more immersive and intuitive interaction. Features such as dynamic object manipulation, context-aware assistance, and environmental customization set this invention apart from traditional, static VR systems. Additionally, its compatibility with diverse hardware platforms broadens its applicability across industries, including entertainment, education, healthcare, and industrial training.
[0040]. By transforming VR into an adaptive and user-centric platform, this invention overcomes many of the limitations of current VR systems. It not only improves immersion and efficiency but also expands the scope of VR applications to meet the needs of a wider audience. This innovation paves the way for the next generation of VR systems, where environments evolve intelligently in response to individual user preferences, delivering unparalleled levels of engagement and personalization.

Claims:

1. A system for machine learning-driven automation in a virtual reality (VR) environment, comprising:
a) A machine learning module configured to analyze user interactions in real-time;
b) An automation engine that adjusts VR elements based on the analysis; and
c) A VR interaction layer for rendering an adaptive and personalized virtual environment.
2. The system as claimed in Claim 1, wherein the machine learning module is trained to predict user preferences using real-time data collected from sensors, including but not limited to motion tracking, hand gestures, gaze direction, and voice inputs.
3. The system as claimed in Claim 1, wherein the automation engine dynamically adjusts virtual objects, environmental settings (e.g., lighting, sound), and user interface elements to enhance user interaction and immersion.
4. The system as claimed in Claim 1, further comprising a context-aware virtual assistant configured to provide adaptive guidance and perform automated actions based on user interaction patterns.
5. The system as claimed in Claim 1, wherein the machine learning module uses reinforcement learning to improve prediction accuracy and interaction automation through iterative feedback.
6. A method for enhancing user interaction in a virtual reality environment, comprising:
a) Collecting real-time user interaction data using a combination of sensors embedded in VR devices;
b) Processing the collected data using a machine learning module to identify user behavior patterns and preferences;
c) Automatically adjusting the virtual environment, including object placement, interface layout, and environmental attributes, based on the processed data.
7. The method as claimed in Claim 6, wherein the adjustments to the VR environment include predictive modifications to object interactions, such as repositioning objects or automating user-selected actions.
8. The method as claimed in Claim 6, further comprising dynamically adapting the virtual environment based on continuous feedback from user actions and preferences detected during interaction.

Documents

Name | Date
202431091326-FORM-26 [25-11-2024(online)].pdf | 25/11/2024
202431091326-COMPLETE SPECIFICATION [23-11-2024(online)].pdf | 23/11/2024
202431091326-DRAWINGS [23-11-2024(online)].pdf | 23/11/2024
202431091326-ENDORSEMENT BY INVENTORS [23-11-2024(online)].pdf | 23/11/2024
202431091326-FORM 1 [23-11-2024(online)].pdf | 23/11/2024
202431091326-FORM 3 [23-11-2024(online)].pdf | 23/11/2024
202431091326-FORM-5 [23-11-2024(online)].pdf | 23/11/2024
202431091326-FORM-9 [23-11-2024(online)].pdf | 23/11/2024
