A RIDER ASSISTANCE SYSTEM, A DEVICE, AND A METHOD THEREOF

ORDINARY APPLICATION

Published

Filed on 20 November 2024

Abstract

An embodiment of the present invention relates to a rider assistance system (RAS) (100) that generates recommendations to enhance user safety and navigation. The RAS (100) includes a vehicle body frame (101) equipped with a transceiver (104), one or more cameras (102) to capture images of a vehicle (600) and its surrounding environment, and one or more sensors (103) to detect obstacles and sense vehicle and surrounding parameters. A rider assistance device (RAD) (200) with a processor (201) retrieves, filters, classifies and analyzes sensed data to provide recommendations. The system further includes communication means (106) for data exchange, an adaptive cruise control (ACC) module (107) to adjust vehicle parameters, an augmented reality (AR) display (105) to show recommendations, a virtual reality (VR) module (108) for training, and a data storage module (109) to store data using blockchain technology. The system is adaptable for saddle-ride or straddle-type vehicles.

Patent Information

Application ID: 202441090277
Invention Field: MECHANICAL ENGINEERING
Date of Application: 20/11/2024
Publication Number: 48/2024

Inventors

Name | Address | Country | Nationality
RAJU PATEL | Associate Professor Grade 1, School of Electronics Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India
NITISH KATAL | Assistant Professor Grade 2, School of Electronics Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India
UTKARSH KEDIA | UG Student, School of Computer Science and Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India
SUMEDH JOSHI | UG Student, School of Electronics Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India
SOHIL AGARWAL | UG Student, School of Mechanical Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India
YASHASHVI RAI | UG Student, School of Electronics Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India

Applicants

Name | Address | Country | Nationality
VELLORE INSTITUTE OF TECHNOLOGY, CHENNAI | Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India

Specification

Description:

TECHNICAL FIELD
[0001] The present invention relates to the field of advanced rider assistance systems (ARAS). More particularly, the present invention pertains to a rider assistance system, a rider assistance device, and a method thereof.

BACKGROUND
[0002] The following description of the related art is intended to provide background information pertaining to the field of the present invention. This section may include certain aspects of the art that may be related to various features of the present invention. However, it should be appreciated that this section is used only to enhance the understanding of the reader with respect to the present invention, and not as admissions of the prior art.
[0003] In recent years, there has been a significant increase in the integration of advanced safety features in automotive technology, especially with the advancement of rider assistance systems (hereinafter "RAS", "system" used interchangeably) and communication protocols that enhance safety, control, and overall riding experience. Traditional systems rely primarily on basic alerting mechanisms that are limited to either visual or auditory feedback, with little to no real-time adjustment to varying driving conditions. Moreover, safety systems on two-wheeler vehicles, such as motorcycles, generally lack sophisticated sensor integration, image processing capabilities, and real-time data analysis, which are essential for effective hazard detection and responsive feedback. Motorcycles, given their size, agility, and exposure to surrounding conditions, require a different approach to safety and rider assistance compared to conventional four-wheeled vehicles. This need is further highlighted by the distinct differences between saddle-ride-type vehicles (scooters and some small electric bikes) and straddle-type vehicles (motorcycles and dirt bikes), which experience varied dynamics and have different safety requirements.
[0004] Existing systems such as anti-lock braking systems (ABS) prevent wheel lock-up during braking, while traction control systems (TCS) reduce wheel slip during acceleration. Further, blind spot detection (BSD) alerts riders to vehicles in their blind spots. Conventional systems also include sensor-based detection, such as ultrasonic sensors used for proximity detection and parking assistance, and accelerometers that measure acceleration forces to detect sudden movements.
[0005] Most current systems are standalone and do not offer comprehensive integration of multiple technologies. Existing telematics and connectivity systems in this domain often lack robust data security measures, making them vulnerable to hacking and unauthorized access. Existing systems primarily offer reactive maintenance alerts rather than predictive capabilities that can forecast and prevent issues before they occur. Existing rider assistance systems also have limited capability to adapt to varying environmental conditions (e.g., weather changes) to optimize vehicle performance and rider safety. Many existing hazard detection systems for rider assistance do not provide real-time alerts or are limited to specific obstacles, such as only detecting blind spots or road obstacles. VR training systems implemented for riders are often not integrated with real-world riding data, limiting their effectiveness in real-life scenarios. Adaptive cruise control and other advanced features remain of limited availability for motorcycles, being more commonly found in cars. Moreover, existing systems lack the advanced V2X communication that would enable real-time interaction between the motorcycle, other vehicles, and infrastructure, limiting situational awareness and coordinated safety measures. While significant advancements have been made in rider assistance technologies, current systems are often fragmented and lack the integration necessary to provide a comprehensive safety and performance solution.
[0006] A prior art reference GB 2,610,446 A, titled "Navigation with drivable area detection" discloses an autonomous vehicle control method. It comprises receiving a point cloud from a depth sensor, receiving image data from a camera, predicting at least one label indicating a drivable area by applying machine learning to the image data, labeling the point cloud using at least one label, obtaining odometry information, generating a free region by registering the labeled point cloud and odometry information to a reference coordinate system, and governing a vehicle to drive within the drivable area. A neural network may apply pixel-based segmentation and the point cloud may be projected via pinhole projection. The vehicles or pedestrians may be identified by the method as obstructions.
[0007] Thus, there is a need in the art to provide a rider assistance system, a rider assistance device, and a method thereof.

OBJECTS OF THE PRESENT INVENTION
[0008] Some of the objects of the present invention, which at least one embodiment herein satisfies are as listed herein below.
[0009] It is an object of the present invention to provide a rider assistance system, a rider assistance system to generate one or more recommendations to a user, and a method thereof.
[0010] It is another object of the present invention to provide a rider assistance system (RAS) that enhances the safety and control of vehicles, specifically two-wheeler vehicles such as motorcycles.
[0011] It is another object of the present invention to incorporate multiple sensors and imaging devices into the vehicle to collect data on vehicle parameters and the surrounding environment, facilitating a comprehensive analysis of driving conditions.
[0012] It is another object of the present invention to enable real-time data processing and feature extraction to generate recommendations and alerts that improve rider awareness and decision-making.

SUMMARY
[0013] Within the scope of this application, it is expressly envisaged that the various aspects, embodiments, examples, and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.
[0014] In an aspect, a rider assistance system (RAS) is implemented to enhance the safety and situational awareness of a rider, specifically on two-wheeler vehicles such as motorcycles. The RAS comprises a vehicle body frame equipped with one or more cameras and sensors that capture real-time data associated with the vehicle's parameters and its surrounding environment. The data is transmitted to a rider assistance device (RAD) through a transceiver embedded in the vehicle body frame, allowing remote analysis of the captured data.
[0015] In another aspect, a rider assistance device (hereinafter "RAD") is disclosed. The RAD is equipped with a processor that performs various functions. The processor obtains images from the cameras and retrieves sensed parameters from the sensors. The processor filters the data to remove noise and classifies it based on relevance to the driving conditions. Key features such as lane detection, road surface condition, and blind spot detection are extracted to provide meaningful insights. Based on the extracted features, the processor generates one or more recommendations, which may include, but are not limited to, speed adjustment alerts, lane departure warnings, or collision avoidance alerts. The recommendations are displayed to the user on a display panel mounted on the vehicle.
[0016] In yet another aspect, the present invention discloses the method of generating one or more recommendations to the user using RAD. The method includes a step-wise illustration of how the method is implemented using the RAD to generate customised rider-specific recommendations based on real-time data received.
[0017] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0018] The accompanying drawings, which are incorporated herein and constitute a part of this invention, illustrate exemplary embodiments of the disclosed methods and systems, in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that such drawings include electrical components, electronic components, or circuitry commonly used to implement such components.
[0019] FIG. 1 illustrates an exemplary block diagram of a rider assistance system (RAS) to generate the recommendations to the user, in accordance with an embodiment of the present invention.
[0020] FIG. 2 illustrates an exemplary flowchart of the system architecture, in accordance with an embodiment of the present invention.
[0021] FIG. 3 illustrates an exemplary flowchart that illustrates the method of generating one or more recommendations to a user using RAS, in accordance with an embodiment of the present invention.
[0022] FIGs. 4 (a-d) illustrate an exemplary overview of flowcharts including a sensor data processing flowchart (a), a pothole detection and alert system flowchart (b), a V2X communication flowchart (c), and an adaptive cruise control module flowchart (d), in accordance with an embodiment of the present invention.
[0023] FIGs. 5 (a-c) illustrate an exemplary overview of the pothole detection system (a), the V2X communication setup (b), and the adaptive cruise control module setup (c), in accordance with an embodiment of the present invention.

DETAILED DESCRIPTION
[0024] In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent, however, that embodiments of the present invention may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
[0025] The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the invention. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth.
[0026] Various embodiments of the present invention will be explained in detail with respect to FIGs. 1-5.
[0027] FIG. 1 illustrates an exemplary block diagram (150) of a rider assistance system (RAS) (100) to generate recommendations to the user, in accordance with an embodiment of the present invention.
[0028] In a first embodiment of the present invention, the rider assistance system (RAS) (100) is disclosed. The RAS (100) includes the following components.
[0029] The vehicle body frame serves as the structural base for mounting various components of the RAS (100), ensuring secure positioning and optimal functionality. The vehicle body frame (101) houses one or more cameras (102), sensors (103), and the transceiver (104), providing physical stability and integration for data gathering and processing in the RAS (100).
[0030] In an exemplary implementation of the first embodiment, one or more cameras (102) mounted on the vehicle body frame (101) are positioned to capture images of the vehicle and its surrounding environment. In these embodiments, a first camera (102-1) is mounted on the front side of the vehicle body frame (101), and a second camera (102-2) on the rear side of the vehicle body frame. The configuration allows the cameras to capture comprehensive views of the road ahead and behind, contributing to a detailed assessment of the surroundings. The cameras (102) capture images that serve as primary data sources for detecting one or more obstacles associated with road conditions, and nearby vehicles.
[0031] In the exemplary implementation of the first embodiment, the captured images include but are not limited to road conditions, other vehicles, pedestrians, traffic signs, and environmental factors. The cameras may use advanced imaging technology, such as high-definition video capture or night vision capability, to enhance the clarity and accuracy of data captured under varying lighting and weather conditions to detect one or more obstacles.
[0032] In the exemplary implementation of the first embodiment, one or more sensors (103) (hereinafter terms "sensors", and "one or more sensors" used interchangeably) in the RAS (100) are configured to sense parameters associated with both the vehicle (600) and its surroundings. One or more sensors (103) are coupled to the vehicle body frame (101) and the cameras (102), configured to sense a variety of vehicle and surrounding parameters. The sensors (103) play a crucial role in enhancing the accuracy and scope of the system's hazard detection capabilities. The sensors may include but are not limited to, accelerometers, gyroscopes, proximity sensors, Hall effect sensors, radar or lidar sensors, ambient light sensors, ultrasonic sensors, biometric sensors, GPS sensors, environmental sensors, and pressure sensors. Upon the successful detection of obstacles from the images captured by the cameras (102), the sensors (103) measure additional vehicle parameters, such as speed, acceleration, and orientation, along with surrounding parameters like proximity to other vehicles or obstacles.
[0033] In the exemplary implementation of the first embodiment, the sensors (103) may sense parameters including but not limited to, speed, acceleration/deceleration, engine revolutions per minute (RPM), fuel level, battery level, tire pressure, steering angle, and brake status. For instance, a Hall effect sensor can be used to monitor RPM by detecting the magnetic field associated with the engine rotation.
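The Hall effect RPM monitoring mentioned above can be sketched as follows. This is a minimal illustration, assuming pulse timestamps from the sensor and one magnet pass per engine revolution; neither the function names nor the pulse geometry are specified in the patent.

```python
PULSES_PER_REV = 1  # assumption: one Hall pulse per engine revolution

def rpm_from_pulses(pulse_times):
    """Estimate engine RPM from a list of Hall effect pulse timestamps (seconds)."""
    if len(pulse_times) < 2:
        return 0.0
    # Total time spanned by the observed pulses
    span = pulse_times[-1] - pulse_times[0]
    # Number of full revolutions between the first and last pulse
    revs = (len(pulse_times) - 1) / PULSES_PER_REV
    return revs / span * 60.0

# Pulses every 20 ms -> 50 rev/s -> approximately 3000 RPM
print(rpm_from_pulses([0.00, 0.02, 0.04, 0.06]))
```

In practice the timestamps would come from an interrupt-driven counter on the sensor input rather than a Python list, but the rate computation is the same.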
[0034] In the exemplary implementation of the first embodiment, the surrounding environment is assessed through parameters including but not limited to, proximity to other vehicles, lane positioning, pedestrian detection, weather conditions, traffic signs and signals, road surface conditions, ambient light levels, and blind spots. For example, radar sensors can detect the proximity of other vehicles, while ambient light sensors adjust recommendations based on external lighting conditions.
[0035] In the exemplary implementation of the first embodiment, the transceiver (104) is embedded within the vehicle body frame (101). The transceiver (104) transmits the captured images and sensed parameters to the rider assistance device (RAD) (200). The transceiver (104) facilitates data transfer and enables real-time communication between the RAS (100) and external devices, which may include cloud servers, other vehicles, or infrastructure nodes for extended data analysis and feedback.
[0036] In the exemplary implementation of the first embodiment, the RAD (200) comprises a processor (201) communicably coupled to sensors (103), and a transceiver (104). The processor (201) executes a set of instructions to process data, classify information, and generate actionable recommendations.
[0037] The processor (201) retrieves sensed parameters from the sensors (103) based on the captured images and detected obstacles. The sensed parameters may include, for example, data related to road conditions, vehicle orientation, and relative distance to nearby objects. The processor (201) processes the retrieved parameters to apply a filtering operation on the data, isolating relevant data points that contribute to an accurate interpretation of the surroundings. This filtering step may involve removing redundant or irrelevant information from the retrieved parameters to ensure that only actionable data is retained for further processing.
[0038] Following the filtering process, the processor (201) classifies the filtered parameters by identifying specific features, such as object types, distances, and movements, to determine the nature of the detected obstacles. For instance, a detected hazard may be classified as a stationary obstacle, moving vehicle, or road surface anomaly. The classification operation enables the processor (201) to extract precise features from the sensed data, allowing for a more targeted analysis of the surrounding environment. The processor (201) then performs an analysis of the extracted features to generate recommendations that aid the user in navigating or responding to the detected obstacles. For example, if an obstacle is detected on the road ahead, the processor (201) may generate a recommendation to adjust speed or change lanes. The recommendations are transmitted to the user through a display panel, audio alert, or other suitable feedback mechanism integrated with the RAS (100).
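The retrieve-filter-classify-recommend flow described for the processor (201) can be sketched as below. All thresholds, field names, and class labels here are illustrative assumptions for the sake of the sketch; the specification does not prescribe particular values or rules.

```python
def filter_params(readings, max_range=200.0):
    """Filtering step: drop noisy or irrelevant readings (e.g., impossible distances)."""
    return [r for r in readings if 0.0 < r["distance_m"] <= max_range]

def classify(reading):
    """Classification step: label an obstacle by its relative speed (assumed rule)."""
    if abs(reading["rel_speed_mps"]) < 0.5:
        return "stationary_obstacle"
    return "moving_vehicle"

def recommend(reading, label):
    """Analysis step: map the classified feature to a recommendation."""
    if label == "stationary_obstacle" and reading["distance_m"] < 20.0:
        return "change_lane"
    if reading["distance_m"] < 10.0:
        return "reduce_speed"
    return "maintain"

readings = [
    {"distance_m": 15.0, "rel_speed_mps": 0.1},
    {"distance_m": 500.0, "rel_speed_mps": 3.0},  # out of range, filtered out
]
for r in filter_params(readings):
    print(recommend(r, classify(r)))  # change_lane
```

A production system would replace the hand-written rules with the trained ML models described later in the specification, but the staged pipeline structure is the same.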
[0039] In the exemplary implementation of the first embodiment, the communication means (106) enables the RAS (100) to establish connections with external networks, vehicles, and surrounding infrastructure. The communication means (106) may include, but is not limited to, vehicle-to-everything (V2X) communication, GPS, GNSS-based communication, Bluetooth, Wi-Fi, mobile data networks, RFID tags and readers, and DSRC. The communication means (106) allows the RAS (100) to receive and share data related to traffic, road conditions, and other environmental factors, enriching the recommendations generated by the RAD (200).
[0040] In the exemplary implementation of the first embodiment, the augmented reality (AR) display (105) is connected to the processor (201) and positioned either on the vehicle body frame (101) or on a helmet visor (101-1) to display recommendations generated by the RAS (100) directly to the user. The AR display (105) projects real-time information, such as navigation directions, hazard alerts, or speed warnings, within the user's line of sight, improving situational awareness without requiring the user to look away from the road. For instance, when an obstacle is detected on the road, the AR display (105) may show an overlay image of the obstacle with an indication of safe distances or suggest an alternative route. By providing information in a non-intrusive manner, the AR display (105) enhances user experience and promotes safer riding practices.
[0041] In the exemplary implementation of the first embodiment, the virtual reality (VR) module (108) is also connected to the processor (201) and configured to offer one or more training sessions related to vehicle (600) riding techniques to the user. The VR module (108) immerses the user in a simulated environment, replicating real-world riding conditions and challenges. Through these sessions, the user can develop familiarity with advanced riding skills, such as emergency braking, lane changes, and obstacle avoidance, in a controlled, risk-free environment. In some embodiments, the VR module (108) may allow the user to replay previous rides and analyze performance metrics, enabling improvement in riding habits. In an embodiment, the VR module (108) serves as an educational tool, especially beneficial for novice riders or users practicing new riding techniques.
[0042] In the exemplary implementation of the first embodiment, the data storage module (109) is connected to the processor (201) and configured to store captured images, sensed parameters, and generated recommendations. The data storage module (109) utilizes blockchain technology to secure and maintain the integrity of the stored data. By storing data in a decentralized and tamper-resistant format, blockchain technology ensures that the data within the RAS (100) remains reliable and transparent, providing a verifiable record of interactions and recommendations made by the system.
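The tamper-resistant storage property attributed to the data storage module (109) can be illustrated with a minimal hash chain. This is only a sketch of the integrity-verification idea; an actual deployment would use a full blockchain or distributed ledger, and the record fields shown are assumptions.

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record, chaining its hash to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"record": record, "prev": prev_hash, "hash": block_hash})

def verify(chain):
    """Recompute every hash; any altered record breaks the chain."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps(block["record"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
    return True

chain = []
add_block(chain, {"speed_kmh": 42, "alert": "pothole"})
add_block(chain, {"speed_kmh": 40, "alert": None})
print(verify(chain))                   # True
chain[0]["record"]["speed_kmh"] = 99   # tamper with a stored reading
print(verify(chain))                   # False: tampering is detected
```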
[0043] In the exemplary implementation of the first embodiment, the adaptive cruise control (ACC) module (107) in the RAS (100) is designed to set specific parameters of the vehicle based on recommendations generated by the processor (201). The ACC module (107) is connected to the processor (201), and includes a throttle actuator, brake actuator, or steering actuator. Each actuator is configured to execute control commands in response to real-time data and recommendations, ensuring optimal vehicle performance and safety.
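A minimal sketch of the ACC module's (107) speed-setting behaviour is given below, assuming a simple proportional rule on the gap to the lead vehicle and the 20 to 150 km/h operating range stated in the fourth embodiment. The gain, target gap, and function names are illustrative assumptions, not values from the specification.

```python
def acc_command(distance_m, speed_kmh, target_gap_m=30.0,
                min_speed=20.0, max_speed=150.0):
    """Return (action, new_speed_kmh) from the current gap to the lead vehicle."""
    error = distance_m - target_gap_m
    adjustment = 0.5 * error          # proportional gain (assumed)
    new_speed = max(min_speed, min(max_speed, speed_kmh + adjustment))
    if new_speed < speed_kmh:
        action = "brake"              # gap too small: brake actuator engages
    elif new_speed > speed_kmh:
        action = "throttle"           # gap has opened: throttle actuator engages
    else:
        action = "hold"
    return action, new_speed

print(acc_command(20.0, 80.0))   # ('brake', 75.0)
print(acc_command(40.0, 80.0))   # ('throttle', 85.0)
```

A real controller would use a time-based gap and smoother (e.g. PID) control, but the mapping from sensed distance to actuator command is the essential idea.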
[0044] Working Example 1: The RAS (100) is implemented in a straddle-type motorcycle. A front-mounted camera captures the road view ahead, while rear-mounted sensors detect the proximity of approaching vehicles. As the user rides, the RAD (200) collects data, processes lane markers, and generates lane departure warnings displayed on the screen. The ACC module (107) adjusts the motorcycle's throttle and braking based on the detected traffic density and recommended speed adjustments.
[0045] To summarise, the RAS (100) enhances user awareness and adapts vehicle controls to surrounding conditions, promoting safer and more efficient rides.
[0046] FIG. 2 illustrates an exemplary flowchart of the system architecture (250), in accordance with an embodiment of the present invention.
[0047] In a second embodiment of the present invention, the RAS (100) is designed with a multi-layered architecture that integrates various technologies and components to provide comprehensive rider assistance. The system (100) comprises sensors (103), processor (201), communication means (106), display panel (105), and set of instructions stored in the processor (201) working in harmony.
[0048] In an exemplary implementation of the second embodiment, the sensors (103) include, but are not limited to, LIDAR (light detection and ranging), ultrasonic sensors, accelerometers, piezoelectric sensors, and cameras.
[0049] In an exemplary implementation of the second embodiment, the processors (201) include a central processing unit (CPU), a graphics processing unit (GPU), and artificial intelligence (AI) or machine learning (ML) accelerators. The CPU handles overall system management and data processing. The GPU processes augmented reality (AR)/virtual reality (VR) graphics and the real-time video feed from the cameras (102). The AI/ML accelerator is dedicated hardware for accelerating the AI and machine learning algorithms used in predictive maintenance and hazard detection.
[0050] In an exemplary implementation of the second embodiment, the communication means (106) includes, but is not limited to, V2X (vehicle-to-everything) communication, Bluetooth, and Wi-Fi.
[0051] In an exemplary implementation of the second embodiment, the display panel (105) includes, but is not limited to, an AR heads-up display (HUD) and a VR module. The AR HUD projects critical information such as navigation prompts, hazard warnings, and speed limits onto a transparent display, ensuring the rider's eyes remain on the road. The VR module provides immersive training simulations for rider education and skill enhancement.
[0052] In an exemplary implementation of the second embodiment, the system (100) works on a combination of various technologies. The ML module analyzes sensor data for predictive maintenance, hazard detection, and rider behavior analysis. The blockchain technology ensures secure and tamper-proof data management and prevents unauthorized access and data hacking.
[0053] FIG. 3 illustrates an exemplary flowchart of the method (300) of generating one or more recommendations to a user using the RAS (100), in accordance with an embodiment of the present invention.
[0054] In a third embodiment of the present invention, the method (300) of generating one or more recommendations to a user using the RAS (100) is disclosed. The method (300) includes the following steps.
[0055] At block 301, the method (300) begins with the capturing (301) of one or more images associated with the vehicle (600) and its surrounding environment. This is achieved by one or more cameras (102) mounted on the vehicle body frame (101). The cameras (102) are strategically positioned to capture a comprehensive view of the surrounding environment, allowing the system to detect any existing or potential obstacles on the road.
[0056] At block 302, upon capturing the images, one or more sensors (103) sense (302) the vehicle parameters and surrounding parameters in response to the obstacles detected in the environment around the vehicle (600). The sensors (103) are coupled to the vehicle body frame (101) and the cameras (102), enabling the system (100) to react dynamically to the detected or potential obstacles.
[0057] At block 303, after sensing the required parameters, the transceiver (104) transmits (303) the sensed data to the RAD (200) for further analysis. The transceiver (104) allows the RAS (100) to send real-time information regarding detected obstacles and surrounding conditions to the RAD (200).
[0058] At block 304, the processor (201) of the RAD (200) retrieves (304) the sensed parameters from the transceiver (104). The retrieval process focuses on parameters related to the identified obstacles captured in the images.
[0059] At block 305, the processor (201) then processes the retrieved parameters to filter out irrelevant data, ensuring that only pertinent information is utilized in further analysis. The filtering process involves eliminating noise and unwanted information, thereby refining the data to include only elements that are relevant to the detected obstacles.
[0060] At block 306, once the parameters have been filtered, the processor (201) classifies (306) the filtered parameters to extract specific features relevant to generating recommendations. The classification of data involves organizing the data based on predefined categories, which may include types of obstacles or traffic conditions.
[0061] At block 307, the processor (201) analyzes the classified features to generate recommendations based on the identified obstacles and surrounding conditions. The analysis involves assessing the extracted features to determine the appropriate recommendation to the user.
[0062] To summarise, the method (300) involves the interaction of various components mounted on a vehicle body frame (101), including one or more cameras (102), sensors (103), and a transceiver (104), which work collectively with a processor (201) in a rider assistance device (RAD) (200).
[0063] FIGs. 4 (a-d) illustrate an exemplary overview of flowcharts (450) including a sensor data processing flowchart (a), a pothole detection and alert system flowchart (b), a V2X communication flowchart (c), and an adaptive cruise control module flowchart (d), in accordance with an embodiment of the present invention.
[0064] Referring to FIG. 4a, the flowchart represents the stages of sensor data processing, from collection to decision-making.
[0065] Referring to FIG. 4b, the flowchart represents how the system (100) detects potholes using sensors (103) and issues alerts or actions.
[0066] Referring to FIG. 4c, the motorcycle V2X unit transmits and receives data related to the vehicle's surroundings. Infrastructure transceivers (104), such as traffic lights and road signs, interact with the vehicle (600). Other vehicles' V2X units, i.e., other vehicles equipped with V2X communication, share real-time data with the system.
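The V2X data exchange of FIG. 4c can be illustrated with a minimal status message that a motorcycle unit might broadcast and a nearby unit decode. The field names and encoding are assumptions for illustration only and do not correspond to any standardized V2X message set such as SAE J2735.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class V2XStatus:
    vehicle_id: str
    lat: float
    lon: float
    speed_kmh: float
    hazard: Optional[str] = None  # e.g. "pothole", relayed to nearby vehicles

def encode(msg: V2XStatus) -> bytes:
    """Serialize the status message for broadcast over the V2X link."""
    return json.dumps(asdict(msg)).encode()

def decode(raw: bytes) -> V2XStatus:
    """Reconstruct a status message received from another unit."""
    return V2XStatus(**json.loads(raw))

msg = V2XStatus("MC-01", 12.8406, 80.1534, 62.0, hazard="pothole")
assert decode(encode(msg)) == msg  # the message round-trips losslessly
```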
[0067] Further, referring to FIG. 4d, the adaptive cruise control module (107) flowchart outlines how the adaptive cruise control module (107) works, adjusting speed based on sensor inputs.
[0068] FIGs. 5 (a-c) illustrate an exemplary overview (550) of the pothole detection system (a), the V2X communication setup (b), and the adaptive cruise control module setup (c), in accordance with an embodiment of the present invention.
[0069] Referring to FIG. 5a, the pothole detection system includes piezoelectric sensors, a data processor, an alert system, and a V2X transmitter.
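The piezoelectric detection stage of FIG. 5a can be sketched as a simple threshold test on pressure spikes. The trigger level below is an assumed value within the 0.1 to 5 bar pressure-change range stated in the fourth embodiment; the specification does not fix a particular threshold.

```python
POTHOLE_PRESSURE_BAR = 0.8  # assumed trigger level within the 0.1-5 bar range

def detect_pothole(pressure_deltas):
    """Return indices of samples whose pressure spike suggests a pothole impact."""
    return [i for i, p in enumerate(pressure_deltas)
            if p >= POTHOLE_PRESSURE_BAR]

# Two spikes among otherwise small road vibrations
samples = [0.05, 0.10, 1.40, 0.20, 2.10]
print(detect_pothole(samples))  # [2, 4]
```

A detected index would then be passed to the alert system and V2X transmitter so that following vehicles can be warned of the pothole location.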
[0070] Referring to FIG. 5b, the V2X communication setup includes a motorcycle V2X unit and external vehicle/infrastructure transceivers.
[0071] Referring to FIG. 5c, the adaptive cruise control module setup includes speed/distance sensors, the processor (201), and throttle/brake actuators. The sensors (103), positioned at the front of the motorcycle, i.e., the vehicle (600), are connected to the processor (201), which in turn is connected to the throttle/brake system.
[0072] In a fourth embodiment of the present invention, the broad workable ranges for various parameters of the RAS (100) are disclosed.
[0073] In an exemplary implementation of the fourth embodiment, the rider assistance system (100) integrates a range of advanced technologies to enhance safety and functionality. V2X communication supports data exchange within a 50 to 500-meter range at speeds of 1 to 10 Mbps, enabling vehicle-to-everything interactions. The AR navigation system offers a 20 to 50-degree field of view and brightness between 500 and 1000 nits, updating navigation at frequencies from 1 to 10 Hz for optimal visibility. The ACC module (107) manages speeds between 20 and 150 km/h and maintains a safe following distance from 1 to 100 meters. For pothole detection, piezoelectric sensors detect pressure changes from 0.1 to 5 bar, identifying potholes between 10 and 50 cm in diameter.
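The pothole-detection thresholds above translate directly into a simple range check. The helper below is a minimal sketch assuming a single pressure reading and a single diameter estimate per event; a real detector would filter and fuse many sensor readings:

```python
def is_pothole(pressure_change_bar, estimated_diameter_cm):
    """Flag a pothole when both the piezoelectric pressure change and the
    estimated diameter fall inside the workable ranges stated in the
    specification: 0.1 to 5 bar and 10 to 50 cm."""
    return (0.1 <= pressure_change_bar <= 5.0
            and 10.0 <= estimated_diameter_cm <= 50.0)
```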
[0074] In the exemplary implementation of the fourth embodiment, ultrasonic sensors support blind spot monitoring within a 0.5 to 2-meter range, operating at 40 to 100 kHz. Blockchain technology manages secure data with block sizes of 1 to 10 MB, processed every 10 seconds to 1 minute. Cloud-sourced data integration updates in real-time at 1 to 5 Hz and stores data from 10 to 100 GB. A high field-of-view rear-view camera captures images across 120 to 170 degrees in 720p to 1080p HD resolution. The adaptive environmental response system adjusts settings based on temperatures from -20°C to 70°C, modifying suspension stiffness, engine response, and traction control. Hazard detection uses LIDAR for distances of 10 to 200 meters with ±0.1-meter accuracy. Accelerometers measure forces up to ±16 g with a 12-bit to 16-bit resolution. Lastly, sensor integration operates on a 12V to 24V power supply, consuming between 10 W and 50 W.
[0075] In a fifth embodiment of the present invention, the rider assistance system (RAS) (100) includes various monitoring and sensing technologies for enhanced performance and safety. The tire pressure monitoring system (TPMS) checks tire pressure from 30 to 50 psi with ±1 psi accuracy, updating every 1 to 5 seconds. Accelerometers and gyroscopes measure acceleration and rotational speed within ranges of ±2g to ±16g and ±250°/s to ±2000°/s, respectively, updating at frequencies between 50 and 500 Hz. Hall effect sensors track wheel speed from 0 to 10,000 rpm with ±0.1% accuracy, updating 10 to 1000 times per second. Ultrasonic sensors detect objects within 0.2 to 4 meters with ±1 cm accuracy, updating every 50 to 200 milliseconds, while LIDAR sensors cover distances from 0.1 to 100 meters with ±2 cm precision, refreshing at 10 to 1000 Hz. Biometric sensors monitor heart rate between 30 and 200 bpm, updating every 1 to 5 seconds with ±1 bpm accuracy. Finally, environmental sensors capture temperature (-40°C to +85°C) and humidity (0% to 100%) with accuracies of ±0.5°C and ±3%, updating every 1 to 10 seconds.
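The workable ranges of the fifth embodiment lend themselves to a validation step that discards implausible readings before analysis. The dictionary layout and helper below are illustrative assumptions; only the numeric ranges come from the specification:

```python
# Workable sensor ranges taken from the fifth embodiment; the keys are
# illustrative names, not identifiers from the specification.
SENSOR_RANGES = {
    "tire_pressure_psi": (30, 50),
    "wheel_speed_rpm": (0, 10_000),
    "heart_rate_bpm": (30, 200),
    "temperature_c": (-40, 85),
    "humidity_pct": (0, 100),
}

def validate(reading_type, value):
    """Return True when a reading lies inside its workable range, so
    out-of-range values can be dropped before feature extraction."""
    lo, hi = SENSOR_RANGES[reading_type]
    return lo <= value <= hi
```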
[0076] In a sixth embodiment of the present invention, the broad and workable ranges for machine learning and AI parameters, AR and VR parameters, and blockchain parameters are illustrated. The machine learning and AI components operate with data input frequencies ranging from 1 to 1000 Hz, depending on sensor type, and require large datasets, often gigabytes or terabytes in size, for model training. Real-time processing targets latencies under 100 milliseconds, with predictive models aiming for 85% to 95% accuracy in tasks like hazard detection. The augmented reality (AR) and virtual reality (VR) systems offer fields of view from 20° to 40° for AR and 90° to 120° for VR, with resolutions of 800 x 600 to 4320 x 2160 pixels and refresh rates between 60 and 120 Hz for seamless visuals. The blockchain parameters include block sizes of 1 to 2 MB, transaction throughput of 10 to 1000 transactions per second, and confirmation times from 1 to 10 seconds.
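As a sketch of how the stated blockchain parameters constrain the data storage module (109), the snippet below chains records into hash-linked blocks capped at the 2 MB upper block size. The record format and the SHA-256 hashing scheme are assumptions for illustration, not the claimed implementation:

```python
import hashlib
import json

MAX_BLOCK_BYTES = 2 * 1024 * 1024  # upper end of the 1-2 MB block size range

def make_block(prev_hash, records):
    """Build one hash-linked block of sensed data (illustrative sketch)."""
    payload = json.dumps(records, sort_keys=True).encode()
    if len(payload) > MAX_BLOCK_BYTES:
        raise ValueError("block exceeds the 2 MB limit")
    block_hash = hashlib.sha256(prev_hash.encode() + payload).hexdigest()
    return {"prev_hash": prev_hash, "records": records, "hash": block_hash}

genesis = make_block("0" * 64, [{"speed_kmh": 62, "tire_pressure_psi": 36}])
nxt = make_block(genesis["hash"], [{"alert": "pothole", "severity": "high"}])
```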
[0077] While considerable emphasis has been placed herein on the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the invention. It is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.

ADVANTAGES OF THE PRESENT INVENTION
[0078] The present invention provides a rider assistance system (RAS), a rider assistance device (RAD) to generate recommendations to a user and a method thereof.
[0079] The present invention provides a rider assistance system that lowers accident rates and leads to reduced insurance premiums for riders, provides financial benefits, and further encourages the adoption of the technology.
[0080] The present invention provides a rider assistance system that incorporates a predictive maintenance algorithm that can anticipate and address issues before they become critical, leading to lower repair costs and extending the lifespan of components.
[0081] The present invention provides a rider assistance system that offers additional services such as real-time traffic data, advanced navigation, and personalized riding analytics through subscription models which can generate recurring revenue.
[0082] The present invention provides a rider assistance system that can create significant value, drive revenue growth, and establish a strong position in the motorcycle technology market.

Claims:
1. A rider assistance system (RAS) (100) to generate one or more recommendations to a user, the RAS (100) comprising:
a vehicle body frame (101) having a transceiver (104);
one or more cameras (102) coupled to the vehicle body frame (101), the one or more cameras (102) are configured to capture one or more images associated with a vehicle (600) and a surrounding environment of the vehicle (600);
one or more sensors (103) coupled to the vehicle body frame (101) and the one or more cameras (102), the one or more sensors (103) are configured to sense one or more vehicle parameters and one or more surrounding parameters upon successfully detecting one or more obstacles based on the one or more captured images; and
a rider assistance device (RAD) (200) having a processor (201), the processor (201) communicably coupled to the one or more sensors (103), and the transceiver (104), wherein the processor (201) is configured to:
retrieve the one or more sensed parameters;
process the one or more retrieved parameters to filter the one or more processed parameters;
classify the one or more filtered parameters to extract one or more features; and
analyse the one or more extracted features to generate one or more recommendations to the user.
2. The RAS (100) as claimed in claim 1, wherein the vehicle (600) is a saddle-ride type vehicle or a straddle-type vehicle.
3. The RAS (100) as claimed in claim 1, further comprising:
a transceiver (104) adapted to transmit the captured images and the one or more sensed parameters to the RAD (200), wherein the transceiver (104) is embedded in the vehicle body frame (101) or adapted to be communicably coupled to the processor (201);
a communication means (106) configured to enable communication between the vehicle (600), the surrounding environment of the vehicle (600), and the RAD (200), wherein the communication means (106) are selected from any or a combination of vehicle-to-everything (V2X) communication, GPS and global navigation satellite system (GNSS)-Based communication, Bluetooth, Wi-Fi communication, mobile data networks, radio-frequency identification (RFID) tags and readers, or dedicated short-range communication (DSRC);
an adaptive cruise control (ACC) module (107) operatively coupled to the processor (201), the ACC module (107) is configured to set the one or more vehicle parameters based on the one or more generated recommendations, wherein the ACC module (107) comprises one or more actuators (107-1) selected from any or a combination of a throttle actuator, a brake actuator, or a steering actuator;
an augmented reality (AR) display (105) operatively coupled to the processor (201) and positioned on the vehicle body frame (101) or on a helmet visor (101-1), the AR display (105) is configured to display the one or more generated recommendations to the user;
a virtual reality (VR) module (108) operatively coupled to the processor (201), the VR module (108) is configured to provide one or more training sessions associated with riding the vehicle (600) to the user; and
a data storage module (109) operatively coupled to the processor (201), the data storage module (109) is configured to store the one or more captured images, the one or more sensed parameters, and the one or more generated recommendations through a blockchain technology.
4. The RAS (100) as claimed in claim 1, wherein:
the one or more cameras (102) comprises a first camera (102-1) being positioned on a front side of the vehicle body frame (101) and a second camera (102-2) being positioned on a rear side of the vehicle body frame (101); and
the processor (201) is selected from any or a combination of an artificial intelligence (AI) accelerator, a machine learning (ML) accelerator, a graphics processing unit (GPU), a field-programmable gate array (FPGA), a digital signal processor (DSP), a tensor processing unit (TPU), a neuromorphic chip, or an application-specific integrated circuit (ASIC).
5. The RAS (100) as claimed in claim 1, wherein:
the one or more sensors (103) are selected from any or a combination of accelerometer, gyroscope, proximity sensors, hall effect sensors, radar or lidar sensors, ambient light sensor, ultrasonic sensors, biometric sensors, global positioning system (GPS) sensor, environmental sensors or pressure sensor, wherein at least one pressure sensor is positioned in at least one tire of the vehicle (600);
the one or more vehicle parameters are selected from any or a combination of speed, acceleration/deceleration, engine revolutions per minute (RPM), fuel level, battery level, tire pressure, steering angle, or brake status;
the one or more surrounding parameters are selected from any or a combination of proximity to other vehicles, lane positioning, pedestrian detection, weather conditions, traffic signs and signals, road surface conditions, ambient light levels, or blind spots; and
the one or more obstacles are selected from any or a combination of physical road obstacles including potholes, debris, speed bumps, or other vehicles, environmental conditions including wet or icy patches or low visibility areas, traffic signals and road signs including stop signs or traffic lights, or pedestrian crossings.
6. The RAS (100) as claimed in claim 1, wherein:
the one or more features are selected from any or a combination of lane detection, distance to leading vehicle, road surface condition detection, traffic sign recognition, blind spot detection, vehicle speed and acceleration monitoring, collision warning, turn and curve detection, traffic density estimation, pedestrian detection, or object detection; and
the one or more recommendations comprises one or more suggestions or one or more alerts, wherein the one or more recommendations are selected from any or a combination of speed adjustment recommendations, lane departure warning, collision avoidance alert, blind spot warning, curve speed warning, traffic congestion alert, weather condition recommendations, pedestrian warning, safe following distance, fatigue alert, obstacle detection alert, or turn signal reminder.
7. A rider assistance device (RAD) (200) to generate one or more recommendations to a user, the RAD (200) comprising:
a processor (201) configured to:
retrieve one or more parameters, the one or more parameters comprises one or more vehicle parameters and one or more surrounding parameters;
process the one or more retrieved parameters to filter the one or more processed parameters;
classify the one or more filtered parameters to extract one or more features; and
analyse the one or more extracted features to generate one or more recommendations to the user.
8. The RAD (200) as claimed in claim 7, further comprising:
a vehicle body frame (101);
one or more cameras (102) coupled to the vehicle body frame (101), the one or more cameras (102) are configured to capture one or more images associated with a vehicle (600) and a surrounding environment of the vehicle (600); and
one or more sensors (103) coupled to the vehicle body frame (101) and the one or more cameras (102), the one or more sensors (103) are configured to sense the one or more parameters, the one or more parameters comprises one or more vehicle parameters and one or more surrounding parameters upon successfully detecting one or more obstacles based on the one or more captured images; and
a transceiver (104) embedded in the vehicle body frame (101) or adapted to be communicably coupled to the processor (201), the transceiver (104) adapted to transmit the captured images and the one or more sensed parameters to the rider assistance device (RAD) (200).
9. The RAD (200) as claimed in claim 7, wherein:
the one or more vehicle parameters are selected from any or a combination of speed, acceleration/deceleration, engine revolutions per minute (RPM), fuel level, battery level, tire pressure, steering angle, or brake status; and
the one or more surrounding parameters are selected from any or a combination of proximity to other vehicles, lane positioning, pedestrian detection, weather conditions, traffic signs and signals, road surface conditions, ambient light levels, or blind spots.
10. A method (300) for generating one or more recommendations to a user, the method (300) comprising:
capturing (301), by one or more cameras (102), one or more images associated with a vehicle (600) and a surrounding environment of the vehicle (600), wherein the one or more cameras (102) are coupled to a vehicle body frame (101);
sensing (302), by one or more sensors (103), one or more vehicle parameters and one or more surrounding parameters upon successful capturing of one or more obstacles from the surrounding environment of the vehicle (600), wherein the one or more sensors (103) are coupled to the vehicle body frame (101) and the one or more cameras (102);
transmitting (303), by a transceiver (104), the sensed parameters to a rider assistance device (RAD) (200), the transceiver (104) is embedded in the vehicle body frame (101) or adapted to be communicably coupled to a processor (201) of the RAD (200);
retrieving (304), by the processor (201), the one or more sensed parameters based on the one or more captured obstacles;
processing (305), by the processor (201), the one or more retrieved parameters to filter the one or more processed parameters;
classifying (306), by the processor (201), the one or more filtered parameters to extract one or more features; and
analyzing (307), by the processor (201), the one or more extracted features to generate one or more recommendations to the user.
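The claimed chain of steps (304) to (307), retrieving, filtering, classifying into features, and analysing into recommendations, can be sketched end to end as follows. The thresholds and feature names below are illustrative assumptions, not limitations of the claims:

```python
def generate_recommendations(sensed):
    """Sketch of the claimed processing chain, steps (304) to (307):
    retrieve the sensed parameters, filter them, classify the result
    into features, and analyse the features into recommendations.
    Thresholds and feature names are illustrative assumptions."""
    # Step (305): filter out missing readings.
    filtered = {k: v for k, v in sensed.items() if v is not None}
    # Step (306): classify the filtered parameters into features.
    features = {
        "close_following": filtered.get("lead_distance_m", 100.0) < 10.0,
        "overspeed": filtered.get("speed_kmh", 0.0) > 120.0,
    }
    # Step (307): analyse the features to generate recommendations.
    recs = []
    if features["close_following"]:
        recs.append("increase following distance")
    if features["overspeed"]:
        recs.append("reduce speed")
    return recs
```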

Documents

Name | Date
202441090277-FORM-8 [25-11-2024(online)].pdf | 25/11/2024
202441090277-COMPLETE SPECIFICATION [20-11-2024(online)].pdf | 20/11/2024
202441090277-DECLARATION OF INVENTORSHIP (FORM 5) [20-11-2024(online)].pdf | 20/11/2024
202441090277-DRAWINGS [20-11-2024(online)].pdf | 20/11/2024
202441090277-EDUCATIONAL INSTITUTION(S) [20-11-2024(online)].pdf | 20/11/2024
202441090277-EVIDENCE FOR REGISTRATION UNDER SSI [20-11-2024(online)].pdf | 20/11/2024
202441090277-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-11-2024(online)].pdf | 20/11/2024
202441090277-FORM 1 [20-11-2024(online)].pdf | 20/11/2024
202441090277-FORM 18 [20-11-2024(online)].pdf | 20/11/2024
202441090277-FORM FOR SMALL ENTITY(FORM-28) [20-11-2024(online)].pdf | 20/11/2024
202441090277-FORM-9 [20-11-2024(online)].pdf | 20/11/2024
202441090277-POWER OF AUTHORITY [20-11-2024(online)].pdf | 20/11/2024
202441090277-REQUEST FOR EARLY PUBLICATION(FORM-9) [20-11-2024(online)].pdf | 20/11/2024
202441090277-REQUEST FOR EXAMINATION (FORM-18) [20-11-2024(online)].pdf | 20/11/2024
