Advanced Blind Spot Detection System for Enhanced Vehicle Safety in Assisted and Autonomous Driving
ORDINARY APPLICATION
Published
Filed on 26 October 2024
Abstract
Existing blind spot detection systems primarily focus on assisting ego vehicles during lane changes by identifying objects in their blind spots; however, they fail to address the risk posed when the ego vehicle is in the blind spot of other vehicles. Similarly, current free-space detection systems do not incorporate the blind spot regions of adjacent vehicles as part of their inputs, thereby limiting the overall safety of driving systems. The proposed method introduces a novel approach that uses sensor fusion of camera and radar inputs to detect and identify vehicles in front of the ego vehicle while also estimating the blind spot regions of these vehicles. This process involves the identification of vehicle mirrors, or the approximation of mirror positions through the detection of A/B pillars, using camera inputs. The radar-generated distance map assists in this calculation by localizing blind spots relative to the position of each vehicle. Based on the type of detected vehicle, car or truck, the system calculates the blind spot regions: on both sides for cars, and additionally at the rear for trucks with trailers. This blind spot data can be utilized in both assisted and autonomous driving systems. In assisted driving, the system alerts the driver when the ego vehicle is within another vehicle's blind spot, thereby preventing hazardous situations. In autonomous driving, it enhances decision-making by dynamically positioning the ego vehicle outside the blind spots of other vehicles, ensuring safer navigation. The system also aids in lane change decisions, improving overall road safety in both driving modes. The proposed solution aims to significantly reduce the risks associated with blind spot incidents, thereby contributing to safer roadways and a more robust assisted and autonomous driving ecosystem.
Patent Information
Field | Value |
---|---|
Application ID | 202441081729 |
Invention Field | ELECTRONICS |
Date of Application | 26/10/2024 |
Publication Number | 44/2024 |
Inventors
Name | Address | Country | Nationality |
---|---|---|---|
Dr. Adarsh S J | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Dr. Alosh James | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Dr. Sijo M T | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Mr. Nithin V. K. | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Applicants
Name | Address | Country | Nationality |
---|---|---|---|
Dr. Adarsh S J | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Dr. Alosh James | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Dr. Sijo M T | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Mr. Nithin V. K. | Christ College of Engineering, Irinjalakuda, Kerala 680125 | India | India |
Specification
Description: The patent describes a system that enhances blind spot detection by considering not only the blind spot of the Ego vehicle (the vehicle in which the system is installed) but also the blind spot regions of other vehicles in the surrounding environment. The major components of the proposed system are as follows:
1. Input Sensors:
Camera: The primary sensor capturing images of vehicles in front of the Ego vehicle.
Radar: Provides distance and positional data of detected vehicles, enhancing depth perception and helping build a 3D representation of the scene.
Sensor Fusion: Camera and radar inputs are fused to create a 3D model of the environment, allowing for accurate detection and localization of vehicles, their mirrors, and blind spot regions.
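The specification does not disclose a particular fusion algorithm. The following is a minimal sketch of one plausible camera/radar association step, assuming a calibrated 3x4 projection matrix and detection dictionaries with a pixel bounding box; the helper names and data layout are assumptions for illustration, not the patented method.

```python
import numpy as np

def project_radar_to_image(radar_points_xyz, projection_matrix):
    """Project 3D radar returns (N x 3, vehicle frame) into image pixels."""
    homog = np.hstack([radar_points_xyz, np.ones((len(radar_points_xyz), 1))])
    pixels = (projection_matrix @ homog.T).T        # N x 3 homogeneous pixel coords
    return pixels[:, :2] / pixels[:, 2:3]           # normalize to (u, v)

def attach_range_to_detections(detections, radar_points_xyz, projection_matrix):
    """Assign each camera bounding box the range of the nearest radar return
    falling inside it, giving a coarse 3D localization per detected vehicle."""
    pixels = project_radar_to_image(radar_points_xyz, projection_matrix)
    ranges = np.linalg.norm(radar_points_xyz, axis=1)
    fused = []
    for det in detections:                          # det: dict with 'box' = (u1, v1, u2, v2)
        u1, v1, u2, v2 = det["box"]
        inside = (pixels[:, 0] >= u1) & (pixels[:, 0] <= u2) & \
                 (pixels[:, 1] >= v1) & (pixels[:, 1] <= v2)
        det = dict(det, range_m=float(ranges[inside].min()) if inside.any() else None)
        fused.append(det)
    return fused
```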
2. Object Detection Module:
Vehicle Detection: A deep learning model trained to detect vehicles (cars and trucks) in front of the Ego vehicle.
Vehicle Localization: Utilizes radar data to estimate the distance and position of the detected vehicles on the road.
3D Scene Modeling: Fusing camera and radar data to represent the environment, estimate free space, and determine the position of objects around the Ego vehicle.
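The description names a deep learning detector but not a specific architecture. The sketch below uses a generic pretrained detector (torchvision's Faster R-CNN with the COCO label map) as an assumed stand-in, keeping only car and truck detections in the format used by the fusion sketch above.

```python
import torch
import torchvision

# COCO class indices used by the pretrained detector (assumption: COCO label map).
CAR, TRUCK = 3, 8

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def detect_vehicles(image_tensor, score_thr=0.5):
    """Return car/truck detections as dicts with a pixel box and a class name."""
    with torch.no_grad():
        out = model([image_tensor])[0]
    detections = []
    for box, label, score in zip(out["boxes"], out["labels"], out["scores"]):
        if score < score_thr or label.item() not in (CAR, TRUCK):
            continue
        detections.append({
            "box": tuple(box.tolist()),
            "kind": "car" if label.item() == CAR else "truck",
            "score": float(score),
        })
    return detections
```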
3. Mirror and Pillar Detection:
Wing Mirror Detection: Once a vehicle is detected, machine learning or deep learning techniques are used to identify the positions of wing mirrors, which are crucial for determining the blind spot regions.
A/B Pillar Detection: Detecting vehicle pillars (A and B pillars) helps approximate mirror locations, especially in cases where the mirrors are not directly visible.
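When the wing mirrors are not directly visible, the description falls back on the A/B pillars to approximate mirror locations. A small geometric sketch of that fallback is shown below; the lateral and vertical offsets are illustrative assumptions, not values from the patent.

```python
def approximate_mirror_positions(a_pillar_left, a_pillar_right,
                                 lateral_offset_m=0.20, drop_m=0.10):
    """Approximate wing-mirror positions (x, y, z in metres, scene frame)
    from detected A-pillar base points when the mirrors are occluded."""
    lx, ly, lz = a_pillar_left
    rx, ry, rz = a_pillar_right
    left_mirror = (lx, ly - lateral_offset_m, lz - drop_m)   # pushed outward, slightly lower
    right_mirror = (rx, ry + lateral_offset_m, rz - drop_m)
    return left_mirror, right_mirror
```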
4. Blind Spot Region Estimation:
3D Blind Spot Estimation: Based on the detected mirror and A/B pillar positions, the blind spot regions of the other vehicles (left, right, and rear for trucks) are estimated in the 3D world, as shown in figure 2.
Schematic Depiction: Figures are used to illustrate how the blind spot regions are mapped out using the inputs from the sensors and detected mirror/pillar positions.
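The blind spot regions themselves are only depicted schematically in the figures. One way to represent them is as ground-plane polygons anchored at the estimated mirror positions, as sketched below; the region dimensions and the axis convention (+x forward, +y to the observed vehicle's right) are assumptions for illustration.

```python
def blind_spot_polygon(mirror_xy, side, length_m=5.0, width_m=3.0):
    """Return a rectangular ground-plane blind spot region (list of (x, y) corners)
    extending rearward and outward from a mirror position.
    side: 'left' or 'right' relative to the observed vehicle's heading."""
    mx, my = mirror_xy
    sign = -1.0 if side == "left" else 1.0
    return [
        (mx, my),
        (mx - length_m, my),
        (mx - length_m, my + sign * width_m),
        (mx, my + sign * width_m),
    ]
```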
5. Decision Making Module:
Driver Alerts: The system can alert the Ego vehicle's driver if they are within the blind spot region of another vehicle, helping to prevent accidents.
Lane Changing Assistance: Using vehicle-to-vehicle (V2V) communication, the system can inform adjacent vehicles when the Ego vehicle is present in their blind spot, making lane changes safer.
Free Space Detection: The system enhances free space detection by providing additional information on areas that are outside of blind spot regions of surrounding vehicles.
Autonomous Driving Positioning: In autonomous driving scenarios, the Ego vehicle can adjust its position to avoid being in the blind spot of other vehicles, reducing the risk of collisions.
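The driver-alert path reduces to checking whether the Ego vehicle's ground-plane position lies inside any estimated blind spot polygon. A minimal sketch of that check, assuming the polygon representation above and a shared 2D ground-plane frame, is shown below.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is a 2D point inside a simple polygon?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def check_blind_spot_alert(ego_xy, blind_spot_polygons):
    """Alert the driver if the Ego vehicle lies inside any estimated blind spot."""
    return any(point_in_polygon(ego_xy, poly) for poly in blind_spot_polygons)
```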
6. Blind Spot Information Processing:
Blind Spot Calculation: The system calculates blind spots based on the type of vehicle (car or truck), taking into account the left, right, and rear sides of trucks as shown in figure 3.
Integration in Assisted and Autonomous Driving: The blind spot data is used for decision-making in both assisted driving (driver alerting and lane change assistance) and autonomous driving (self-positioning outside blind spots).
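The per-vehicle-type rule (left and right sides for cars, plus the rear for trucks with trailers) can be expressed as a small dispatch on the detected class. The sketch below reuses the blind_spot_polygon helper from the earlier sketch; the field names and the rear-region dimensions are illustrative assumptions.

```python
def blind_spot_regions(vehicle):
    """Build all estimated blind spot polygons for one detected vehicle.
    vehicle: dict with 'kind' ('car' or 'truck'), 'mirrors' = (left_xy, right_xy),
    and 'rear_xy' for trucks (assumed field names)."""
    left_xy, right_xy = vehicle["mirrors"]
    regions = [
        blind_spot_polygon(left_xy, side="left"),
        blind_spot_polygon(right_xy, side="right"),
    ]
    if vehicle["kind"] == "truck":              # trucks/trailers also have a rear blind spot
        rx, ry = vehicle["rear_xy"]
        regions.append([(rx, ry - 1.5), (rx - 8.0, ry - 1.5),
                        (rx - 8.0, ry + 1.5), (rx, ry + 1.5)])
    return regions
```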
This approach provides a more comprehensive method for blind spot detection, helping prevent accidents by considering the blind spots of both the Ego vehicle and surrounding vehicles.
Claims:
1. A method for detecting blind spots of adjacent vehicles, comprising sensor inputs from a camera and radar, wherein the camera detects and identifies vehicles in front of an ego vehicle and the radar provides distance and position information, thereby enabling the estimation of the blind spot regions of the detected vehicles.
2. The camera and radar data are fused to create a three-dimensional (3D) representation of the environment in front of the ego vehicle, which includes the identification of both free-space and blind-spot regions of adjacent vehicles.
3. The method further comprising detecting the positions of the mirrors of the identified vehicles, or approximating them from the positions of the A/B pillars, using the camera, wherein the mirror or pillar positions are used to estimate the blind spot regions in the 3D representation.
4. The blind spot regions are estimated based on the type of vehicle identified: for cars, the blind spot regions are estimated on both the left and right sides, and for trucks or trailers, the blind spot regions include the rear side in addition to the left and right sides.
5. A system for assisting lane changes and driving safety comprises a decision-making module that processes the blind spot region information from other vehicles and alerts the ego vehicle driver if the vehicle is within the blind spot of adjacent vehicles to prevent collisions.
6. The system wherein the decision-making module utilizes vehicle-to-vehicle (V2V) communication to alert adjacent vehicles if the ego vehicle is present within their blind spot, thereby assisting in safe lane changes.
7. The method wherein the free-space detection system is enhanced by considering the non-blind spot regions of adjacent vehicles, thereby improving the accuracy of detecting drivable space in front of the ego vehicle.
8. In the case of autonomous driving, the system positions the ego vehicle outside the blind spot regions of adjacent vehicles based on the estimated blind spot and free-space information, thereby reducing the risk of accidents.
9. The method further comprises the use of deep learning or machine learning models trained on camera images to accurately detect the position of vehicle mirrors or A/B pillars, thereby enhancing the precision of blind spot region estimation.
10. A vehicle safety system capable of continuously monitoring and updating the positions of adjacent vehicles and their blind spot regions in real time, providing ongoing feedback to both assisted and autonomous driving systems.
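Claims 6 and 10 describe V2V alerting and continuous real-time updating. A compact sketch of such a monitoring loop is shown below; every interface object and method name here is an assumption used only to illustrate how the pieces could be wired together.

```python
import time

def monitoring_loop(sensors, perception, v2v, driver_hmi, ego_state, period_s=0.1):
    """Continuously re-estimate adjacent vehicles' blind spots and feed the
    results to the driver alert, V2V, and planning interfaces (all hypothetical)."""
    while True:
        frame, radar = sensors.read()                      # synchronized camera + radar sample
        vehicles = perception.detect_and_localize(frame, radar)
        for v in vehicles:
            v["blind_spots"] = perception.estimate_blind_spots(v)
        if any(perception.ego_inside(ego_state.position, v["blind_spots"])
               for v in vehicles):
            driver_hmi.alert("Ego vehicle is in an adjacent vehicle's blind spot")
            v2v.broadcast({"type": "blind_spot_presence", "ego_id": ego_state.vehicle_id})
        time.sleep(period_s)
```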
Documents
Name | Date |
---|---|
202441081729-COMPLETE SPECIFICATION [26-10-2024(online)].pdf | 26/10/2024 |
202441081729-DRAWINGS [26-10-2024(online)].pdf | 26/10/2024 |
202441081729-FORM 1 [26-10-2024(online)].pdf | 26/10/2024 |