AUTONOMOUS NIGHT PATROLLING ROBOT FOR CRIME MONITORING AND LIVE STREAMING SYSTEM AND METHOD EMPLOYED THEREOF
ORDINARY APPLICATION
Published
Filed on 14 November 2024
Abstract
Exemplary embodiments of the present disclosure are directed towards an autonomous night patrolling robot for crime monitoring and live streaming systems and the method employed thereof. The system comprises a 2D LiDAR sensor positioned at the top of the robot. Equipped with the integration of SLAM techniques with the 2D LiDAR sensor, the robot effectively navigates confined industrial spaces, overcoming GPS limitations. Integrated with ROS2, the system intelligently navigates, avoids obstacles, and optimizes routes. The Smart Localizer transforms industrial operations by integrating cutting-edge technology, enhancing efficiency and safety. The system provides an ability to set multiple navigation targets, Goal Pose 1, Goal Pose 2, and Goal Pose 3, and to guide the robot to reach these destinations; it also categorizes dangerous regions based on severity and sends real-time alert notifications to relevant emergency service providers, including police stations and hospitals, with precise location details during an incident. Fig. 1A-Fig. 1F.
Patent Information
Application ID | 202441088188 |
Invention Field | ELECTRONICS |
Date of Application | 14/11/2024 |
Publication Number | 47/2024 |
Inventors
Name | Address | Country | Nationality |
---|---|---|---|
Dr B Sivaramakrishna | Associate Professor, Department of Computer Science and Engineering, Lakireddy Bali Reddy College of Engineering (A), Mylavaram, NTR Dt, Andhra Pradesh, 521230 | India | India |
Mr. G. V. Suresh | Associate Professor, Department of Computer Science and Engineering, Lakireddy Bali Reddy College of Engineering (A), Mylavaram, NTR Dt, Andhra Pradesh, 521230 | India | India |
Bandaru Gracy | Student, Department of Computer Science and Engineering, Lakireddy Bali Reddy College of Engineering (A), Mylavaram, NTR Dt, Andhra Pradesh, 521230 | India | India |
Dr. D. Veeraiah | Professor, Department of Computer Science and Engineering, Lakireddy Bali Reddy College of Engineering (A), Mylavaram, NTR Dt, Andhra Pradesh, 521230 | India | India |
Dr. K. Devi Priya | Associate Professor, Department of Computer Science and Engineering, Lakireddy Bali Reddy College of Engineering (A), Mylavaram, NTR Dt, Andhra Pradesh, 521230 | India | India |
Dr. D. Venkata Subbaiah | Professor, Department of Computer Science and Engineering, Lakireddy Bali Reddy College of Engineering (A), Mylavaram, NTR Dt, Andhra Pradesh, 521230 | India | India |
Sunkara Joshith Sai Ram | Student, Department of Computer Science and Engineering, Lakireddy Bali Reddy College of Engineering (A), Mylavaram, NTR Dt, Andhra Pradesh, 521230 | India | India |
Applicants
Name | Address | Country | Nationality |
---|---|---|---|
LAKIREDDY BALI REDDY COLLEGE OF ENGINEERING | L.B.Reddy Nagar, Mylavaram - 521230, Andhra Pradesh, India. | India | India |
Specification
Description: TECHNICAL FIELD
[001] The present disclosed subject matter relates to a patrolling robot with significant potential in the field of security. More particularly, the present disclosure relates to an autonomous night patrolling robot for crime monitoring and a live streaming system and method employed thereof.
BACKGROUND
[002] In both rural and urban environments, ensuring security and safety during nighttime hours poses a significant challenge. Traditional surveillance methods typically rely on human intervention, which can be inefficient, costly, and sometimes hazardous. There is an urgent need for an autonomous system that can effectively patrol designated areas, detect suspicious activities or weapons, and swiftly relay critical information to law enforcement authorities.
[003] The Robot session is an IoT-based patrolling robot with great potential in the field of security. Designed to serve as a security guard in various locations, including offices, homes, and warehouses, it is equipped with a camera to capture images and videos of its surroundings. Operated via an Arduino board, the robot can navigate using different communication methods such as Bluetooth, command-line inputs, hand gestures, and Wi-Fi. The live recordings captured by the robot's camera are stored on a server, which users can access remotely from any device with server access.
[004] The system also integrates gas sensors to detect hazardous gases. If poisonous gases are present and their concentration exceeds a certain threshold, the system evaluates the level of threat. In case of a gas leak, it identifies danger zones based on wind direction and speed. The system promptly broadcasts an alert message to mobile devices in the affected area and activates alarms, including flashing lights, to warn the population. The robot operates within a predefined map and requires a constant internet connection. Existing systems, however, face limitations in navigating from one location to another while avoiding obstacles.
[005] In light of the aforementioned discussion, there exists a need for an autonomous night patrolling robot for crime monitoring and a live streaming system and method that would overcome or ameliorate the above-mentioned limitations.
SUMMARY
[006] The following presents a simplified summary of the disclosure in order to provide a basic understanding of the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
[007] Exemplary embodiments of the present disclosure are directed towards an autonomous night patrolling robot for crime monitoring, a live streaming system, and the method employed thereof.
[008] An objective of the present disclosure is directed towards providing an autonomous night patrolling robot equipped with SLAM and 2D LiDAR sensors, capable of effectively navigating confined industrial spaces. Enhanced self-localization ensures precise mapping and navigation, while autonomous operation boosts efficiency and safety.
[009] Another objective of the present disclosure is directed toward enhancing efficiency and safety through autonomous operation, eliminating the need for human intervention. Integrated with ROS2, the robot intelligently navigates, avoids obstacles, and optimizes routes.
[0010] Another objective of the present disclosure is directed toward addressing current challenges in industrial robotics, advancing automation, and promising increased efficiency and safety.
[0011] Another objective of the present disclosure is directed towards integrating the system with ROS2 for intelligent navigation, obstacle avoidance, and route optimization, revolutionizing industrial operations through advanced technology.
[0012] Another objective of the present disclosure is for the Smart Localizer to transform industrial operations by incorporating advanced technology, ensuring enhanced efficiency, safety, and paving the way for further advancements in autonomous industrial robotics.
[0013] Another objective of the present disclosure is directed towards providing a reliable and effective solution for night patrolling.
[0014] Another objective of the present disclosure is directed towards reducing the manpower required for night patrolling and effectively lowering crime rates.
[0015] Another objective of the present disclosure is directed towards revolutionizing night patrolling by detecting suspicious activities and weapons and relaying information to the nearest police station.
[0016] According to an exemplary embodiment of the present disclosure, the Autonomous night patrolling robot for crime monitoring and a live streaming system includes a 2D LiDAR sensor positioned at the top of the robot, whereby the 2D LiDAR sensor is configured to detect suspicious activities such as weapon detection and suspicious sounds, enhancing security.
[0017] According to an exemplary embodiment of the present disclosure, the Autonomous night patrolling robot for crime monitoring and a live streaming system further includes at least one laser emitter configured to measure distances and create a detailed 3D map of the surroundings.
[0018] According to an exemplary embodiment of the present disclosure, the Autonomous night patrolling robot for crime monitoring and a live streaming system further includes an integration of Simultaneous Localization and Mapping (SLAM) techniques with the 2D LiDAR sensor, enabling effective navigation in confined industrial spaces, overcoming GPS limitations, and ensuring precise mapping and self-localization.
[0019] According to an exemplary embodiment of the present disclosure, the Autonomous night patrolling robot for crime monitoring and a live streaming system further includes a seamless integration with Robot Operating System 2 (ROS2) for intelligent navigation, real-time decision-making, and obstacle avoidance, optimizing routes and revolutionizing industrial operations with advanced technology.
[0020] According to an exemplary embodiment of the present disclosure, the Autonomous night patrolling robot for crime monitoring and a live streaming system further includes a 3D representation of the robot's environment in Gazebo, displaying obstacles as circular shapes and showcasing the practical application of localization data to determine the robot's position. The system is configured to use sensor data and localization algorithms to guide the robot along a safe path, effectively maneuvering in real-world scenarios. This visualization highlights the role of localization and obstacle avoidance in autonomous robot navigation, critical for tasks such as exploration and logistics.
[0021] According to an exemplary embodiment of the present disclosure, the Autonomous night patrolling robot for crime monitoring and a live streaming system further includes an ability to set multiple navigation targets, such as Goal Pose 1, Goal Pose 2, and Goal Pose 3, and to guide the robot to reach these destinations; the system also categorizes dangerous regions based on severity and sends real-time alert notifications to relevant emergency service providers, including police stations and hospitals, with precise location details during an incident.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] In the following, numerous specific details are set forth to provide a thorough description of various embodiments. Certain embodiments may be practiced without these specific details or with some variations in detail. In some instances, certain features are described in less detail so as not to obscure other aspects. The level of detail associated with each of the elements or features should not be construed to qualify the novelty or importance of one feature over the others.
[0023] FIG. 1A is a diagram depicting a schematic representation of Smart Localizer, showing the structural configuration of the robot, according to exemplary embodiments of the present disclosure.
[0024] FIG. 1B is a diagram depicting a 3D representation of the robot's environment in Gazebo, according to exemplary embodiments of the present disclosure.
[0025] FIG. 1C is a diagram depicting a generated map using a Slam Toolbox, according to exemplary embodiments of the present disclosure.
[0026] FIG. 1D - FIG. 1F is a diagram depicting an ability to set multiple navigation targets, such as Goal Pose 1, Goal Pose 2, and Goal Pose 3, and guide the robot to reach these destinations, according to exemplary embodiments of the present disclosure.
[0027] FIG. 2 is a flow diagram depicting a method for crime monitoring and live streaming using an autonomous night patrolling robot, according to exemplary embodiments of the present disclosure.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0028] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
[0029] The use of "including", "comprising" or "having" and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms "a" and "an" herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms "first", "second", and "third", and so forth, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
[0030] Referring to FIG. 1A is a diagram 100a, depicting a schematic representation of the Smart Localizer, showing the structural configuration of the robot, according to exemplary embodiments of the present disclosure. The robot 102 comprises a 2D LiDAR sensor 104 and robot wheels 106a-106b. The 2D LiDAR sensor 104 is positioned at the top of the robot 102 and serves as a critical component for environment perception. The 2D LiDAR sensor 104 is configured to detect suspicious activities such as weapon detection and suspicious sounds, enhancing security. A laser emitter (not shown in FIG. 1A) is configured to measure distances and create a detailed 3D map of the surroundings. The localizer's components, including processors, sensors, and communication modules, seamlessly integrate into the robot's structure 102. Using advanced algorithms, the LiDAR data allows the robot to accurately localize itself within its environment, enabling precise navigation and effective obstacle avoidance. This integration of LiDAR technology enables dynamic, real-time navigation, ensuring the robot's adaptability and safety in complex environments.
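The specification does not include source code; as an illustrative aid only, the following minimal pure-Python sketch shows how 2D LiDAR range readings of the kind produced by sensor 104 can be converted into Cartesian obstacle points in the robot frame. The function name, beam parameters, and range filter are assumptions for illustration, not part of the disclosed system.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, max_range=10.0):
    """Convert a 2D LiDAR scan (a list of range readings) into Cartesian
    (x, y) points in the robot frame, dropping invalid or out-of-range
    returns (r <= 0 means no echo for this beam)."""
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r <= max_range:
            theta = angle_min + i * angle_increment
            points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# A 4-beam scan sweeping 0..270 degrees in 90-degree steps.
pts = scan_to_points([1.0, 2.0, 0.0, 1.5],
                     angle_min=0.0, angle_increment=math.pi / 2)
# Beam 2 returned 0.0 (no echo) and is dropped, leaving 3 points.
```

Points in this form are what a localization or obstacle-avoidance layer would consume downstream.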
[0031] Referring to FIG. 1B is a diagram 100b, depicting a 3D representation of the robot's environment in Gazebo, according to exemplary embodiments of the present disclosure. The 3D representation of the robot's environment in Gazebo, displaying obstacles as circular shapes 108a-108b, showcases the practical application of localization data to determine the robot's position. The system is configured to use sensor data and localization algorithms to guide the robot 102 along a safe path, effectively maneuvering in real-world scenarios. This visualization highlights the role of localization and obstacle avoidance in autonomous robot navigation, critical for tasks such as exploration and logistics.
[0032] According to an exemplary embodiment of the present disclosure, a map is generated using a Simultaneous Localization and Mapping (SLAM) Toolbox 100c (as shown in FIG. 1C). The integration of SLAM techniques 100c with the 2D LiDAR sensor 104 enables effective navigation in confined industrial spaces 110a-110g, overcoming GPS limitations and ensuring precise mapping and self-localization. The LiDAR technology 104 allows the robot to navigate dynamically in real time, enhancing adaptability and safety in complex environments while improving self-localization for accurate mapping and navigation.
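To make the mapping step above concrete, here is a toy stand-in for one map-update cycle of a SLAM pipeline: rasterizing robot-frame obstacle points into a small occupancy grid centred on the robot. The grid size, resolution, and function name are illustrative assumptions; a real system would use the SLAM Toolbox's probabilistic map update rather than this sketch.

```python
def points_to_grid(points, resolution=0.5, size=8):
    """Rasterize robot-frame obstacle points (metres) into a size x size
    occupancy grid centred on the robot; 1 = occupied, 0 = free/unknown."""
    grid = [[0] * size for _ in range(size)]
    origin = size // 2  # robot sits at the grid centre
    for x, y in points:
        gx = origin + int(round(x / resolution))
        gy = origin + int(round(y / resolution))
        if 0 <= gx < size and 0 <= gy < size:  # ignore points off-grid
            grid[gy][gx] = 1
    return grid

# Two obstacles: 1 m ahead and 1 m to the robot's right.
g = points_to_grid([(1.0, 0.0), (0.0, -1.0)])
```

Repeating this update as the robot moves, and fusing it with odometry, is the essence of the occupancy-grid mapping that the SLAM Toolbox performs at scale.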
[0033] Referring to FIG. 1D - FIG. 1F are diagrams 100d-100f, depicting an ability to set multiple navigation targets, such as Goal Pose 1, Goal Pose 2, and Goal Pose 3, and to guide the robot to reach these destinations, according to exemplary embodiments of the present disclosure. The system provides an ability to set multiple navigation targets, such as Goal Pose 1 (112), Goal Pose 2 (114), and Goal Pose 3 (116), and guides the robot 102 to successfully reach its assigned destinations, including Goal Pose 1 (112a), Goal Pose 2 (114a), and Goal Pose 3 (116a). Seamless integration with Robot Operating System 2 (ROS2) provides intelligent navigation, real-time decision-making, and obstacle avoidance, optimizing routes and revolutionizing industrial operations with advanced technology. The system also categorizes dangerous regions based on severity and sends real-time alert notifications to relevant emergency service providers, including police stations, hospitals, and rescue teams, with precise location details during an incident, and triggers alarms with visual flashing lights and auditory signals in affected areas.
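The multi-goal patrol and severity-based alerting described above can be sketched as a small dispatcher. The severity thresholds, recipient names, and message format below are illustrative assumptions, not values taken from the specification; in a deployed system the goals would be ROS2 `NavigateToPose` actions and the alerts would go over a real notification channel.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class PatrolPlanner:
    """Toy dispatcher: visits queued goal poses in order and routes alerts
    to emergency services based on an assumed severity scale (0-10)."""
    goals: List[Tuple[float, float]]
    visited: List[Tuple[float, float]] = field(default_factory=list)

    def next_goal(self) -> Optional[Tuple[float, float]]:
        """Pop and record the next goal pose, or None when the patrol is done."""
        if self.goals:
            goal = self.goals.pop(0)
            self.visited.append(goal)
            return goal
        return None

    @staticmethod
    def route_alert(severity: int, location: Tuple[float, float]) -> str:
        # Assumed mapping: high severity -> police and hospital,
        # medium -> police only, low -> local log entry.
        if severity >= 8:
            return f"ALERT police,hospital @ {location}"
        if severity >= 4:
            return f"ALERT police @ {location}"
        return f"LOG @ {location}"

# Patrol through three goal poses; raise a high-severity alert at the first.
planner = PatrolPlanner(goals=[(1.0, 2.0), (3.0, 4.0), (5.0, 6.0)])
first = planner.next_goal()
msg = PatrolPlanner.route_alert(9, first)
```

The location tuple passed to `route_alert` is what carries the "precise location details" to the responding services.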
[0034] The robot 102, currently limited in its ability to navigate between locations while overcoming obstacles, uses a centralized network for communication and data storage, and the device incorporates point-to-point navigation to reach destinations without deviation. The smart localizer enhances night patrolling by detecting suspicious activities and weapons and relaying this information to the nearest police station. The precise mapping and navigation enable the robot to operate autonomously, improving efficiency and safety in industrial environments.
[0035] Referring to FIG. 2 is a flow diagram 200, depicting a method for crime monitoring and live streaming using an autonomous night patrolling robot, according to exemplary embodiments of the present disclosure. The method starts at step 202 by integrating Simultaneous Localization and Mapping (SLAM) techniques using multiple 2D LiDAR sensors for navigation in confined spaces. The method continues to the next step 204 by utilizing laser pulses and odometry data from the 2D LiDAR sensors to enable accurate mapping and navigation through the SLAM toolbox. The method continues to the next step 206 by generating a comprehensive 3D map of the environment to aid in precise robot navigation and situational awareness. The method continues to the next step 208 by setting multiple navigation targets, such as Goal Pose 1, Goal Pose 2, and Goal Pose 3, to guide the robot toward specific destinations.
[0036] As shown in FIG. 2, the method continues to the next step 210 by guiding the robot to successfully reach its assigned destinations, including Goal Pose 1, Goal Pose 2, and Goal Pose 3. The method continues to the next step 212 by improving the robot's self-localization capabilities to ensure accurate mapping and reliable navigation within its environment. The method continues to the next step 214 by enabling the robot to autonomously identify suspicious behaviors or weapons through advanced detection systems. The method continues to the next step 216 by incorporating cutting-edge technologies within ROS2 to optimize intelligent navigation, obstacle avoidance, and route planning. The method continues to the next step 218 by sending alert notifications to relevant emergency service providers, such as police stations and hospitals, with precise location details during an incident.
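The sequence of steps 202-218 can be sketched as a single patrol cycle. The function below is a schematic skeleton only: `detector` is a placeholder callable standing in for the disclosed detection subsystem, and the event strings are illustrative, not part of the specification.

```python
def run_patrol_cycle(scan, goals, detector):
    """Sketch of one patrol cycle following steps 202-218: update the map
    from the latest scan, visit each goal pose in order, run threat
    detection, and emit an alert event if a threat was found."""
    events = []
    events.append("map_updated" if scan else "no_scan")  # steps 202-206
    for goal in goals:                                   # steps 208-210
        events.append(f"reached {goal}")
    threat = detector(scan)                              # steps 212-216
    if threat:                                           # step 218
        events.append(f"alert sent: {threat}")
    return events

# One cycle with a dummy scan and a trivial stand-in detector.
log = run_patrol_cycle(
    scan=[1.0, 2.0],
    goals=["Goal Pose 1", "Goal Pose 2", "Goal Pose 3"],
    detector=lambda s: "suspicious activity" if s else None,
)
```

In a real deployment each stage would be a ROS2 node exchanging messages rather than a direct function call.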
[0037] Reference throughout this specification to "one embodiment", "an embodiment", or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases "in one embodiment", "in an embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0038] Furthermore, the described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the above description, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments of the disclosure.
[0039] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles and spirit of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.
[0040] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.
Claims:
We Claim:
1. Autonomous night patrolling robot for crime monitoring and a live streaming system, comprising:
a 2D LiDAR sensor positioned at the top of the robot, whereby the 2D LiDAR sensor is configured to detect suspicious activities such as weapon detection and suspicious sounds, enhancing security;
at least one laser emitter configured to measure distances and create a detailed 3D map of the surroundings;
an integration of Simultaneous Localization and Mapping (SLAM) techniques with the 2D LiDAR sensor, enabling effective navigation in confined industrial spaces, overcoming GPS limitations, and ensuring precise mapping and self-localization;
a seamless integration with Robot Operating System 2 (ROS2) for intelligent navigation, real-time decision-making, and obstacle avoidance, optimizing routes and revolutionizing industrial operations with advanced technology;
a 3D representation of the robot's environment in Gazebo, displaying obstacles as circular shapes, showcasing the practical application of localization data to determine the robot's position, whereby the system is configured to use sensor data and localization algorithms to guide the robot along a safe path, effectively maneuvering in real-world scenarios, and this visualization highlights the role of localization and obstacle avoidance in autonomous robot navigation, critical for tasks such as exploration and logistics; and
an ability to set multiple navigation targets, such as Goal Pose 1, Goal Pose 2, and Goal Pose 3, and to guide the robot to reach these destinations, whereby the system also categorizes dangerous regions based on severity and sends real-time alert notifications to relevant emergency service providers, including police stations and hospitals, with precise location details during an incident.
2. The system as claimed in claim 1, wherein the smart localizer's components, including processors, sensors, and communication modules, are seamlessly integrated within the robot's structure.
3. The system as claimed in claim 1, wherein the cutting-edge technologies within ROS2 are integrated for intelligent navigation, obstacle avoidance, and route optimization.
4. The system as claimed in claim 1, wherein the integration of LiDAR technology allows the robot to navigate dynamically in real-time, ensuring adaptability and safety in complex environments, and it enhances self-localization for precise mapping and navigation.
5. The system as claimed in claim 1, wherein the robot, currently limited in its ability to navigate between locations while overcoming obstacles, uses a centralized network for communication and data storage, and the device incorporates point-to-point navigation to reach destinations without deviation.
6. The system as claimed in claim 1, wherein the smart localizer enhances night patrolling by detecting suspicious activities and weapons and relaying this information to the nearest police station.
7. The system as claimed in claim 1, wherein precise mapping and navigation enable the robot to operate autonomously, improving efficiency and safety in industrial environments.
8. A method for crime monitoring and live streaming using an autonomous night patrolling robot, comprising:
integrating Simultaneous Localization and Mapping (SLAM) techniques using multiple 2D LiDAR sensors for navigation in confined spaces;
utilizing laser pulses and odometry data from the 2D LiDAR sensors to enable accurate mapping and navigation through the SLAM toolbox;
generating a comprehensive 3D map of the environment to aid in precise robot navigation and situational awareness;
setting multiple navigation targets, such as Goal Pose 1, Goal Pose 2, and Goal Pose 3, to guide the robot toward specific destinations;
guiding the robot to successfully reach its assigned destinations, including Goal Pose 1, Goal Pose 2, and Goal Pose 3;
improving the robot's self-localization capabilities to ensure accurate mapping and reliable navigation within its environment;
enabling the robot to autonomously identify suspicious behaviors or weapons through advanced detection systems;
incorporating cutting-edge technologies within ROS2 to optimize intelligent navigation, obstacle avoidance, and route planning; and
sending alert notifications to relevant emergency service providers, such as police stations and hospitals, with precise location details during an incident.
Documents
Name | Date |
---|---|
202441088188-FORM 18 [18-11-2024(online)].pdf | 18/11/2024 |
202441088188-COMPLETE SPECIFICATION [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-DECLARATION OF INVENTORSHIP (FORM 5) [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-DRAWINGS [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-EDUCATIONAL INSTITUTION(S) [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-FORM 1 [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-FORM FOR SMALL ENTITY [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-FORM FOR SMALL ENTITY(FORM-28) [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-FORM-9 [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-POWER OF AUTHORITY [14-11-2024(online)].pdf | 14/11/2024 |
202441088188-REQUEST FOR EARLY PUBLICATION(FORM-9) [14-11-2024(online)].pdf | 14/11/2024 |