OVER THE AIR SURVEILLANCE AND SECURITY SYSTEM AND METHOD THEREOF
ORDINARY APPLICATION
Published
Filed on 23 November 2024
Abstract
OVER THE AIR SURVEILLANCE AND SECURITY SYSTEM AND METHOD THEREOF
ABSTRACT
An over the air surveillance and security system (100) is disclosed. The system (100) comprises an unmanned air vehicle (102) having propellers (106a-106d) driven using motors (108a-108d), and a camera (112) adapted to capture visuals of a person of interest while on surveillance. A control unit (118) is configured to: receive an SOS data packet from a user device (124), wherein the SOS data packet comprises a destination location of the person of interest; map a route from a source location to the destination location; activate an electronic speed controller (110) to actuate the propellers (106a-106d); actuate a flight controller (120) to initiate a flight of the unmanned air vehicle (102) on the mapped route; and enable the camera (112) for capturing/recording of the person of interest at the destination location.
Claims: 10, Figures: 9
Patent Information
Application ID | 202441091303 |
Invention Field | ELECTRONICS |
Date of Application | 23/11/2024 |
Publication Number | 48/2024 |
Inventors
Name | Address | Country | Nationality |
---|---|---|---|
Dr. Rangarao Orugu | 6-28A, Musunuru (Post & Mandal) Eluru District, Andhra Pradesh. 52120 | India | India |
Sai Suraj Sarakanam | Near Bus Stand, Chinna Hirajana Peta, Attili, Attili Mandal, 534134, Andhra Pradesh | India | India |
Hari Teja Somarouthu | 8-213/2, Beside Road Of Bank Of Baroda, Angalakuduru, Tenali, Guntur, Andhra Pradesh -522211 | India | India |
Seelaboina Sai venkat | 31-25/2, Near B.C Community Hall, Cherukuwada, Penugonda, West Godavari, Andhra Pradesh - 534 320 | India | India |
Talabathula V V S S S Jaya Surya | 14-1-42, Vangalavari Street, Near Kotagummam Centre, Pithapuram, East Godavari- 533450 | India | India |
Vinoothna Netala | Ashok Towers, 70-2-148/A/C4 Road Number 2, Ramanayyapeta, East Godavari Kakinada, Andhra Pradesh 533005 | India | India |
Ragolu Pallavi | 16-257/A, Near Girls High School Rajula Colony, Jangareddygudem, Eluru, Andhra Pradesh -534447 | India | India |
Ramireddy Manikanta Reddy | 38/175-30, Sri Janardhan Sai Nagar, Ramanjaneyapuram, Near Hanuman Temple And Kia Motors Showroom, Kadapa, Andhra Pradesh - 516002 | India | India |
Sripathi Sitha Rama Swamy | 6-152/1A, Near Sai Baba Temple, Dwarakatirumala, Andhra Pradesh - 534426 | India | India |
Pathiwada Rohith Kumar | 4-35/2, Gollalakoderu, Near Ramalayam, Andhra Pradesh - 534202 | India | India |
Potta J N V V S M Vinay Kumar | 9-92, Near Ramalayam, Mahadevapatnam, Undi Mandal, West Godavari - 534 199 | India | India |
Shaik .Ameena Begam | 12-37, Near Water Tank, Kotthuru, Kamavarapukota Mandal, West Godavari, Andhra Pradesh - 534449 | India | India |
Maddimsetti Narasimha Murthy | 2-33 Lakshmi Narasimha Nagar Mro Office, Korukonda | India | India |
Applicants
Name | Address | Country | Nationality |
---|---|---|---|
Vishnu Institute of Technology | Vishnu Institute of Technology, Vishnupur, Bhimavaram, Andhra Pradesh, India 534202, deanrnd@vishnu.edu.in, 8309117085 | India | India |
Gaganyan Aerospace LLP | 6-28A, Musunuru (post & mandal) Eluru District, Andhra Pradesh. 521207 | India | India |
Specification
BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a drone and particularly to an over the air surveillance and security system.
Description of Related Art
[002] Personal safety, particularly in public spaces, has been a long-standing concern across the globe. Over the years, various personal safety measures have been developed to address this issue. These include mobile safety applications that allow users to send Save Our Soul (SOS) alerts, real-time location sharing, and emergency contact notifications. Additionally, personal safety devices such as alarms, pepper sprays, and wearable safety gadgets have been introduced to empower individuals to protect themselves in threatening situations.
[003] Mobile applications like bSafe, SafeTrek, and Noonlight offer emergency features, enabling users to alert authorities or trusted contacts when they feel unsafe. Similarly, wearable devices like smart jewelry and wristbands equipped with panic buttons and GPS tracking provide discreet ways to seek help during emergencies. Other solutions, such as community safety platforms like Nextdoor's "Neighbors" feature, facilitate the reporting of incidents and request assistance from nearby members.
[004] Moreover, public safety initiatives, including the installation of emergency call boxes and increased police patrols, have been implemented in various cities to enhance safety in public spaces. Despite these efforts, many existing solutions have limitations. Several existing solutions often rely on individual action, which places the burden of safety on the user. Additionally, issues such as limited accessibility, privacy concerns, and inadequate law enforcement integration have hindered the effectiveness of these solutions in fully addressing safety needs of the individuals.
[005] Cultural and social barriers also play a significant role in the effectiveness of safety measures, as societal attitudes towards gender-based violence and stigma can prevent individuals from seeking help. Consequently, there remains a need for more comprehensive and integrated approaches to safety of the individuals that can address both immediate threats and the underlying systemic issues.
[006] There is thus a need for an improved surveillance and security system that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[007] Embodiments in accordance with the present invention provide an over the air surveillance and security system. The system comprises an unmanned air vehicle. The unmanned air vehicle comprises a frame. The frame comprises propellers adapted to create a thrust to lift the unmanned air vehicle. The propellers are driven using motors, such that rotations of the motors are controlled using an electronic speed controller. The frame further comprises a camera adapted to capture visuals of a person of interest while on surveillance. The camera is rotatable at an angle of 90 degrees. The frame further comprises a siren adapted to emit sound waves for alerting the person of interest. The frame further comprises a control unit communicatively connected to the electronic speed controller and the camera. The control unit is configured to: receive a Save Our Soul (SOS) data packet from a user device. The Save Our Soul (SOS) data packet comprises a destination location of the person of interest. The control unit is further configured to: map a route from a source location to the destination location; activate the electronic speed controller to initiate a power delivery to the motors, such that the motors are adapted to actuate the propellers; actuate a flight controller to initiate a flight of the unmanned air vehicle from the source location to the destination location on the mapped route; and enable the camera for capturing the visuals of the person of interest at the destination location.
[008] Embodiments in accordance with the present invention further provide a method for conducting an over the air surveillance and security. The method comprises steps of: receiving a Save Our Soul (SOS) data packet from a user device, wherein the Save Our Soul (SOS) data packet comprises a destination location of a person of interest; mapping a route from a source location to the destination location; activating an electronic speed controller to initiate a power delivery to motors, such that the motors are adapted to actuate propellers; actuating a flight controller to initiate a flight of an unmanned air vehicle from the source location to the destination location on the mapped route; and enabling a camera for capturing visuals of the person of interest at the destination location.
[009] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide an over the air surveillance and security system.
[0010] Next, embodiments of the present application may provide an over the air surveillance and security system that provides an instant response to threats, arriving quickly at the scene when a Save Our Soul (SOS) signal is activated, bypassing delays associated with manual intervention or waiting for emergency services.
[0011] Next, embodiments of the present application may provide an over the air surveillance and security system that can swiftly reach the location of the person in distress, potentially reducing the response time significantly compared to traditional emergency services.
[0012] Next, embodiments of the present application may provide an over the air surveillance and security system that offers real-time situational awareness to authorities, enabling them to assess the situation accurately and respond more effectively.
[0013] Next, embodiments of the present application may provide an over the air surveillance and security system that alerts nearby authorities and prompts residents to come to the aid of the person in distress, fostering community involvement.
[0014] Next, embodiments of the present application may provide an over the air surveillance and security system that creates a comprehensive system for proactive intervention in women's safety, going beyond individual safety devices or mobile apps.
[0015] Next, embodiments of the present application may provide an over the air surveillance and security system that serves as visible deterrents to potential harassers, potentially preventing the escalation of threatening situations.
[0016] Next, embodiments of the present application may provide an over the air surveillance and security system that is scalable and can be deployed in various environments, such as public spaces, campuses, and urban areas, allowing for wide-reaching application.
[0017] Next, embodiments of the present application may provide an over the air surveillance and security system that leverages advanced technology and encourages community engagement, enhancing safety in public spaces and addressing common concerns like street harassment and violence against women.
[0018] Next, embodiments of the present application may provide an over the air surveillance and security system that operates independently once the Save Our Soul (SOS) signal is triggered, reducing the burden on the person in distress.
[0019] Next, embodiments of the present application may provide an over the air surveillance and security system that can be integrated into broader safety networks, providing coverage across multiple locations and offering a more holistic approach to women's safety.
[0020] These and other advantages will be apparent from the present application of the embodiments described herein.
[0021] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0023] FIG. 1A illustrates a block diagram of an over the air surveillance and security system, according to an embodiment of the present invention;
[0024] FIG. 1B illustrates a top view of an unmanned air vehicle, according to an embodiment of the present invention;
[0025] FIG. 1C illustrates a front view of the unmanned air vehicle, according to an embodiment of the present invention;
[0026] FIG. 1D illustrates a back view of the unmanned air vehicle, according to an embodiment of the present invention;
[0027] FIG. 1E illustrates a right view of the unmanned air vehicle, according to an embodiment of the present invention;
[0028] FIG. 1F illustrates a left view of the unmanned air vehicle, according to an embodiment of the present invention;
[0029] FIG. 1G illustrates a bottom view of the unmanned air vehicle, according to an embodiment of the present invention;
[0030] FIG. 2 illustrates a block diagram of a control unit of the over the air surveillance and security system, according to an embodiment of the present invention; and
[0031] FIG. 3 depicts a flowchart of a method for conducting an over the air surveillance and security using the over the air surveillance and security system, according to an embodiment of the present invention.
[0032] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words "include", "including", and "includes" mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0033] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0034] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having" and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0035] As used herein, the singular forms "a", "an", and "the" designate both the singular and the plural, unless expressly stated to designate the singular only.
[0036] FIG. 1A illustrates a block diagram of an over the air surveillance and security system 100 (hereinafter referred to as the system 100), according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may comprise an unmanned air vehicle 102. The unmanned air vehicle 102 may be dispatched by the system 100 to a destination location upon receipt of a Save Our Soul (SOS) data packet from a person of interest. The system 100 may be capable of continuous aerial monitoring and surveillance of the person of interest. The system 100 may further be configured to audibly alert the community in the vicinity of the person of interest for intervention in times of distress. In an embodiment of the present invention, the person of interest may be a woman, a child, an elderly individual, or any other individual whose safety or wellbeing requires specific attention.
[0037] In an embodiment of the present invention, the unmanned air vehicle 102 may be, but not limited to, a Single-engine land (SEL), a Multi-engine land (MEL), a Single-engine Sea (SES), a Multi-engine sea (MES), a Helicopter, a Gyroplane, a Tiltrotor, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the unmanned air vehicle 102, including known, related art, and/or later developed technologies.
[0038] According to an embodiment of the present invention, the unmanned air vehicle 102 may comprise a frame 104. The frame 104 may be adapted to provide an integral strength to the unmanned air vehicle 102. In an embodiment of the present invention, a dimension of the frame 104 may be in a range from 400 millimetres (mm) to 500 millimetres (mm). In a preferred embodiment of the present invention, the dimension of the frame 104 may be 450 millimetres (mm). Embodiments of the present invention are intended to include or otherwise cover any dimension of the frame 104.
[0039] Further, the frame 104 may provide a housing to components of the unmanned air vehicle 102. The components of the unmanned air vehicle 102 housed in the frame 104 may be, but not limited to, propellers 106a-106d, motors 108a-108d, an electronic speed controller 110, a camera 112, a relay 114, a siren 116, a control unit 118, a flight controller 120, and a power supply unit 122.
[0040] In an embodiment of the present invention, the propellers 106a-106d may be adapted to create a thrust to lift the unmanned air vehicle 102. In an embodiment of the present invention, the propellers 106a-106d may further be explained in conjunction with FIG. 1B.
[0041] In an embodiment of the present invention, the motors 108a-108d may be adapted to drive the propellers 106a-106d. The motors 108a-108d may rotate the propellers 106a-106d at a high speed. The high-speed rotation of the propellers 106a-106d by the motors 108a-108d may create the thrust that may in turn lift the unmanned air vehicle 102. The motors 108a-108d may be diagonally arranged on the frame 104 of the unmanned air vehicle 102. Further, each of the motors 108a-108d may be individually connected to one of the propellers 106a-106d. Moreover, the diagonal arrangement of the propellers 106a-106d and the motors 108a-108d may equibalance the frame 104, thereby equibalancing the unmanned air vehicle 102.
[0042] In an embodiment of the present invention, a velocity constant (KV rating) of the motors 108a-108d may be in a range from 900 KV to 1000 KV. In a preferred embodiment of the present invention, the KV rating of the motors 108a-108d may be 935 KV. Embodiments of the present invention are intended to include or otherwise cover any KV rating of the motors 108a-108d.
[0043] In an embodiment of the present invention, the electronic speed controller 110 may be adapted to control the rotations of the motors 108a-108d. The manipulation of the rotations of the motors 108a-108d may in turn control the rotations of the propellers 106a-106d. The manipulation of the rotations of the motors 108a-108d and of the propellers 106a-106d may finely control a speed, a velocity, a turning radius, and a rotation of the unmanned air vehicle 102.
[0044] In an embodiment of the present invention, the electronic speed controller 110 may be, but not limited to, a brushless electronic speed controller, a brushed electronic speed controller, and so forth. In a preferred embodiment of the present invention, the electronic speed controller 110 may be a BLHeli 30A speed controller. Embodiments of the present invention are intended to include or otherwise cover any type of the electronic speed controller 110, including known, related art, and/or later developed technologies.
[0045] In an embodiment of the present invention, the camera 112 may be adapted to capture visuals of the person of interest while on surveillance at the destination location. In an embodiment of the present invention, the camera 112 may further be explained in conjunction with FIG. 1C.
[0046] In an embodiment of the present invention, the relay 114 may be adapted to actuate the siren 116. The siren 116 may emit sound waves to audibly alert the person of interest and persons in a proximity of the person of interest to intervene and help the person of interest in the distressed times.
[0047] The relay 114 may actuate the siren 116 when the unmanned air vehicle 102 may have arrived at the destination location, in an embodiment of the present invention. In another embodiment of the present invention, the relay 114 may actuate the siren 116 when the camera 112 may capture or record the visuals of the person of interest while on surveillance.
[0048] According to the other embodiments of the present invention, the relay 114 may be, but not limited to, a coaxial relay, a contactor relay, a force-guided relay, a latching relay, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the relay 114, including known, related art, and/or later developed technologies.
[0049] According to the other embodiments of the present invention, the siren 116 may be, but not limited to, a buzzer, a beeper, a sound unit, a speaker, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the siren 116, including known, related art, and/or later developed technologies.
[0050] In an embodiment of the present invention, the control unit 118 may be connected to the electronic speed controller 110 and the camera 112. The control unit 118 may further be configured to execute computer-executable instructions to generate an output relating to the system 100.
[0051] According to embodiments of the present invention, the control unit 118 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. In a preferred embodiment of the present invention, the control unit 118 may be a Raspberry Pi 4B+. Embodiments of the present invention are intended to include or otherwise cover any type of the control unit 118, including known, related art, and/or later developed technologies. In an embodiment of the present invention, the control unit 118 may further be explained in conjunction with FIG. 2.
[0052] In an embodiment of the present invention, the flight controller 120 may be connected to the electronic speed controller 110. The flight controller 120 may be adapted to fly the unmanned air vehicle 102 on a route mapped by the control unit 118. Further, the flight controller 120 may manage the flight dynamics of the unmanned air vehicle 102 and may integrate various sensors and inputs. The various sensors integrated by the flight controller 120 may enable additional capabilities in the unmanned air vehicle 102. The capabilities enabled by the flight controller 120 may be, but not limited to, an obstacle detection, a route memory, a return-to-home, a self-charging, an aerial tracking of the person of interest, and so forth.
[0053] According to embodiments of the present invention, the flight controller 120 may be, but not limited to, a flyer, a DJI drone control engine, and so forth. In a preferred embodiment of the present invention, the flight controller 120 may be a Pixhawk Cube Orange+. Embodiments of the present invention are intended to include or otherwise cover any type of the flight controller 120, including known, related art, and/or later developed technologies.
[0054] In an embodiment of the present invention, the power supply unit 122 may supply an operational power to the control unit 118. The power supply unit 122 may further supply operational power to the motors 108a-108d, the electronic speed controller 110, the camera 112, and the flight controller 120, in an embodiment of the present invention.
[0055] The power supply unit 122 may comprise a power module. The power module may be configured to distribute power to the components of the unmanned air vehicle 102. The power module may further be configured to regulate a voltage distribution across the components of the unmanned air vehicle 102. The power supply unit 122 may further comprise a Power Distribution Board. The Power Distribution Board may be adapted to ensure that the power supplied by the power supply unit 122 may be evenly distributed across the components of the unmanned air vehicle 102.
[0056] In an exemplary embodiment of the present invention, the power supply unit 122 may provide power from a battery. In an embodiment of the present invention, the battery power supply may be from a rechargeable battery. In another embodiment of the present invention, the battery power supply may be from a non-rechargeable battery. According to embodiments of the present invention, the battery for power supply may be of any composition such as, but not limited to, a Nickel-Cadmium battery, a Nickel-Metal Hydride battery, a Zinc-Carbon battery, a Lithium-Ion battery, and so forth. In a preferred embodiment of the present invention, the power supply unit 122 may have a capacity of 3s 5000 mAh (milli-Ampere hours). Embodiments of the present invention are intended to include or otherwise cover any composition of the battery, including known, related art, and/or later developed technologies.
[0057] In an embodiment of the present invention, a user device 124 may be an electronic device used by the person of interest. The user device 124 may be adapted to enable the person of interest to press a Save Our Soul (SOS) button in/on the user device 124. The Save Our Soul (SOS) button may be a physical button that may be physically mounted on the user device 124, in an embodiment of the present invention. In another embodiment of the present invention, the Save Our Soul (SOS) button may be a virtual button that may be electronically enabled by software installed in the user device 124. Further, upon pressing the Save Our Soul (SOS) button, the user device 124 may transmit the Save Our Soul (SOS) data packet to the control unit 118. The Save Our Soul (SOS) data packet may comprise information such as, but not limited to, a distress signal, the destination location, and so forth.
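By way of non-limiting illustration only, a minimal sketch of how such a Save Our Soul (SOS) data packet could be structured and serialized on the user device 124 is shown below; the field names, the JSON encoding, and the example coordinates are assumptions and are not part of the disclosed application.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SOSDataPacket:
    """Hypothetical SOS data packet sent from the user device (124)."""
    distress: bool      # distress signal flag
    latitude: float     # destination location of the person of interest
    longitude: float
    timestamp: float    # epoch seconds at the moment the SOS button is pressed
    device_id: str      # identifier of the user device (124)

def build_sos_packet(lat: float, lon: float, device_id: str) -> bytes:
    """Serialize the packet as JSON bytes for transmission to the control unit (118)."""
    packet = SOSDataPacket(True, lat, lon, time.time(), device_id)
    return json.dumps(asdict(packet)).encode("utf-8")

# Example: SOS pressed at an assumed location (coordinates are placeholders)
payload = build_sos_packet(16.5449, 81.5212, "user-device-001")
```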
[0058] According to embodiments of the present invention, the user device 124 may be, but not limited to, an electronic wearable device, a mobile phone, a smart phone, a tablet, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the user device 124, including known, related art, and/or later developed technologies.
[0059] In an embodiment of the present invention, a communication unit 126 may enable the user device 124 and the control unit 118 to communicate. The communication may be facilitated using the communication unit 126 by generation and establishment of a communication link, in an embodiment of the present invention. The communication link established by the communication unit 126 may enable a transmission of the Save Our Soul (SOS) data packet from the user device 124 to the unmanned air vehicle 102. According to embodiments of the present invention, the communication unit 126 may be, but not limited to, a Wi-Fi communication unit, a Bluetooth communication unit, a millimeter waves communication unit, an Ultra-High Frequency (UHF) communication unit, and so forth. In a preferred embodiment of the present invention, the communication unit 126 may be a Global System for Mobile Communication (GSM) Subscriber Identity Module (SIM) 800L. Embodiments of the present invention are intended to include or otherwise cover any type of the communication unit 126, including known, related art, and/or later developed technologies.
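The application does not describe the link-level protocol between the user device 124 and the communication unit 126. Assuming the GSM SIM800L module is driven over a serial UART with standard AT commands, the SOS payload could be relayed as a text message roughly as follows; the serial port, baud rate, and phone number are placeholders.

```python
import time
import serial  # pyserial

def send_sms_via_sim800l(port: str, number: str, text: str) -> None:
    """Send an SMS through a SIM800L module using standard AT commands."""
    with serial.Serial(port, baudrate=9600, timeout=5) as modem:
        modem.write(b"AT\r")                      # basic handshake
        time.sleep(0.5)
        modem.write(b"AT+CMGF=1\r")               # switch to SMS text mode
        time.sleep(0.5)
        modem.write(f'AT+CMGS="{number}"\r'.encode())
        time.sleep(0.5)
        modem.write(text.encode() + b"\x1a")      # Ctrl+Z terminates the message
        time.sleep(3)                             # allow the module to send

# Example (placeholders): forward the SOS payload to an assumed base-station number
# send_sms_via_sim800l("/dev/serial0", "+910000000000", payload.decode())
```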
[0060] FIG. 1B illustrates a top view of the unmanned air vehicle 102, according to an embodiment of the present invention. The propellers 106a-106d may be diagonally arranged on the frame 104 of the unmanned air vehicle 102. Moreover, the diagonal arrangement of the propellers 106a-106d and the motors 108a-108d may equibalance the frame 104, thereby equibalancing the unmanned air vehicle 102. In an embodiment of the present invention, a dimension of the propellers 106a-106d may be in a range from 9 inches (in) to 11 inches (in). In a preferred embodiment of the present invention, the dimension of the propellers 106a-106d may be 10 inches (in). Embodiments of the present invention are intended to include or otherwise cover any dimension of the propellers 106a-106d.
[0061] FIG. 1C illustrates a front view of the unmanned air vehicle 102, according to an embodiment of the present invention. In an embodiment of the present invention, the camera 112 may be adapted to rotate at an angle of 90 degrees. In another embodiment of the present invention, the camera 112 may be adapted to rotate at an angle of 180 degrees. In yet another embodiment of the present invention, the camera 112 may be adapted to rotate at an angle of 360 degrees. In a further embodiment of the present invention, the camera 112 may be adapted to rotate at any angle of degrees.
[0062] In an embodiment of the present invention, the camera 112 may be configured to capture the visuals such as images of the person of interest while on surveillance. The camera 112 may further be configured to record the visuals such as video clips of a predefined duration of the person of interest or a surrounding while on surveillance, in an embodiment of the present invention. In an exemplary embodiment of the present invention, the predefined duration of the recorded video clips may be 2 seconds. In another exemplary embodiment of the present invention, the predefined duration of the recorded video clips may be 4 seconds. In yet another embodiment of the present invention, the video clips may be of any duration as defined by a system administrator.
[0063] The camera 112 may also be configured to transmit the captured images and/or videos of the person of interest to law enforcement bodies, in an embodiment of the present invention. In an embodiment of the present invention, the captured images and/or videos of the person of interest may be continuously monitored by the law enforcement bodies. In an embodiment of the present invention, the monitoring may be automated using a computer system. In another embodiment of the present invention, a manual monitoring of the captured images and/or videos of the person of interest may be done by the system administrator.
[0064] According to other embodiments of the present invention, a resolution for the captured images and/or videos of the person of interest using the camera 112 may be in a range from 320 pixels by 240 pixels to 1920 pixels by 1080 pixels. Embodiments of the present invention are intended to include or otherwise cover any resolution for the captured images and/or videos of the person of interest using the camera 112, including known, related art, and/or later developed technologies.
[0065] According to the other embodiments of the present invention, the camera 112 may be, but not limited to, a still camera, a video camera, a color balancer camera, a thermal camera, an infrared camera, a telephoto camera, a wide-angle camera, a macro camera, a Closed-Circuit Television (CCTV) camera, a web camera, and so forth. In a preferred embodiment of the present invention, the camera 112 may be a Raspberry Pi Cam2. Embodiments of the present invention are intended to include or otherwise cover any type of the camera 112, including known, related art, and/or later developed technologies.
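As a non-limiting illustration of the capture/record behaviour described above, assuming the camera 112 is a Raspberry Pi camera driven by the legacy picamera library from the control unit 118, a still image or a short clip of the predefined duration could be obtained roughly as follows; the file paths, durations, and resolution are assumptions.

```python
from picamera import PiCamera

def capture_still(path: str, resolution=(1920, 1080)) -> None:
    """Capture a single still image of the person of interest."""
    camera = PiCamera(resolution=resolution)
    try:
        camera.capture(path)                  # e.g. "frame.jpg"
    finally:
        camera.close()

def record_clip(path: str, duration_s: float = 4.0,
                resolution=(1920, 1080)) -> None:
    """Record one H.264 clip of the predefined duration (use a .h264 path)."""
    camera = PiCamera(resolution=resolution)
    try:
        camera.start_recording(path)          # e.g. "clip.h264"
        camera.wait_recording(duration_s)
        camera.stop_recording()
    finally:
        camera.close()
```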
[0066] FIG. 1D illustrates a back view of the unmanned air vehicle 102, according to an embodiment of the present invention. According to an embodiment of the present invention, the frame 104 of the unmanned air vehicle 102 may be constructed of a material such as, but not limited to, a carbon fiber material, a plastic material, and so forth. Embodiments of the present invention are intended to include or otherwise cover any material of the frame 104, including known, related art, and/or later developed technologies.
[0067] FIG. 1E illustrates a right view of the unmanned air vehicle 102, according to an embodiment of the present invention. In an embodiment of the present invention, the propellers 106a-106d may be driven through the motors 108a-108d. Further, the rotation of the propellers 106a-106d may be controlled and manipulated using the electronic speed controller 110 and the flight controller 120. The controlling and manipulation of the propellers 106a-106d using the electronic speed controller 110 and the flight controller 120 may determine a speed and a direction of the unmanned air vehicle 102.
[0068] FIG. 1F illustrates a left view of the unmanned air vehicle 102, according to an embodiment of the present invention. In an exemplary embodiment, if the unmanned air vehicle 102 tends to turn left, then the electronic speed controller 110 and the flight controller 120 may slow down the rotations of the propeller 106a and the propeller 106b. Simultaneously, the electronic speed controller 110 and the flight controller 120 may increase the rotations of the propeller 106c and the propeller 106d, leading the unmanned air vehicle 102 to tilt left and turn in a left direction.
[0069] FIG. 1G illustrates a bottom view of the unmanned air vehicle 102, according to an embodiment of the present invention. In another exemplary embodiment, if the unmanned air vehicle 102 tends to turn right, then the electronic speed controller 110 and the flight controller 120 may slow down the rotations of the propeller 106c and the propeller 106d. Simultaneously, the electronic speed controller 110 and the flight controller 120 may increase the rotations of the propeller 106a and the propeller 106b, leading the unmanned air vehicle 102 to tilt right and turn in a right direction.
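A minimal sketch of the differential throttle scheme described for FIGs. 1F and 1G is given below; the normalized throttle scale and the mixing gain are assumptions, and a production flight controller would perform this mixing internally.

```python
def mix_roll(base_throttle: float, roll_cmd: float) -> dict:
    """
    Differential throttle for a roll/turn as described for FIGs. 1F-1G.
    roll_cmd < 0 turns left  (slow 106a/106b, speed up 106c/106d);
    roll_cmd > 0 turns right (slow 106c/106d, speed up 106a/106b).
    Throttles are normalized to 0.0-1.0 before being sent to the ESC (110).
    """
    delta = 0.1 * roll_cmd  # assumed mixing gain
    clamp = lambda x: max(0.0, min(1.0, x))
    return {
        "106a": clamp(base_throttle + delta),
        "106b": clamp(base_throttle + delta),
        "106c": clamp(base_throttle - delta),
        "106d": clamp(base_throttle - delta),
    }

# Example: gentle left turn at 60% hover throttle
left_turn = mix_roll(0.60, roll_cmd=-0.5)
```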
[0070] FIG. 2 illustrates a block diagram of the control unit 118 of the system 100, according to an embodiment of the present invention. The control unit 118 may comprise the computer-executable instructions in form of programming modules such as a data receiving module 200, a route mapping module 202, a flying module 204, a data capturing module 206, and an alert module 208.
[0071] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the Save Our Soul (SOS) data packet from the user device 124. The received Save Our Soul (SOS) data packet from the user device 124 may comprise the destination location of the person of interest. Further, as the Save Our Soul (SOS) data packet may be received from the user device 124, the data receiving module 200 may be configured to initiate distress response protocols. As the distress response protocols may be initiated, the data receiving module 200 may transmit the destination location of the person of interest to the route mapping module 202.
[0072] The route mapping module 202 may be activated upon receipt of the destination location of the person of interest from the data receiving module 200. In an embodiment of the present invention, the route mapping module 202 may be configured to map a route from a source location to the destination location. The source location may be a geographical location where a fleet of unmanned air vehicles may be hangared. Further, one unmanned air vehicle 102 from the fleet may be fed with the destination location of the person of interest along with the mapped route to be flown by the unmanned air vehicle 102.
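The routing algorithm is not specified in the application. As one non-limiting illustration, a straight-line route between the source and destination coordinates could be discretized into waypoints roughly as follows; the waypoint spacing and example coordinates are assumptions, and the linear interpolation of latitude/longitude is only an approximation over short distances.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def map_route(src, dst, spacing_m=50.0):
    """Return a list of (lat, lon) waypoints from the source to the destination."""
    dist = haversine_m(*src, *dst)
    n = max(1, int(dist // spacing_m))
    return [(src[0] + (dst[0] - src[0]) * i / n,
             src[1] + (dst[1] - src[1]) * i / n) for i in range(n + 1)]

# Example: route from an assumed base station to the SOS location (placeholders)
route = map_route((16.5400, 81.5200), (16.5449, 81.5212))
```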
[0073] Upon mapping of the route, and transmission of the mapped route to the unmanned air vehicle 102, the route mapping module 202 may be configured to transmit a flying signal to the flying module 204.
[0074] The flying module 204 may be activated upon receipt of the flying signal from the route mapping module 202. In an embodiment of the present invention, the flying module 204 may be configured to activate the electronic speed controller 110 to initiate a power delivery to the motors 108a-108d, so that the motors 108a-108d may actuate the propellers 106a-106d. Further, the flying module 204 may be configured to actuate the flight controller 120 to initiate the flight of the unmanned air vehicle 102 from the source location to the destination location on the mapped route.
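The interface between the control unit 118 and the flight controller 120 is not described in the application. Assuming a MAVLink-capable flight controller such as the Pixhawk Cube Orange+ and the DroneKit Python library running on the control unit 118, arming, takeoff, and waypoint following could be sketched roughly as follows; the connection string, altitude, and pacing are placeholders.

```python
import time
from dronekit import connect, VehicleMode, LocationGlobalRelative

def fly_route(route, altitude_m=30.0, connection="/dev/ttyACM0"):
    """Arm the vehicle, take off, and visit each mapped waypoint in turn."""
    vehicle = connect(connection, wait_ready=True)
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True
    while not vehicle.armed:                 # wait until the ESC/motors are live
        time.sleep(0.5)
    vehicle.simple_takeoff(altitude_m)
    while vehicle.location.global_relative_frame.alt < altitude_m * 0.95:
        time.sleep(1)                        # climb to the target altitude
    for lat, lon in route:                   # follow the route mapped by the control unit
        vehicle.simple_goto(LocationGlobalRelative(lat, lon, altitude_m))
        time.sleep(5)                        # crude pacing; a real system would check arrival
    return vehicle
```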
[0075] As the unmanned air vehicle 102 may arrive at the destination location, the flying module 204 may transmit a surveillance signal to the data capturing module 206.
[0076] The data capturing module 206 may be activated upon receipt of the surveillance signal from the flying module 204. The data capturing module 206 may be configured to enable the camera 112 for capturing the visuals of the person of interest at the destination location. Further, the data capturing module 206 may be configured to enable the camera 112 to actively track the person of interest at the destination location.
[0077] In another embodiment of the present invention, the data capturing module 206 may further be configured to transmit the captured visuals of the person of interest along with an address tag of the destination location to the law enforcement bodies. After capturing/recording the person of interest, and sharing with the law enforcement bodies, the data capturing module 206 may transmit an alert signal to the alert module 208.
[0078] The alert module 208 may be activated upon receipt of the alert signal from the data capturing module 206. In an embodiment of the present invention, the alert module 208 may be configured to actuate the relay 114 for activating the siren 116 when the unmanned air vehicle 102 arrives at the destination location. The siren 116 activated by actuation of the relay 114 may emit sound waves to audibly alert the person of interest and the persons in the proximity of the person of interest to intervene and help the person of interest in times of distress.
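As a non-limiting illustration of the alert module 208, assuming the relay 114 is wired to a GPIO pin of a Raspberry Pi-class control unit 118, the siren 116 could be sounded on arrival roughly as follows; the pin number and duration are assumptions.

```python
import time
import RPi.GPIO as GPIO

RELAY_PIN = 17  # assumed BCM pin driving the relay (114)

def sound_siren(duration_s: float = 10.0) -> None:
    """Energize the relay so the siren (116) sounds for a fixed duration on arrival."""
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)
    try:
        GPIO.output(RELAY_PIN, GPIO.HIGH)   # close the relay contacts
        time.sleep(duration_s)
    finally:
        GPIO.output(RELAY_PIN, GPIO.LOW)    # silence the siren
        GPIO.cleanup(RELAY_PIN)
```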
[0079] FIG. 3 depicts a flowchart of a method 300 for conducting an over the air surveillance and security using the unmanned air vehicle 102, according to an embodiment of the present invention.
[0080] At step 302, the system 100 may enable the unmanned air vehicle 102 to receive the Save Our Soul (SOS) data packet from the user device 124. The Save Our Soul (SOS) data packet comprises a destination location of the person of interest.
[0081] At step 304, the system 100 may enable the unmanned air vehicle 102 to map the route from the source location to the destination location.
[0082] At step 306, the system 100 may enable the unmanned air vehicle 102 to activate the electronic speed controller 110 to initiate the power delivery to the motors 108a-108d. Further, the motors 108a-108d are adapted to actuate the propellers 106a-106d.
[0083] At step 308, the system 100 may enable the unmanned air vehicle 102 to actuate the flight controller 120 to initiate the flight of the unmanned air vehicle 102 from the source location to the destination location on the mapped route.
[0084] At step 310, the system 100 may enable the unmanned air vehicle 102 to enable the camera 112 to capture the visuals of the person of interest at the destination location.
[0085] At step 312, the system 100 may analyze the captured visuals of the person of interest.
[0086] At step 314, the system 100 may detect an emergency or a sense of distress from the analyzed visuals of the person of interest. If the emergency or a sense of distress is detected in the analyzed visuals, then the method 300 may proceed to a step 316. Else, the method 300 may revert to the step 310.
[0087] At step 316, the system 100 may enable the communication unit 126 to transmit the captured visuals of the person of interest to the law enforcement bodies.
[0088] At step 318, the system 100 may enable the unmanned air vehicle 102 to actuate the relay 114 for activating the siren 116 when the emergency may be detected in the visuals of the person of interest.
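Pulling steps 310 to 318 together, the surveillance loop could be sketched as follows; the camera, relay, comms, and detect_emergency objects are hypothetical stand-ins for the modules described above and are not part of the disclosed application.

```python
def surveillance_loop(camera, relay, comms, detect_emergency) -> None:
    """Steps 310-318: capture, analyze, and escalate only when distress is detected."""
    while True:
        clip = camera.record_clip()              # step 310: capture visuals
        if detect_emergency(clip):               # steps 312-314: analyze and decide
            comms.transmit_to_authorities(clip)  # step 316: alert law enforcement bodies
            relay.sound_siren()                  # step 318: activate the siren (116)
        # otherwise the loop reverts to step 310 and keeps capturing
```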
[0089] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0090] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
CLAIMS
I/We Claim:
1. An over the air surveillance and security system (100) integrated with an unmanned air vehicle (102), the system (100) comprising:
propellers (106a-106d) adapted to create a thrust to lift the unmanned air vehicle (102), wherein the propellers are driven using motors (108a-108d), such that rotations of the motors (108a-108d) are controlled using an electronic speed controller (110);
a camera (112) adapted to capture visuals of a person of interest while on surveillance; and
a control unit (118) communicatively connected to the electronic speed controller (110) and to the camera (112), characterized in that the control unit (118) is configured to:
receive a Save Our Soul (SOS) data packet from a user device (124), wherein the Save Our Soul (SOS) data packet comprises a destination location of the person of interest;
map a route from a source location to the destination location;
activate the electronic speed controller (110) to initiate a power delivery to the motors (108a-108d), such that the motors (108a-108d) are adapted to actuate the propellers (106a-106d);
actuate a flight controller (120) to initiate a flight of the unmanned air vehicle (102) from the source location to the destination location on the mapped route;
enable the camera (112) to capture the visuals of the person of interest at the destination location; and
transmit the captured visuals of the person of interest to law enforcement bodies when an emergency is detected.
2. The system (100) as claimed in claim 1, wherein the control unit (118) is configured to actuate a relay (114) for activating a siren (116) when the unmanned air vehicle (102) arrives at the destination location.
3. The system (100) as claimed in claim 1, comprising a communication unit (126) adapted to transmit the Save Our Soul (SOS) data packet from the user device (124) to the unmanned air vehicle (102).
4. The system (100) as claimed in claim 1, wherein the transmission of the Save Our Soul (SOS) data packet is initiated from the user device (124) of the person of interest.
5. The system (100) as claimed in claim 1, wherein the control unit (118) is configured to analyze the captured visuals of the person of interest to detect the emergency.
6. The system (100) as claimed in claim 1, comprising a power supply unit (122) adapted to supply operational power to the components of the unmanned air vehicle (102).
7. The system (100) as claimed in claim 1, wherein the control unit (118) is configured to activate the flight controller (120) to aerially track the person of interest.
8. A method (300) for conducting an over the air surveillance and security, the method (300) is characterized by steps of:
receiving a Save Our Soul (SOS) data packet from a user device (124);
mapping a route from a source location to a destination location of a person of interest;
activating an electronic speed controller (110) to initiate a power delivery to motors (108a-108d) that actuate propellers (106a-106d);
actuating a flight controller (120) to initiate a flight of an unmanned air vehicle (102) from the source location to the destination location on the mapped route;
enabling a camera (112) to capture visuals of the person of interest at the destination location; and
transmitting the captured visuals of the person of interest to law enforcement bodies when an emergency is detected.
9. The method (300) as claimed in claim 8, further comprising a step of analyzing the captured visuals of the person of interest to detect the emergency.
10. The method (300) as claimed in claim 8, further comprising a step of actuating a relay (114) for activating a siren (116) when the unmanned air vehicle (102) arrives at the destination location.
Date: November 19, 2024
Place: Noida
Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant
Documents
Name | Date |
---|---|
202441091303-COMPLETE SPECIFICATION [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-DECLARATION OF INVENTORSHIP (FORM 5) [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-DRAWINGS [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-EDUCATIONAL INSTITUTION(S) [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-FORM 1 [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-FORM FOR SMALL ENTITY(FORM-28) [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-FORM-9 [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-OTHERS [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-POWER OF AUTHORITY [23-11-2024(online)].pdf | 23/11/2024 |
202441091303-REQUEST FOR EARLY PUBLICATION(FORM-9) [23-11-2024(online)].pdf | 23/11/2024 |