LIDAR-BASED DIMENSIONAL INSPECTION DEVICE


Application Type: ORDINARY APPLICATION

Status: Published

Filed on 30 October 2024

Abstract

The present disclosure relates to an inspection device (100) that includes a Y-direction stand (102-1) configured to stabilize a first sensor during movement along a Y-axis and an X-direction stand (102-2) configured to support the movement of a second sensor along an X-axis, wherein the corresponding sensors emit laser pulses and measure the time of flight of reflections to determine distances within a 2D plane. A first servo motor (110-1) is integrated with the Y-direction stand for driving vertical movement of the first sensor, enabling 180-degree rotation, and a second servo motor (110-2) is integrated with the X-direction stand for driving horizontal movement of the second sensor, enabling 360-degree rotation. A spur gear mechanism (112) transmits rotational motion from the corresponding servo motors to the respective stands, wherein the combination of 360-degree rotation along the x-axis and 180-degree rotation along the y-axis allows for three-dimensional data acquisition and enhanced spatial coverage.

Patent Information

Application ID: 202441083419
Invention Field: ELECTRONICS
Date of Application: 30/10/2024
Publication Number: 45/2024

Inventors

Name | Address | Country | Nationality
T. SHANMUGAPRIYAN | PG Student, School of Mechanical Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India
S. EDWARD JERO | Assistant Professor, School of Computer Science Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India
G. MURALI MOHAN | Associate Professor, School of Mechanical Engineering, Vellore Institute of Technology, Chennai, Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India

Applicants

Name | Address | Country | Nationality
VELLORE INSTITUTE OF TECHNOLOGY, CHENNAI | Vandalur - Kelambakkam Road, Chennai, Tamil Nadu - 600127, India. | India | India

Specification

Description:

TECHNICAL FIELD
[0001] The present disclosure relates, in general, to Light Detection and Ranging (LiDAR) systems, and more specifically, relates to a dimensional inspection device utilizing a two-dimensional LiDAR sensor integrated with a robust support structure for enhanced stability, accuracy, and versatility in capturing three-dimensional environmental data.

BACKGROUND
[0002] Traditional 2D image sensors are commonly employed in cameras and imaging devices to capture visual information in a flat, two-dimensional format. However, these sensors lack the capability to capture depth information, which makes it difficult to discern spatial relationships between objects in a scene. This limitation hampers context understanding, particularly in applications that require awareness of object distance and spatial interaction, such as robotics and augmented reality.
[0003] Stereo vision systems, which utilize two or more cameras positioned at different angles, have been introduced to infer depth by analyzing the disparities between captured images. However, these systems are subject to several drawbacks, including the need for complex calibration and precise alignment of cameras, which can be time-consuming and prone to error. Additionally, stereo vision systems struggle with textureless or low-contrast surfaces and variations in lighting conditions, affecting the accuracy of depth perception. Furthermore, the computational demands for calculating depth in real-time can burden system performance, particularly in high-speed applications.
[0004] Structured light systems offer another approach to depth sensing by projecting a known light pattern onto a scene and capturing the deformation of the pattern to infer depth information. Despite their utility, these systems are highly sensitive to ambient lighting conditions, which can interfere with the projected pattern and degrade performance in variable environments. Reflective or transparent surfaces further complicate the capture of depth information, as they can distort the light pattern. Moreover, structured light systems require a clear line of sight to the object, limiting their effectiveness in cluttered or occluded environments.
[0005] Time-of-Flight (ToF) cameras represent a more advanced depth-sensing technology by measuring the time it takes for light pulses to travel from the camera to an object and back. This data is used to calculate depth with a relatively high degree of precision. However, ToF cameras are also subject to several limitations, including susceptibility to interference from strong ambient light, which can reduce their accuracy. Additionally, the range and accuracy of ToF cameras decrease with distance, making them less effective in long-range applications. The relatively high cost of ToF cameras further limits their widespread adoption, particularly in cost-sensitive industries.
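The time-of-flight principle described above reduces to a single calculation: the emitted pulse travels to the target and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal Python sketch of this arithmetic (illustrative only; the sample timing below is a generic physics example, not a figure taken from any particular sensor):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """One-way distance from a time-of-flight measurement.

    The laser pulse travels out to the target and back, so the
    distance to the target is c * t / 2.
    """
    return C * round_trip_time_s / 2.0

# A round trip of about 66.7 nanoseconds corresponds to roughly 10 m.
distance_m = tof_distance(66.7e-9)
```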
[0006] Existing systems suffer from limitations that include a lack of depth information in 2D sensors, complex calibration and setup in stereo vision systems, sensitivity to ambient light and surface issues in structured light and ToF systems, high cost and bulkiness of traditional LiDAR systems, computational load and real-time performance constraints, and limited context understanding in individual technologies.
[0007] Therefore, it is desired to overcome the drawbacks, shortcomings, and limitations associated with existing solutions, and develop an improved device that can provide accurate, reliable, and efficient three-dimensional data capture in various environments and conditions.

OBJECTS OF THE PRESENT DISCLOSURE
[0008] An object of the present disclosure is to provide a device that offers extensive rotational flexibility, allowing 360° rotation on the x-axis and 180° rotation on the y-axis, enhancing spatial coverage for 3D scanning.
[0009] Another object of the present disclosure is to provide a device that integrates LiDAR point clouds with visual data, improving spatial understanding by aligning depth information with image planes.
[0010] Another object of the present disclosure is to provide a device that features a compact and portable design, enabling easy deployment and transportation across various operational settings.
[0011] Another object of the present disclosure is to provide a device that visualizes LiDAR data as 3D point clouds and overlays them on 2D images, ensuring precise environmental mapping for enhanced decision-making.
[0012] Another object of the present disclosure is to provide a device that includes modular components and user-friendly interfaces, allowing for quick setup and adjustment to suit different applications.
[0013] Yet another object of the present disclosure is to provide a highly versatile device, making it suitable for applications in robotics, automated driving, surveillance, industrial automation, and augmented reality, offering wide-ranging benefits across industries.

SUMMARY
[0014] The present disclosure relates to LiDAR systems and more specifically, relates to a dimensional inspection device. The main objective of the present disclosure is to overcome the drawbacks, limitations, and shortcomings of the existing device and solution, by providing a dimensional inspection device for capturing accurate three-dimensional data using a 2D LiDAR device, wherein the 2D LiDAR device is configured with an initial degree of freedom, enabling it to measure distance within a two-dimensional plane by emitting laser pulses and measuring the time of flight for reflections. The device is further configured to rotate 360 degrees on the x-axis and 180 degrees on the y-axis, thereby expanding the coverage area of the LiDAR device significantly. The enhanced design of the LiDAR device facilitates extensive rotational capabilities, providing expanded coverage in multiple dimensions. Additionally, the device is compact and portable, rendering it suitable for diverse applications and easy deployment in varying environments. The device is further configured to visualize LiDAR data as three-dimensional point clouds, align LiDAR points with image planes, and overlay the LiDAR points on two-dimensional images to improve spatial understanding. The proposed device provides a robust solution for industries such as robotics, automated driving, surveillance, industrial automation, and augmented reality, offering enhanced perception and decision-making in real-world applications.
[0015] The present disclosure relates to an inspection device comprising a Y-direction stand configured to stabilize a first sensor during movement along a Y-axis and an X-direction stand configured to support movement of a second sensor along an X-axis, wherein the corresponding sensors emit laser pulses and measure the time of flight of reflections to determine distances within a two-dimensional (2D) plane. The device further includes a first servo motor integrated with the Y-direction stand, enabling 180-degree vertical rotation of the first sensor, and a second servo motor integrated with the X-direction stand, enabling 360-degree horizontal rotation of the second sensor. A spur gear mechanism transmits rotational motion from the respective servo motors to the corresponding stands. The device also includes an extender connecting the first and second sensors to a moving platform, thereby allowing increased reach and flexibility for scanning operations.
[0016] The combination of 360-degree rotation along the x-axis and 180-degree rotation along the y-axis enables three-dimensional data acquisition and enhanced spatial coverage. In one embodiment, the first and second sensors are light detection and ranging (LiDAR) sensors. The first servo motor in the Y-direction stand is a high-precision motor capable of operating at an average speed of approximately 353.77 degrees per second, while the second servo motor for the X-direction stand enables rapid adjustments, allowing full 360-degree rotation in approximately 1.0188 seconds when powered at 4.8V.
[0017] The spur gear mechanism is configured to minimize backlash and optimize torque transfer for smooth motion transfer. Additionally, a protective top lid shields the sensors from environmental factors to ensure reliable operation. The Y-direction and X-direction stands together provide comprehensive spatial data capture capabilities. The device also includes an electronic case to protect the electronic components of the sensors from environmental damage. In a further embodiment, the spur gear mechanism comprises a driving gear operatively connected to the Y-direction stand, enabling 180-degree rotation about the x-axis in a y-plane, and a driven gear operatively connected to the X-direction stand, enabling 360-degree rotation about the y-axis in an x-plane.
[0018] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The following drawings form part of the present specification and are included to further illustrate aspects of the present disclosure. The disclosure may be better understood by reference to the drawings in combination with the detailed description of the specific embodiments presented herein.
[0020] FIG. 1A illustrates an exemplary assembly for a dimensional inspection device, in accordance with an embodiment of the present disclosure.
[0021] FIG. 1B illustrates an exemplary Y-direction assembly for a LiDAR device, in accordance with an embodiment of the present disclosure.
[0022] FIG. 1C illustrates an exemplary X-direction assembly for a LiDAR device, in accordance with an embodiment of the present disclosure.
[0023] FIGs. 2A to 2C illustrate different views of a LiDAR device, in accordance with an embodiment of the present disclosure.
[0024] FIGs. 3A to 3H illustrate exploded views of a LiDAR device, in accordance with an embodiment of the present disclosure.
[0025] FIG. 4 illustrates geometric calculation for rotation, in accordance with an embodiment of the present disclosure.
[0026] FIGs. 5A and 5B illustrate the Y-direction assembly and the X-direction assembly of the device with dimensions, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0027] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. If the specification states a component or feature "may", "can", "could", or "might" be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0028] As used in the description herein and throughout the claims that follow, the meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[0029] The problem of traditional 2D image sensors capturing flat, two-dimensional representations without depth information is solved by integrating LiDAR technology, which provides precise depth measurements. The combination of LiDAR data with 2D images results in a comprehensive three-dimensional understanding of the environment, crucial for applications such as autonomous navigation, 3D modeling, and augmented reality.
[0030] The complexity and time-consuming nature of aligning and calibrating multiple cameras in stereo vision devices, which can lead to inaccurate depth measurements, is mitigated by employing a single, compact LiDAR sensor, i.e., the TF Luna Mini-S, in conjunction with a standard 2D camera. The TF Luna Mini-S is a compact Time-of-Flight (ToF) LiDAR sensor. This integration simplifies the overall device design and enhances reliability by reducing the complexity of setup and calibration. The sensitivity of structured light devices to ambient lighting conditions, coupled with the challenges faced by both structured light and Time-of-Flight (ToF) devices in accurately measuring depth on reflective or transparent surfaces, is addressed by utilizing LiDAR technology, which operates effectively in various lighting conditions. The combination of LiDAR with a 2D camera allows for improved handling of reflective surfaces, ensuring more reliable performance across different environments.
[0031] The high cost and bulkiness associated with traditional LiDAR devices, which limit their use in smaller or cost-sensitive applications, are overcome by employing the compact and affordable TF Luna Mini-S LiDAR sensor. This solution renders the integrated device cost-effective and practical for a broader range of applications, promoting wider adoption in fields such as robotics, automated driving, and surveillance. The computational intensity of depth calculation in stereo vision devices and data processing in structured light and ToF devices, which negatively impacts real-time performance, is alleviated through the use of efficient algorithms in Python and OpenCV for data processing and synchronization. This ensures real-time performance critical for applications requiring immediate feedback, such as autonomous navigation and real-time surveillance. The limitation of individual technologies, such as LiDAR or 2D cameras, which either provide depth information or visual context but not both, is resolved by integrating LiDAR's depth data with visual information from 2D cameras. This fusion provides a holistic view of the environment, enhancing context understanding and spatial awareness, thereby improving decision-making in complex scenarios such as autonomous vehicles and robotic devices. The present disclosure can be described in enabling detail in the following examples, which may represent more than one embodiment of the present disclosure.
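The fusion step described above, aligning LiDAR depth points with a 2D image plane, amounts to a pinhole-camera projection. The sketch below is an illustration rather than the disclosure's actual implementation; the intrinsic parameters fx, fy, cx, cy are hypothetical values introduced here for the example:

```python
def project_point(x: float, y: float, z: float,
                  fx: float, fy: float, cx: float, cy: float):
    """Project a 3-D point (camera frame, metres) onto the image plane
    using a pinhole model: u = fx*x/z + cx, v = fy*y/z + cy."""
    if z <= 0:
        return None  # point is behind the camera; nothing to overlay
    return (fx * x / z + cx, fy * y / z + cy)

# With hypothetical intrinsics, a point on the optical axis lands at
# the principal point (cx, cy).
uv = project_point(0.0, 0.0, 2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

In an OpenCV-based pipeline of the kind the disclosure mentions, this same mapping is typically performed with cv2.projectPoints after extrinsic calibration between the LiDAR and camera frames.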
[0032] The advantages achieved by the device of the present disclosure can be clear from the embodiments provided herein. The present disclosure addresses the limitation of traditional 2D image sensors, which capture two-dimensional representations of a scene and lack depth information, by integrating LiDAR technology to provide precise depth measurements, resulting in a comprehensive three-dimensional understanding of the environment, which is essential for applications such as autonomous navigation, 3D modeling, and augmented reality. The present disclosure provides a solution to the complex calibration and setup required in stereo vision devices, which involve precise alignment of multiple cameras, by utilizing a single LiDAR sensor, such as the TF Luna Mini-S, in conjunction with a standard 2D camera, simplifying device design, reducing calibration complexity, and enhancing device reliability. The present disclosure addresses the sensitivity of structured light and ToF devices to ambient lighting conditions and reflective or transparent surfaces by incorporating LiDAR technology, which is less affected by ambient light, thereby enabling the device to operate effectively in diverse lighting conditions and providing reliable performance in handling reflective surfaces. The present disclosure provides a cost-effective and compact alternative to traditional bulky and expensive LiDAR devices by employing the TF Luna Mini-S LiDAR sensor, enabling the use of LiDAR technology in smaller or cost-sensitive applications such as robotics, automated driving, and surveillance, thereby promoting wider adoption across various fields. 
The present disclosure addresses the computational intensity and real-time performance issues of stereo vision devices and structured light or ToF devices by implementing efficient algorithms using Python and OpenCV for data processing and synchronization, thereby ensuring real-time performance, which is critical for applications requiring immediate feedback, such as autonomous navigation and real-time surveillance. The present disclosure enhances overall context understanding and spatial awareness by integrating depth information from LiDAR with visual data from 2D cameras, providing a holistic view of the environment and improving decision-making processes in complex scenarios such as autonomous vehicles and robotic devices.
[0033] The description of terms and features related to the present disclosure shall be clear from the embodiments that are illustrated and described; however, the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents of the embodiments are possible within the scope of the present disclosure. Additionally, the invention can include other embodiments that are within the scope of the claims but are not described in detail with respect to the following description.
[0034] FIG. 1A illustrates an exemplary assembly for a dimensional inspection device, in accordance with an embodiment of the present disclosure.
[0035] Referring to FIG. 1A, inspection device 100 (also referred to as device 100, herein) can include a Y-direction stand 102-1 configured to stabilize a first sensor 104-1 during movement along a Y-axis and an X-direction stand 102-2 configured to support movement of a second sensor 104-2 along an X-axis, wherein the corresponding sensors (104-1, 104-2) emit laser pulses and measure the time of flight of reflections to determine distances within a two-dimensional (2D) plane. The device 100 further includes a first servo motor 110-1 integrated with the Y-direction stand 102-1, enabling 180-degree vertical rotation of the first sensor 104-1, and a second servo motor 110-2 integrated with the X-direction stand 102-2, enabling 360-degree horizontal rotation of the second sensor 104-2.
[0036] A spur gear mechanism 112 transmits rotational motion from the respective servo motors (110-1, 110-2) to the corresponding stands. The device 100 also includes an extender 108 connecting the first and second sensors (104-1, 104-2) to a moving platform, thereby allowing increased reach and flexibility for scanning operations. The combination of 360-degree rotation along the x-axis and 180-degree rotation along the y-axis enables three-dimensional data acquisition and enhanced spatial coverage.
[0037] In one embodiment, the first and second sensors (104-1, 104-2) are light detection and ranging (LiDAR) sensors. The first servo motor 110-1 in the Y-direction stand 102-1 is a high-precision motor capable of operating at an average speed of approximately 353.77 degrees per second, while the second servo motor 110-2 for the X-direction stand 102-2 enables rapid adjustments, allowing full 360-degree rotation in approximately 1.0188 seconds when powered at 4.8V. The spur gear mechanism 112 is configured to minimize backlash and optimize torque transfer for smooth motion transfer. Additionally, a protective top lid 114 shields the sensors from environmental factors to ensure reliable operation. The Y-direction and X-direction stands together provide comprehensive spatial data capture capabilities. The device 100 also includes an electronic case 116 to protect the electronic components of the sensors from environmental damage. In a further embodiment, the spur gear mechanism 112 can include a driving gear operatively connected to the Y-direction stand 102-1, enabling 180-degree rotation about the x-axis in a y-plane, and a driven gear operatively connected to the X-direction stand 102-2, enabling 360-degree rotation about the y-axis in an x-plane.
[0038] The device 100 can include a Y-direction assembly capable of facilitating 180-degree rotation about the x-axis in the y-plane and an X-direction assembly designed for 360-degree rotation about the y-axis in the x-plane. The Y-direction assembly is equipped with a motor mechanism that allows for precise rotational movement, enabling the LiDAR device to effectively capture detailed three-dimensional information from various angles. Meanwhile, the X-direction assembly enhances spatial coverage by providing full rotational capabilities, thereby significantly improving data acquisition efficiency.
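Combining a single range reading with the pan angle of the X-direction assembly (0 to 360 degrees) and the tilt angle of the Y-direction assembly (0 to 180 degrees) yields a 3D point via the standard spherical-to-Cartesian conversion. A minimal sketch of this step (the axis convention here is an assumption; the disclosure does not fix one):

```python
import math

def reading_to_point(r: float, pan_deg: float, tilt_deg: float):
    """Convert one LiDAR range reading plus the pan/tilt angles of the
    two stands into a Cartesian point (x, y, z)."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    x = r * math.sin(tilt) * math.cos(pan)
    y = r * math.sin(tilt) * math.sin(pan)
    z = r * math.cos(tilt)
    return (x, y, z)

# Sweeping pan through 360 degrees and tilt through 180 degrees covers
# the full sphere of directions around the device.
point = reading_to_point(1.0, pan_deg=45.0, tilt_deg=90.0)
```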
[0039] FIG. 1B illustrates an exemplary Y-direction assembly for a LiDAR device, in accordance with an embodiment of the present disclosure. Referring to FIG. 1B, the Y-direction assembly for the LiDAR device 100 comprises the servo motor 110-1 and the stand 102-1 and facilitates 180-degree rotation along the y-axis, thereby complementing the X-direction assembly and further enhancing spatial coverage and data acquisition capabilities.
[0040] The servo motor 110-1 is integrated into the Y-direction assembly and operates at an average speed of approximately 353.77 degrees per second, enabling rapid and accurate adjustments necessary for effective data capture. This motor 110-1 is capable of achieving a 180-degree rotation in approximately 0.5094 seconds when powered at 4.8V, ensuring quick response times during operation. In this assembly, the servo motor 110-1 is directly connected to the stand, providing a straightforward and effective means of achieving rotational motion. This direct connection allows for efficient torque transfer from the motor to the stand, ensuring smooth and reliable rotation without the need for additional mechanical components, such as spur gears. The design of the Y-direction assembly enables partial rotation in the vertical plane, allowing the LiDAR device to scan and capture detailed three-dimensional information from various angles. This capability enhances the overall functionality of the LiDAR device, making it a robust tool for applications requiring comprehensive spatial data acquisition.
[0041] The spur gear mechanism 112 incorporates a combination of gears that facilitate dual-axis rotation, ensuring accurate positioning of the LiDAR sensor in both the horizontal and vertical planes. The arrangement of these gears is optimized for efficient torque transfer and minimal friction, thereby enhancing the responsiveness and stability of the LiDAR device during operation.
[0042] FIG. 1C illustrates an exemplary X-direction assembly for a LiDAR device, in accordance with an embodiment of the present disclosure. Referring to FIG. 1C, the X-direction assembly for the LiDAR device 100 comprises the second servo motor 110-2, a spur gear mechanism 112, and a rotating platform, which together enable full 360-degree rotation along the x-axis, thereby significantly enhancing the spatial coverage and data acquisition capabilities of the LiDAR device.
[0043] In an embodiment, the second servo motor 110-2 is configured to operate at an average speed of approximately 353.77 degrees per second, facilitating rapid and accurate adjustments necessary for efficient scanning. In an exemplary embodiment, the second servo motor 110-2 can be an MG995 servo motor. This second motor 110-2 is capable of achieving a full 360-degree rotation in approximately 1.0188 seconds when powered at 4.8V, ensuring swift and effective scanning of the environment.
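The timing figures quoted for the servo motors follow directly from the angular speed. A quick Python check (dividing 360 degrees by 353.77 degrees per second gives about 1.018 s, consistent with the approximately 1.0188 s quoted above to within rounding of the quoted numbers):

```python
def rotation_time_s(angle_deg: float, speed_deg_per_s: float) -> float:
    """Time for a servo to sweep a given angle at constant angular speed."""
    return angle_deg / speed_deg_per_s

# At roughly 353.77 deg/s, a full 360-degree sweep takes about 1.02 s
# and a 180-degree sweep about 0.51 s, matching the figures quoted here.
t_full = rotation_time_s(360.0, 353.77)
t_half = rotation_time_s(180.0, 353.77)
```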
[0044] The spur gear mechanism 112 is integrated into the assembly to transmit rotational motion from the MG995 servo motor 110-2 to the rotating platform. These spur gears mesh seamlessly to efficiently transfer torque, thereby ensuring smooth and reliable movement during operation. The spur gear mechanism 112 operates about the y-axis in the x-plane, translating the rotational motion provided by the servo motor 110-2 into precise control over the positioning of the rotating platform. The structure ensures stability and alignment during the rotation process, minimizing any potential deviations that may affect data accuracy. By combining these components, the X-direction assembly allows for comprehensive spatial data capture from all directions, making the LiDAR device a versatile and powerful tool for various applications, including robotics, automated driving, surveillance, industrial automation, and augmented reality.
[0045] The spur gear mechanism 112 includes a series of interconnected spur gears, which efficiently transmit rotational motion from the second servo motor 110-2 to the LiDAR sensor, facilitating accurate positioning and alignment during operation. The design of the spur gear mechanism 112 ensures minimal backlash and optimal torque transfer, thereby enhancing the overall performance and reliability of the LiDAR device in capturing three-dimensional data.
[0046] FIGs. 2A to 2C illustrate different views of a LiDAR device, in accordance with an embodiment of the present disclosure. FIG. 2A illustrates an isometric view of the LiDAR device assembly, highlighting the structural components and arrangement of the Y-direction and X-direction assemblies. FIG. 2B presents a front view of the LiDAR device assembly, showcasing the frontal configuration and alignment of the motor mechanism and sensor housing, thereby illustrating the accessibility of the components for maintenance and calibration.
[0047] FIG. 2C displays a side view of the LiDAR device assembly, further demonstrating the compact design and spatial relationships between the various assemblies, including the servo motors and support structure. This comprehensive visual representation emphasizes the assembly's functional design, ensuring effective operation for capturing three-dimensional data across various applications, such as robotics, automated driving, surveillance, industrial automation, and augmented reality.
[0048] FIGs. 3A to 3H illustrate exploded views of a LiDAR device, in accordance with an embodiment. Referring to FIG. 3A, the Y-direction stand 102-1 comprises a robust vertical support structure configured to stabilize the first LiDAR sensor 104-1 during movement along the Y-axis. The Y-direction stand 102-1 ensures precise vertical alignment while minimizing vibration, providing a secure base for the vertical motion device. It is designed to accommodate the weight and dynamic forces exerted by the moving sensor, thereby enhancing stability and accuracy. The stand 102-1 is typically constructed from durable materials, such as aluminum or steel, to ensure long-term structural integrity and reliable performance.
[0049] FIG. 3B depicts the X-direction stand 102-2, which comprises a horizontal support structure configured to provide stability for the LiDAR sensor's movement along the X-axis. The stand 102-2 ensures accurate lateral positioning with minimal deflection, thereby maintaining precise alignment during horizontal movements. Constructed from sturdy materials, the stand 102-2 is configured to support the second sensor 104-2 and its associated components, integrating seamlessly with the overall device to enhance structural integrity and operational reliability.
[0050] FIG. 3C depicts the extender 108 of the LiDAR sensors (104-1, 104-2), which comprises a mechanical arm or extension configured to connect the LiDAR sensor to the moving platform. The extender 108 allows for increased reach and flexibility in positioning the sensor, ensuring that the sensor can cover a wider area during scanning operations. Designed to maintain stability and precision, the extender 108 is often adjustable to accommodate varying operational requirements, thereby enhancing the device's overall adaptability and functionality.
[0051] FIG. 3D depicts the first servo motor 110-1 for the Y-direction, a high-precision motor configured to drive the vertical movement of the LiDAR sensor. The motor 110-1 enables accurate adjustments along the Y-axis, providing fine control for precise positioning within the vertical plane. The servo motor 110-1 is programmable for various motion profiles, ensuring smooth and reliable operation under diverse conditions, and is critical for maintaining accurate vertical alignment of the LiDAR sensor.
[0052] FIG. 3E depicts the second servo motor 110-2 for the X-direction, a high-precision motor configured to drive the horizontal movement of the LiDAR sensor. The motor provides controlled and accurate positioning along the X-axis, enabling fine-tuned adjustments during scanning processes. It is typically programmable for various motion control profiles, ensuring consistent performance and smooth operation in horizontal alignment and movement of the LiDAR sensor.
[0053] As shown in FIG. 3F, the spur gear mechanism 112 can include a driven spur gear connected to the X-direction stand 102-2 and a driving spur gear connected to the Y-direction stand 102-1. The driven spur gear for the X-direction is configured to transmit rotational motion from the servo motor along the X-axis. This arrangement ensures smooth and efficient motion transfer, handling the required torque and speed with minimal backlash. The small spur gear, functioning as the driving gear, is connected to the servo motor, while the large spur gear, acting as the driven gear, is connected to the X-direction stand, facilitating 360-degree rotation about the y-axis in the x-plane.
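The small-driving/large-driven gear pairing described above trades speed for torque in proportion to the tooth counts. A short sketch of the relationship (the 20- and 40-tooth counts are hypothetical examples introduced here; the disclosure does not specify tooth numbers):

```python
def output_speed(input_speed_dps: float,
                 driving_teeth: int, driven_teeth: int) -> float:
    """Angular speed of the driven gear: a spur-gear pair scales speed
    by driving_teeth / driven_teeth (torque scales by the inverse)."""
    return input_speed_dps * driving_teeth / driven_teeth

# Hypothetical 20-tooth driving gear on the servo and 40-tooth driven
# gear on the stand: the platform turns at half the servo speed.
platform_speed = output_speed(353.77, driving_teeth=20, driven_teeth=40)
```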
[0054] As shown in FIG. 3G, the top lid 114 for the LiDAR device is a protective cover designed to shield the LiDAR sensor and its components from environmental factors such as dust, moisture, and other contaminants. This top lid 114 plays a crucial role in maintaining the integrity and reliability of the sensor, while also being designed for easy access and maintenance. It is typically made from lightweight, durable materials.
[0055] As shown in FIG. 3H, the electronic case 116 is a housing unit that contains and protects the electronic components of the LiDAR device. It ensures safe and organized operation of the electronics, safeguarding sensitive components from environmental damage and interference. The electronic case is designed with cooling and access features for maintenance, contributing to the overall durability and functionality of the device.
[0056] The present disclosure provides enhanced depth perception by integrating LiDAR technology with 2D cameras, combining precise depth measurements from LiDAR with visual data to offer a comprehensive three-dimensional understanding of the environment, addressing the limitations of traditional 2D sensors and improving spatial perception. The present disclosure simplifies device setup and calibration by utilizing a single, compact LiDAR sensor, such as the TF Luna Mini-S, thereby reducing the complexity and calibration requirements associated with stereo vision devices that require multiple cameras, resulting in a device that is easier to deploy and maintain.
[0057] The present disclosure provides robust performance under varying environmental conditions by leveraging the combined strengths of LiDAR and cameras, wherein LiDAR excels in low-light conditions, while the camera offers visual context in environments where LiDAR may encounter challenges with reflective surfaces, ensuring reliable operation. The present disclosure offers a cost-effective and practical solution by utilizing the affordable TF Luna Mini-S LiDAR sensor in conjunction with widely accessible tools such as Python and OpenCV, making the device more economical compared to traditional, bulkier LiDAR devices, and promoting broader adoption across various industries.
[0058] The present disclosure ensures real-time processing by employing efficient algorithms using Python and OpenCV for data synchronization and processing, enabling the device to provide immediate feedback in applications such as autonomous navigation and real-time surveillance, where timely responses are critical. The present disclosure provides a versatile and adaptable solution for a wide range of applications, including robotics, automated driving, and surveillance, by addressing the limitations of individual sensors and offering a holistic device that integrates the strengths of LiDAR and cameras, making it suitable for various real-world scenarios.
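The data-synchronization step mentioned above can be sketched as nearest-timestamp pairing of LiDAR samples with camera frames. The function name, timestamps, and data layout below are illustrative assumptions (the actual device would obtain readings from the TF Luna serial stream and frames from an OpenCV capture loop); only the Python standard library is used here:

```python
import bisect

def pair_nearest(lidar_samples, frame_times):
    """Pair each (timestamp, distance) LiDAR sample with the nearest
    camera frame timestamp. frame_times must be sorted ascending."""
    pairs = []
    for t, dist in lidar_samples:
        i = bisect.bisect_left(frame_times, t)
        # Candidate frames: the one just before and the one just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
        best = min(candidates, key=lambda j: abs(frame_times[j] - t))
        pairs.append((frame_times[best], dist))
    return pairs

# Illustrative data: LiDAR at ~30 Hz, camera at ~30 fps, unaligned clocks.
lidar = [(0.012, 1.42), (0.045, 1.40), (0.081, 1.39)]
frames = [0.000, 0.033, 0.066, 0.100]
print(pair_nearest(lidar, frames))
```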
[0059] FIG. 4 illustrates the geometric calculation for rotation, in accordance with an embodiment of the present disclosure. The device converts spherical coordinates, comprising radius (r), azimuthal angle (ϕ), and elevation angle (θ), into Cartesian coordinates (x, y, z) for accurate plotting in a computational device, wherein the conversion is performed using the formulas: x = r sinθ cosϕ, y = r sinθ sinϕ, and z = r cosθ. The method further includes modifying the resultant Cartesian coordinates based on the differences between the spherical coordinate device and the computational coordinate device, such that specific values of x, y, and z are adjusted to align with the computer's axis orientation, where x represents the horizontal axis and y represents the vertical axis with the positive direction pointing downward.
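A minimal sketch of this conversion (the function names are illustrative; angles are taken in degrees, and the sign flip mirrors the text's note that the computer's vertical axis points downward):

```python
import math

def spherical_to_cartesian(r, theta_deg, phi_deg):
    """Convert (r, elevation theta, azimuth phi) to Cartesian (x, y, z)
    using x = r sin(theta) cos(phi), y = r sin(theta) sin(phi),
    z = r cos(theta)."""
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return x, y, z

def to_screen(x, y):
    # Image coordinates grow downward along the vertical axis, so the
    # sign of y is flipped before plotting.
    return x, -y

# theta = 90 deg, phi = 0 deg places the point on the positive x-axis.
x, y, z = spherical_to_cartesian(2.0, 90.0, 0.0)
```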
[0060] The device calculates the speed and time characteristics of the MG995 servo motor, wherein the motor operates at approximately 0.17 seconds per 60° of rotation when powered at 4.8 V without load. The method includes calculating the time required for a 1° rotation as approximately 0.00283 seconds (2.83 milliseconds). For a full 180° rotation, the method calculates the total time required as approximately 0.5094 seconds. Additionally, the rotational speed is determined to be approximately 353.77° per second during a 0° to 180° rotation at 4.8 V. This method further provides critical performance parameters, such as the time required for rotation over a range of angles and the rotational speed, offering insights into the motor's accuracy and effectiveness in performing precise actions under control. These parameters also facilitate a deeper understanding of the servo motor's behavior under different operating conditions, such as varying loads or voltage levels, making this method highly beneficial for optimizing servo motor usage in precision applications, including industrial automation and robotics.
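As a check on the arithmetic above, all of the figures follow from the single datasheet value of about 0.17 s per 60° at 4.8 V; note that the quoted 0.5094 s and 353.77°/s reflect rounding the per-degree time to 0.00283 s before further calculation, whereas computing without intermediate rounding gives 0.51 s and about 352.94°/s:

```python
# Derive the MG995 timing figures quoted above from the single
# datasheet value (~0.17 s per 60 degrees at 4.8 V, no load).
SEC_PER_60_DEG = 0.17

sec_per_deg = SEC_PER_60_DEG / 60        # ~0.00283 s per degree
time_180 = sec_per_deg * 180             # 0.51 s for a half turn
speed_deg_per_s = 60 / SEC_PER_60_DEG    # ~352.94 deg/s average

print(f"{sec_per_deg:.5f} s/deg, {time_180:.4f} s per 180 deg, "
      f"{speed_deg_per_s:.2f} deg/s")
```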
[0061] FIG. 5A and FIG. 5B illustrate Y-direction assembly with dimensions and X-direction assembly with dimensions of the device, in accordance with an embodiment of the present disclosure. The present disclosure provides a LiDAR device assembly, wherein FIG. 5A illustrates the Y-Direction assembly with specified dimensions, and FIG. 5B illustrates the X-Direction assembly with corresponding dimensions. The figures depict critical measurements, including the rotational radii, component dimensions, and positional relationships between various elements, ensuring precise alignment and functional integration within the LiDAR device for optimal performance in spatial scanning applications.
[0062] The LiDAR device 100 incorporates a design that offers exceptional rotational flexibility, enabling 360° rotation along the x-axis and 180° rotation along the y-axis, thus providing the device with three degrees of freedom. This configuration enhances spatial coverage and data acquisition capabilities, allowing comprehensive 3D scanning of the environment and capturing detailed spatial information from multiple angles. The device integrates a compact and lightweight framework optimized for portability and ease of deployment, with PLA+ filaments of 1.75 mm diameter used for constructing the device, chosen for their superior strength and durability. The LiDAR device components are precisely configured using Fusion 360 software to ensure accuracy in meeting project specifications.
[0063] The structural elements are configured for enhanced stability and durability, ensuring reliable performance in dynamic environments. The device incorporates modular components and user-friendly interfaces to facilitate quick setup and adjustments, increasing versatility across diverse applications. Additionally, the device is capable of integrating LiDAR point clouds with visual data from other imaging devices, aligning depth information with image planes to provide a unified view and enhance spatial understanding.
[0064] The proposed LiDAR device's operational approach is particularly suited to industrial applications such as robotics, where it supports autonomous navigation and obstacle detection by generating 3D maps, and automated driving, where it enhances advanced driver-assistance devices (ADAS) by integrating depth and visual data for improved vehicle operation. In surveillance applications, the device combines depth and visual information to enhance situational awareness, enabling accurate tracking of dynamic scenes. In industrial automation, the device facilitates precise measurements and quality control using 3D data for product consistency and process accuracy. Furthermore, the device's ability to scan and analyze complex structures, machinery, and dynamic environments makes it an invaluable tool for industries that rely on detailed environmental data and precise spatial information for operational efficiency and decision-making.
[0065] Thus, the present invention overcomes the drawbacks, shortcomings, and limitations associated with existing solutions, and provides a device that enhances depth perception by integrating LiDAR technology with 2D cameras, thereby offering a comprehensive three-dimensional understanding of the environment and addressing the limitations inherent in traditional two-dimensional sensors. Additionally, the present disclosure simplifies device setup and calibration through the use of a single, compact LiDAR sensor, reducing the complexity associated with stereo vision devices that typically require multiple cameras. Furthermore, the present disclosure ensures robust performance by leveraging the combined strengths of LiDAR technology and cameras, enabling reliable operation under varying lighting conditions and when encountering reflective surfaces. Moreover, the present disclosure offers a cost-effective solution by utilizing the affordable TF Luna Mini-S LiDAR sensor in conjunction with widely accessible tools such as Python and OpenCV, thereby making the device more economical in comparison to traditional LiDAR devices. The present disclosure also ensures real-time processing by employing efficient algorithms for data synchronization and processing, allowing for immediate feedback in critical applications such as autonomous navigation and real-time surveillance. Finally, the present disclosure provides a versatile device suitable for a wide range of applications, including robotics, automated driving, and surveillance, by integrating LiDAR technology and cameras to offer a holistic solution for enhanced environmental understanding.
[0066] It will be apparent to those skilled in the art that the device 100 of the disclosure may be provided using some or all of the mentioned features and components without departing from the scope of the present disclosure. While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.

ADVANTAGES OF THE PRESENT INVENTION
[0067] The present disclosure provides a device that enhances depth perception by integrating LiDAR technology with 2D cameras, offering a comprehensive three-dimensional understanding of the environment and addressing the limitations of traditional two-dimensional sensors.
[0068] The present disclosure provides a device that simplifies device setup and calibration by utilizing a single, compact LiDAR sensor, thereby reducing the complexity associated with stereo vision devices that require multiple cameras.
[0069] The present disclosure provides a device that ensures robust performance by combining the strengths of LiDAR technology and cameras, enabling reliable operation in various lighting conditions and with reflective surfaces.
[0070] The present disclosure provides a device that offers a cost-effective solution by employing the affordable TF Luna Mini-S LiDAR sensor and widely accessible tools such as Python and OpenCV, making the device more economical compared to traditional LiDAR devices.
[0071] The present disclosure provides a device that ensures real-time processing by utilizing efficient algorithms for data synchronization and processing, allowing immediate feedback in applications such as autonomous navigation and real-time surveillance.
[0072] The present disclosure provides a device that offers versatility for a wide range of applications, including robotics, automated driving, and surveillance, by integrating LiDAR technology and cameras to deliver a holistic solution for enhanced environmental understanding.

Claims:
1. An inspection device (100) comprising:
a Y-direction stand (102-1) configured to stabilize a first sensor (104-1) during movement along a Y-axis;
an X-direction stand (102-2) configured to support movement of a second sensor (104-2) along an X-axis, wherein corresponding sensors emit laser pulses and measure time of flight of reflections to determine distances within a 2D plane;
a first servo motor (110-1) integrated with the Y-direction stand for driving vertical movement of the first sensor (104-1), enabling 180-degree rotation;
a second servo motor (110-2) integrated with the X-direction stand for driving horizontal movement of the second sensor, enabling 360-degree rotation;
a spur gear mechanism (112) for transmitting rotational motion from the corresponding servo motors to the respective stands; and
an extender (108) connecting the first and second sensors to a moving platform, allowing increased reach and flexibility for scanning operations, wherein the combination of 360-degree rotation along the x-axis and 180-degree rotation along the y-axis allows for three-dimensional data acquisition and enhanced spatial coverage.
2. The inspection device as claimed in claim 1, wherein the first sensor (104-1) and the second sensor (104-2) are light detection and ranging (LiDAR).
3. The inspection device as claimed in claim 1, wherein the first servo motor (110-1) for the Y-direction is a high-precision motor capable of operating at an average speed of approximately 353.77 degrees per second.
4. The inspection device as claimed in claim 1, wherein the second servo motor (110-2) for the X-direction is configured to enable rapid adjustments, obtaining a full 360-degree rotation in approximately 1.0188 seconds when powered at 4.8V.
5. The inspection device as claimed in claim 1, wherein the spur gear mechanism (112) is configured to minimize backlash and optimize torque transfer for motion transfer.
6. The inspection device as claimed in claim 1, wherein a protective top lid (114) is configured to shield the corresponding sensors from environmental factors, ensuring reliable operation.
7. The inspection device as claimed in claim 1, wherein the Y-direction stand (102-1) and the X-direction stand (102-2) provide comprehensive spatial data capture capabilities.
8. The inspection device as claimed in claim 1, wherein an electronic case (116) is configured to protect electronic components of the corresponding sensors from environmental damage.
9. The inspection device as claimed in claim 1, wherein the spur gear mechanism (112) comprises a driving gear that is operatively connected to the Y-direction stand, enabling 180° rotation about the x-axis in a y-plane.
10. The inspection device as claimed in claim 1, wherein the spur gear mechanism (112) comprises a driven gear that is operatively connected to the X-direction stand, enabling 360° rotation about the y-axis in an x-plane.

Documents

Name | Date
202441083419-Proof of Right [09-11-2024(online)].pdf | 09/11/2024
202441083419-FORM-8 [08-11-2024(online)].pdf | 08/11/2024
202441083419-COMPLETE SPECIFICATION [30-10-2024(online)].pdf | 30/10/2024
202441083419-DECLARATION OF INVENTORSHIP (FORM 5) [30-10-2024(online)].pdf | 30/10/2024
202441083419-DRAWINGS [30-10-2024(online)].pdf | 30/10/2024
202441083419-EDUCATIONAL INSTITUTION(S) [30-10-2024(online)].pdf | 30/10/2024
202441083419-EVIDENCE FOR REGISTRATION UNDER SSI [30-10-2024(online)].pdf | 30/10/2024
202441083419-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-10-2024(online)].pdf | 30/10/2024
202441083419-FORM 1 [30-10-2024(online)].pdf | 30/10/2024
202441083419-FORM 18 [30-10-2024(online)].pdf | 30/10/2024
202441083419-FORM FOR SMALL ENTITY(FORM-28) [30-10-2024(online)].pdf | 30/10/2024
202441083419-FORM-9 [30-10-2024(online)].pdf | 30/10/2024
202441083419-POWER OF AUTHORITY [30-10-2024(online)].pdf | 30/10/2024
202441083419-REQUEST FOR EARLY PUBLICATION(FORM-9) [30-10-2024(online)].pdf | 30/10/2024
202441083419-REQUEST FOR EXAMINATION (FORM-18) [30-10-2024(online)].pdf | 30/10/2024
