System and method for user gait analysis

Application Type: Ordinary Application
Status: Published
Filed on 5 November 2024

Abstract

The present invention discloses a system and method for user gait analysis. The system (100) comprises a footwear-sensor assembly configured to house a sensor insole (104), the sensor insole (104) comprising at least one sensor assembly to detect foot pressure data and inertial data; an IoT enabled hardware unit (106) to collect the detected data; a data visualization and control unit (108) to receive the collected data from the IoT enabled hardware unit (106) and to process and visualize pressure distribution across various regions of the foot; a gait cloud server (112) to store, process, and analyze the foot pressure data and the inertial data received from the IoT enabled hardware unit (106); a video capturing unit (110) to capture visual data of the user gait; and a gait video server (116) to synchronize the recorded visual data with the foot pressure data and the inertial data.

Patent Information

Application ID: 202441084729
Invention Field: BIO-MEDICAL ENGINEERING
Date of Application: 05/11/2024
Publication Number: 47/2024

Inventors

Name | Address | Country | Nationality
Babji Srinivasan | Department of Applied Mechanics and Biomedical Engineering, Indian Institute of Technology Madras, Chennai, India 600036 | India | India
Rajagoplan Srinivasan | Department of Chemical Engineering, Indian Institute of Technology Madras, Chennai, India 600036 | India | India
Syamkumar K S | Department of Applied Mechanics and Biomedical Engineering, Indian Institute of Technology Madras, Chennai, India 600036 | India | India
A S Karthikeyan | Department of Applied Mechanics and Biomedical Engineering, Indian Institute of Technology Madras, Chennai, India 600036 | India | India
Saravanan M | Department of Applied Mechanics and Biomedical Engineering, Indian Institute of Technology Madras, Chennai, India 600036 | India | India

Applicants

Name | Address | Country | Nationality
Indian Institute of Technology Madras (IIT Madras) | The Dean, Industrial Consultancy & Sponsored Research (IC&SR), Indian Institute of Technology Madras, Sardar Patel Road, IIT Post, Chennai, Tamil Nadu, India 600036 | India | India

Specification

Description:

FIELD OF INVENTION
[001] The field of invention generally relates to motion analysis. More specifically, it relates to a system and method for user gait analysis.

BACKGROUND
[002] User locomotion or gait analysis is an important field in biomechanics, rehabilitation, and sports science, as it provides valuable insights into the lower limb movement patterns of individuals. Further, an accurate assessment of gait dynamics is essential for diagnosing various medical conditions, tracking rehabilitation progress, and optimizing athletic performance.
[003] The importance of precise, real-time and offline gait analysis cannot be overstated, especially in environments like rehabilitation centers, sports facilities, and clinical settings, where the data can inform crucial decisions regarding treatment, recovery plans, and performance improvements.
[004] Existing systems for gait analysis fail to provide comprehensive, real-time and offline information that combines inertial measurement units (IMUs), force-sensitive resistors (FSRs), and video synchronization for detailed, synchronized motion tracking. They often lack accuracy in dynamic environments and are limited to static or controlled settings, leading to less reliable data.
[005] These limitations hinder effective rehabilitation programs and performance enhancements, especially in real-world conditions, where the dynamic nature of user movement needs precise tracking and analysis.
[006] Other existing systems that tried to address this problem are limited to either specific sensor types or offline data processing. For example, some systems use only the IMUs for motion tracking, while others rely solely on pressure sensors, failing to integrate multiple data sources. Additionally, video synchronization with sensor data for real-time feedback is often missing, reducing the accuracy of the gait analysis in dynamic environments.
[007] Thus, in light of the above discussion, it is implied that there is a need for a system and method for real-time and offline, dynamic gait analysis of users using a combination of IMUs, FSR sensors, and video synchronization.  
OBJECT OF INVENTION
[008] The principal object of this invention is to provide a system and method for user gait analysis.
[009] Another object of this invention is to provide a system and method to perform real-time and offline gait analysis using multiple sensors for user motion tracking.
[0010] A further object of the invention is to provide a system and method for analyzing gait in dynamic environments using inertial and pressure sensors.
[0011] Another object of the invention is to provide a system and method to monitor lower limb biomechanics during movement.
[0012] Another object of the invention is to provide a system and method to capture and analyze micro-actions during gait.
[0013] Another object of the invention is to provide a system and method for accurate tracking of gait parameters in real-time and offline.
[0014] Another object of the invention is to provide a system and method to use sensor fusion to improve gait analysis accuracy.
[0015] Another object of the invention is to provide a system and method to measure and evaluate gait symmetry and balance.
[0016] Another object of the invention is to provide a system and method to assist in rehabilitation through dynamic range of motion monitoring.
[0017] Another object of the invention is to provide a system and method for tracking foot pressure and motion for podiatry applications.
[0018] Another object of the invention is to provide a system and method to detect irregular gait patterns in sports and physical activity.
[0019] Another object of the invention is to provide a system and method for gait analysis integrated with video synchronization for enhanced review.
[0020] Another object of the invention is to provide a system and method to compare gait metrics across multiple sessions for long-term monitoring.
[0021] Another object of the invention is to provide a system and method for offline analysis of recorded gait data.
[0022] Another object of the invention is to provide a system and method for wireless data transmission of gait analysis metrics.
[0023] Another object of the invention is to provide a system and method to integrate gait analysis with existing biomechanical assessment tools.
[0024] Another object of the invention is to provide a system and method to store and retrieve gait data in a secure and organized manner.
[0025] Another object of the invention is to provide a system and method to allow real-time feedback to the user based on gait metrics.
[0026] Another object of the invention is to provide a system and method to utilize cloud-based services for processing and storing large volumes of gait data.

BRIEF DESCRIPTION OF FIGURES
[0027] This invention is illustrated in the accompanying drawings, throughout which, like reference letters indicate corresponding parts in the various figures.
[0028] The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0029] Figure 1 depicts an environmental diagram for user gait analysis, in accordance with an embodiment of the present disclosure;
[0030] Figure 2 illustrates a system for user gait analysis, in accordance with an embodiment of the present disclosure;
[0031] Figure 3 depicts the sub-components of an IoT layer Module, in accordance with an embodiment of the present disclosure;
[0032] Figure 4 depicts the sub-components of a data visualization and control unit, in accordance with an embodiment of the present disclosure;
[0033] Figure 5 illustrates the sub-components of a gait cloud server, in accordance with an embodiment of the present disclosure;
[0034] Figure 6 depicts the sub-components of a user environment unit, in accordance with an embodiment of the present disclosure;
[0035] Figure 7 illustrates the sub-components of a gait video server, in accordance with an embodiment of the present disclosure;
[0036] Figure 8 depicts data analytics of mean pressure and peak pressure, in accordance with an embodiment of the present disclosure;
[0037] Figure 9 depicts the data analytics of step count, stride length and stride velocity, in accordance with an embodiment of the present disclosure;
[0038] Figure 10 depicts an exemplary embodiment of system's data analytics showing metrics, in accordance with an embodiment of the present disclosure;
[0039] Figure 11 illustrates a method for user gait analysis, in accordance with an embodiment of the present disclosure;
[0040] Figure 12 depicts method of acquiring data for gait analysis, in accordance with an embodiment of the present disclosure;
[0041] Figure 13 illustrates a method for estimating at least one inertial measurement sensor (IMU) parameter during gait analysis, in accordance with an embodiment of the present disclosure;
[0042] Figure 14 illustrates a method for viewing dynamic range of motion analysis, in accordance with an embodiment of the present disclosure; and
[0043] Figure 15 illustrates a method for estimating at least one pressure sensor parameter during gait analysis, in accordance with an embodiment of the present disclosure.



STATEMENT OF INVENTION
[0044] The present invention discloses a system and method for user gait analysis. The system comprises a footwear-sensor assembly worn by a user configured to house a sensor insole.
[0045] Further, the system comprises the sensor insole comprising at least one sensor assembly to detect foot pressure data and inertial data.
[0046] Furthermore, the system comprises an IoT enabled hardware unit configured to collect the detected foot pressure data and the inertial data from the sensor insole.
[0047] Subsequently, the system comprises a data visualization and control unit configured to receive the collected foot pressure data and the inertial data from the IoT enabled hardware unit, process and visualize pressure distribution across various regions of the foot and provide real-time feedback to the user for gait analysis.
[0048] Additionally, the system comprises a gait cloud server configured to store, process, and analyze the foot pressure data and the inertial data received from the IoT enabled hardware unit.
[0049] Furthermore, the system comprises a user environment unit comprising a video capturing unit configured to capture visual data of the user gait.
[0050] Thereafter, the system comprises a gait video server configured to synchronize and store the recorded visual data received from the gait cloud server with the foot pressure data and the inertial data, enabling identification and tracking of micro-actions related to specific phases of the user gait.

DETAILED DESCRIPTION
[0051] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and/or detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0052] The present invention discloses a system and method for user gait analysis that tracks and analyzes lower limb biomechanics in real-time or offline. The system comprises a microcontroller unit (MCU), inertial measurement units (IMUs), force-sensitive resistors (FSRs), and a rechargeable battery to capture detailed movement data. The system integrates video synchronization and novel algorithms for precise gait assessment, making it highly applicable in fields like rehabilitation, sports science, podiatry, and clinical diagnosis. The present invention offers improved accuracy in analyzing dynamic user motion for various health and performance applications.
[0053] Figure 1 depicts an environmental diagram for user gait analysis.
[0054] The environmental diagram comprises at least one of a footwear-sensor assembly 102, a sensor insole 104, an Internet of things (IoT) enabled hardware unit 106, data visualization and control unit 108 and a video capturing unit 110.
[0055] In an embodiment, the at least one footwear-sensor assembly 102 is configured as a complete footwear. The footwear-sensor assembly 102 houses the sensor insole 104. The sensor insole 104 comprises at least one sensor assembly.
[0056] In an embodiment, the at least one footwear-sensor assembly 102 may comprise a combination of the sensor assembly and the IoT enabled hardware unit with the rechargeable battery, all embedded in the footwear itself. In another embodiment, the at least one footwear-sensor assembly 102 may comprise the sensor assembly embedded in the footwear, with the IoT enabled hardware unit and the rechargeable battery connected separately. Furthermore, in another embodiment, the at least one footwear-sensor assembly 102 may comprise an insole that embeds both the sensor assembly and the IoT enabled hardware unit with the rechargeable battery, the insole being placed/housed within the footwear.
[0057] In another embodiment, the at least one footwear-sensor assembly 102 comprises an insole that embeds the sensor assembly and is placed/housed within the footwear, while the IoT enabled hardware unit with the rechargeable battery is attached separately.
[0058] In an embodiment, the footwear may be at least one of a shoe, a boot, a sandal, a slipper, and a sports shoe.
[0059] In an embodiment, the sports shoe may be at least one of a boxing shoe, a golf shoe, and a cricket shoe.
[0060] The sensor insole 104 comprises the at least one sensor assembly to detect foot pressure data and inertial data.
[0061] In an embodiment, the at least one footwear-sensor assembly 102 allows for the continuous monitoring of foot movements and pressure during various activities, ensuring comfort and stability for the user while housing the sensors.
[0062] In an embodiment, the sensor insole 104 comprises at least one of: at least one pressure sensor, at least one movement analysis sensor, at least one temperature sensor, at least one humidity sensor, at least one foot elevation variation sensor, and at least one feedback actuator. In an embodiment, the at least one movement analysis sensor comprises at least one inertial measurement sensor (IMU).
[0063] In an embodiment, the at least one feedback actuator comprises at least one of a haptic and vibration actuator.
[0064] In an embodiment, the sensor insole 104 may also comprise at least one additional sensor, such as a global positioning system (GPS) sensor.
[0065] In an embodiment, the at least one pressure sensor may be a Force Sensitive Resistor (FSR). In an embodiment, the sensor insole 104 comprises a network of the force-sensitive resistors (FSRs) embedded within the sole of the shoe, designed to detect pressure distribution across different regions of the foot.
[0066] In an embodiment, the at least one pressure sensor is configured to detect foot pressure data.
[0067] In an embodiment, the at least one inertial measurement sensor is configured to detect inertial data.
[0068] In an embodiment, the sensor insole 104 is configured for capturing real-time and offline foot pressure data during gait. The sensor insole 104 collects critical information about how different regions of the foot interact with the ground, crucial for analyzing gait patterns and movement dynamics.
[0069] In an embodiment, the Internet of things (IoT) enabled hardware unit 106 comprises a microcontroller unit (MCU) 204, a rechargeable battery 214, a charging port such as a USB Type-C port, and a micro-SD card 216.
[0070] In an embodiment, the IoT enabled hardware unit 106 is configured to collect the detected foot pressure data and the inertial data from the sensor insole 104.
[0071] In an embodiment, the IoT enabled hardware unit 106 is configured for collecting, processing, and transmitting the foot pressure data and the inertial data to the data visualization and control unit 108.
[0072] In an embodiment, the IoT enabled hardware unit 106 is powered by the rechargeable battery 214 and stores data permanently using the micro-SD card 216.
[0073] In an embodiment, the data visualization and control unit 108 comprises a mobile device, a laptop, or a desktop computer.
[0074] In an embodiment, the data visualization and control unit 108 is configured to receive the collected foot pressure data and the inertial data from the IoT enabled hardware unit 106, process and visualize pressure distribution across various regions of the foot, and provide real-time feedback to the user for gait analysis.
[0075] In an embodiment, the data visualization and control unit 108 comprises a cross-platform front-end application for real-time interaction with external entities.
[0076] In an embodiment, the video capturing unit 110 comprises at least one camera.
[0077] In an embodiment, the video capturing unit 110 comprises a high-definition camera integrated with the system 100 for recording dynamic motion sequences during the user gait.
[0078] In an embodiment, the video capturing unit 110 is configured to capture visual data of the user gait.
[0079] In an embodiment, the visual data may be a video or an image.
[0080] Figure 2 illustrates a system 100 for user gait analysis.
[0081] In an embodiment, the system 100 comprises the sensor insole 104, the IoT enabled hardware unit 106, the data visualization and control unit 108, a gait cloud server 112, a user environment unit 114 and a gait video server 116.
[0082] In an embodiment, the sensor insole 104 is customizable in size and configuration, and the number of the at least one pressure sensor can be adjusted based on the user and application. The sensor insole 104 with a grid pattern of the at least one pressure sensor is strategically positioned to maximize accurate plantar pressure measurements and estimate Centre of Pressure (COP) and its trajectory, based on the size of the sensor insole 104.
[0083] In an embodiment, the sensor insole 104 is designed as a versatile, customizable feature compatible with various shoe form factors and sizes, ranging from standardized sizes (e.g., 3 to 11). This adaptability allows it to be used in multiple shoe instances, such as regular shoes, sports shoes, and custom-made footwear. For regular shoes, the sensor insole 104 is inserted, and the IoT enabled hardware unit 106 is attached to the lower part of the leg. Similarly, in sports shoes, the insole is integrated, and the IoT enabled hardware unit 106 is mounted on the leg, providing flexibility across sports like boxing, golf, and cricket. In custom shoes, the sensor insole 104 is embedded, and the IoT enabled hardware unit 106 is attached directly to the outer part of the shoe, eliminating the need for leg mounting. Additionally, a custom-made slipper model integrates both the sensor insole 104 and the IoT enabled hardware unit 106 with a built-in battery, making it ideal for rehabilitation and medical applications like diabetic foot ulcer detection. The number of force-sensitive resistors (FSRs) in the insole can also be customized based on the application, with options ranging from 4 to 17 sensors.
[0084] In an embodiment, the at least one pressure sensor is configured to detect foot pressure data. The foot pressure data comprises at least one of a mean pressure, peak pressure, toe and heel pressure, and location of center of pressure (COP).
[0085] In an embodiment, the mean pressure may be an average force exerted by the foot on the ground across all the sensors during a given gait cycle. The mean pressure is calculated by taking the average of all the sensor values over time. This metric helps identify the overall distribution of pressure on the foot, which can provide insights into gait efficiency, foot mechanics, and load distribution.
[0086] In an embodiment, the peak pressure refers to the maximum pressure recorded at any sensor during the gait cycle. The peak pressure highlights points of highest impact or force exertion on specific areas of the foot. High peak pressure in certain regions may indicate areas of potential injury, abnormal gait patterns, or uneven weight distribution.
[0087] In an embodiment, the toe pressure and heel pressure refer to the forces exerted specifically by the toes and heel, respectively, as measured by the sensors placed in those regions of the sensor insole 104. The toe pressure and heel pressure are critical for understanding the foot's rolling motion during walking or running, the timing of toe-off, and heel-strike events. Variations in toe and heel pressure can indicate issues with gait, such as overpronation or supination.
[0088] In an embodiment, the center of pressure (COP) represents the point on the foot where the overall pressure is concentrated during movement. The COP is calculated using the weighted average of all the at least one pressure sensor's values, providing a spatial location on the insole. The COP trajectory is a dynamic measurement, showing how the pressure moves across the foot throughout a gait cycle. This data is crucial for analyzing balance, stability, and weight-shifting behavior during gait.
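The pressure metrics of paragraphs [0085] to [0088] can be sketched in code. This is an illustrative sketch only: the sensor readings, grid positions, and units below are hypothetical, and the disclosure does not fix a particular sensor layout.

```python
# Illustrative sketch of mean pressure, peak pressure, and centre of
# pressure (COP) from one sample of FSR readings. Sensor layout and
# values are hypothetical assumptions, not taken from the disclosure.

def pressure_metrics(readings, positions):
    """readings: FSR values for one sample; positions: (x, y) of each sensor."""
    total = sum(readings)
    mean_pressure = total / len(readings)   # average over all sensors ([0085])
    peak_pressure = max(readings)           # maximum at any single sensor ([0086])
    # COP: weighted average of sensor positions, weighted by reading ([0088])
    cop_x = sum(r * p[0] for r, p in zip(readings, positions)) / total
    cop_y = sum(r * p[1] for r, p in zip(readings, positions)) / total
    return mean_pressure, peak_pressure, (cop_x, cop_y)

# Example: four sensors at the corners of a hypothetical 10 x 25 cm grid
readings = [4.0, 2.0, 1.0, 1.0]
positions = [(0, 0), (10, 0), (0, 25), (10, 25)]
mean_p, peak_p, cop = pressure_metrics(readings, positions)
```

Tracking the COP over successive samples yields the COP trajectory described above.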
[0089] In an embodiment, the gait cloud server 112 comprises at least one of a gait database 402, a backend services module 404 and a backend logic module 406.
[0090] In an embodiment, the gait cloud server 112 is configured to store, process, and analyze the foot pressure data and the inertial data received from the IoT enabled hardware unit 106.
[0091] Furthermore, the at least one inertial measurement sensor (IMU) is configured to sense the inertial data. The inertial data comprises instantaneous position, rotation angle, stride length, lower limb rotation, movement speed, and stride velocity.
[0092] In an embodiment, the instantaneous position refers to the real-time spatial location of the foot or lower limb during motion. The instantaneous position data is continuously tracked to understand the movement trajectory of the foot at any given moment in time.
[0093] In an embodiment, the rotation angle refers to the angular movement or orientation of the foot or lower limb around a specific axis, measured in degrees. The rotation angle captures how the foot or leg rotates during activities like walking, running, or other dynamic motions.
[0094] In an embodiment, the stride length refers to the distance between successive placements of the same foot (e.g., the distance from the right foot touching the ground to the next time the right foot touches the ground). The stride length is a key indicator of walking or running efficiency.
[0095] In an embodiment, the lower limb rotation refers to the rotational motion of the entire lower limb (hip, knee, ankle) as it moves during a step or stride. The lower limb rotation helps in understanding how the leg is twisting or turning during movement.
[0096] In an embodiment, the movement speed refers to the overall speed at which the user is moving. The movement speed takes into account the total distance covered over time, providing insight into how fast a person walks or runs.
[0097] In an embodiment, the stride velocity refers to the speed at which each individual stride is completed. The stride velocity reflects how quickly a person is moving from one footfall to the next, offering a more granular look at walking or running pace.
[0098] In an embodiment, the IMU sensor is a 6-axis IMU.
[0099] The IMU sensor comprises a high-performance accelerometer and a gyroscope.
[00100] The high-performance accelerometer is configured to capture linear acceleration along three coordinate axes.
[00101] The gyroscope is configured to measure angular velocity around three coordinate axes.
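The stride length and stride velocity definitions of paragraphs [0094] and [0097] can be sketched from successive same-foot ground contacts. The footfall positions and timestamps below are hypothetical example values.

```python
# Hedged sketch: stride length is the distance between successive ground
# contacts of the same foot; stride velocity is that length over the
# stride time. The sample footfalls are illustrative assumptions.
import math

def stride_metrics(footfalls):
    """footfalls: list of (t, x, y) for successive contacts of one foot."""
    metrics = []
    for (t0, x0, y0), (t1, x1, y1) in zip(footfalls, footfalls[1:]):
        length = math.hypot(x1 - x0, y1 - y0)  # same-foot contact-to-contact distance
        velocity = length / (t1 - t0)          # stride length over stride time
        metrics.append((length, velocity))
    return metrics

# Right-foot contacts at t = 0 s and t = 1.0 s, 1.5 m apart
metrics = stride_metrics([(0.0, 0.0, 0.0), (1.0, 1.5, 0.0)])
```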

[00102] In an embodiment, the user environment unit 114 comprises the video capturing unit 110 and the shoe.
[00103] In an embodiment, the user environment unit 114 is configured to capture and transmit data from the user during gait activities, enabling real-time or offline analysis through synchronization with the IoT enabled hardware unit 106 and the video capturing unit 110.
[00104] In an embodiment, the gait video server 116 comprises captured correlated video module 408.
[00105] In an embodiment, the gait video server 116 is configured to synchronize and store the recorded visual data received from the gait cloud server 112 with the foot pressure data and the inertial data, enabling identification and tracking of micro-actions related to specific phases of the user gait.
[00106] In an embodiment, the gait video server 116 is configured to store, organize, and synchronize the recorded visual data with corresponding sensor data, enabling frame-by-frame analysis of lower-limb micro-actions in real-time or post-processed analysis sessions.
[00107] In an embodiment, the micro-actions related to specific phases of the user gait comprise swing and stance phases in normal walking, weight transfer, balancing, foot movement, rotation, and timing.
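The frame-by-frame synchronization described in paragraphs [00105] and [00106] amounts to pairing each video frame timestamp with the nearest sensor sample in time. The frame rate, sample rate, and timestamps below are illustrative assumptions.

```python
# Illustrative sketch: pair each video frame with the nearest-in-time
# sensor sample so micro-actions can be reviewed frame by frame.
# Rates and timestamps are hypothetical, not from the disclosure.
import bisect

def sync_frames(frame_times, sample_times):
    """Return, for each frame time, the index of the nearest sensor sample."""
    indices = []
    for t in frame_times:
        i = bisect.bisect_left(sample_times, t)
        if i == 0:
            indices.append(0)
        elif i == len(sample_times) or t - sample_times[i - 1] <= sample_times[i] - t:
            indices.append(i - 1)   # the earlier neighbour is at least as close
        else:
            indices.append(i)
    return indices

# 30 fps video frames matched against 100 Hz sensor samples
frames = [0.0, 1 / 30, 2 / 30]
samples = [k / 100 for k in range(10)]
matches = sync_frames(frames, samples)
```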
[00108] Figure 3 depicts the sub-components of an IoT layer Module 202.
[00109] In an embodiment, the IoT layer Module 202 comprises the IoT enabled hardware unit 106 and the sensor insole 104.
[00110] In an embodiment, the IoT layer Module 202 is configured to interface with the hardware components for communication and data transmission to the cloud over a wireless network.
[00111] In an embodiment, the IoT enabled hardware unit 106 comprises the microcontroller unit (MCU) 204, the rechargeable battery 214 and the micro-SD card 216.
[00112] In an embodiment, the IoT enabled hardware unit 106 is configured to collect sensor data from the insole, store it locally on the micro-SD card 216, and transmit the data wirelessly using the Bluetooth low energy (BLE) stack for further processing.
[00113] In an embodiment, the microcontroller unit 204 comprises a Bluetooth low energy stack unit 206, an on-board inertial measurement unit 208, a battery charge controller unit 210, and an analog digital converter 212.
[00114] In an embodiment, the microcontroller unit 204 is configured to process the foot pressure data and the inertial data from the sensor insole 104, perform initial signal conditioning, and transmit the processed data wirelessly to the data visualization and control unit 108.
[00115] In an embodiment, the microcontroller unit (MCU) 204 is configured to enhance the accuracy of position, distance, velocity, and angle estimations at the micro-actions level through implementation of sensor fusion techniques, comprising at least one of complementary filtering and an attitude and heading reference system (AHRS).
[00116] In an embodiment, complementary filtering is used to fuse data from multiple sensors to estimate a more accurate output by combining the strengths of each sensor while minimizing their weaknesses.
[00117] The complementary filtering may be used to combine data from the accelerometer and gyroscope, to provide accurate and stable real-time or offline information on the foot or leg's position and orientation.
[00118] Further, the attitude and heading reference system (AHRS) may be configured to determine the lower limb's complete motion dynamics such as tilt, rotation, and orientation during each step or stride using the sensor assembly data.
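The complementary filtering of paragraphs [00116] and [00117] can be sketched as a weighted blend of the gyroscope-integrated angle (accurate short-term, but drifting) and the accelerometer tilt angle (noisy, but drift-free). The weight alpha and the sample data are assumptions for illustration, not values from the disclosure.

```python
# Minimal complementary-filter sketch, assuming a single tilt axis.
# alpha and the sensor stream are hypothetical illustration values.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: integrate the gyro, then nudge toward the accel tilt."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Fuse a short stream of (gyro deg/s, accel-derived deg) pairs at 100 Hz
angle = 0.0
for gyro, accel in [(10.0, 0.2), (10.0, 0.3), (10.0, 0.5)]:
    angle = complementary_filter(angle, gyro, accel, dt=0.01)
```

A higher alpha trusts the gyroscope more between accelerometer corrections; an AHRS extends this idea to full 3D orientation.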
[00119] In an embodiment, the rechargeable battery 214 is configured to supply consistent power to the IoT layer Module 202.
[00120] In an embodiment, the micro-SD card 216 is configured to permanently store both the foot pressure data and the inertial data for later post-processing, redundancy, backup, and offline data retrieval.
[00121] In an embodiment, the Bluetooth low energy stack unit 206 comprises a wireless communication interface supporting BLE protocols for low-power data transmission.
[00122] In an embodiment, the Bluetooth low energy stack unit 206 is configured to establish a low-power wireless connection with nearby devices, such as smartphones or control units, for data transfer.
[00123] In an embodiment, the Bluetooth low energy (BLE) stack unit 206 is configured to facilitate wireless communication between the IoT layer Module 202 and the data visualization and control unit 108.
[00124] In an embodiment, the on-board inertial measurement unit 208 is configured to monitor and record the user's movement dynamics.
[00125] The on-board inertial measurement unit 208 is configured to capture the acceleration, orientation and angular velocity of the foot.
[00126] In an embodiment, the battery charge controller unit 210 comprises a power management circuit that controls the charging and discharging cycles of the rechargeable battery 214.
[00127] In an embodiment, the battery charge controller unit 210 is configured to ensure safe and efficient power management, including overcharge protection and energy optimization during device operation.
[00128] In an embodiment, the battery charge controller unit 210 is configured to manage charging of the rechargeable battery 214.
[00129] In an embodiment, the analog digital converter 212 comprises circuitry to convert analog signals from the at least one pressure sensor into digital data for processing.
[00130] In an embodiment, the analog digital converter 212 is configured to accurately digitize sensor data such as force-sensitive resistor readings for processing and analysis by the microcontroller.
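The digitization step in paragraphs [00129] and [00130] can be sketched as mapping a raw ADC count back to a voltage, and then, assuming the FSR sits in a simple voltage divider, to a resistance estimate. The reference voltage, bit depth, and fixed divider resistor below are illustrative assumptions; the disclosure does not specify the conversion circuit.

```python
# Hedged sketch of ADC-to-force-sensor conversion. The 12-bit depth,
# 3.3 V reference, and 10 kΩ divider resistor are assumed values.

def adc_to_voltage(raw, bits=12, v_ref=3.3):
    """Map a raw ADC count onto the sensed voltage."""
    return raw * v_ref / ((1 << bits) - 1)

def divider_resistance(v_out, v_ref=3.3, r_fixed=10_000):
    """Estimate FSR resistance from the voltage-divider output."""
    return r_fixed * (v_ref - v_out) / v_out

v = adc_to_voltage(2048)        # a mid-scale reading
r_fsr = divider_resistance(v)   # roughly matches r_fixed at mid-scale
```

An FSR's resistance falls as applied force rises, so a calibration curve would convert `r_fsr` into a pressure value for the metrics above.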
[00131] Figure 4 depicts the sub-components of a data visualization and control unit 108.
[00132] In an embodiment, the data visualization and control unit 108 comprises a Bluetooth low energy stack module 302 and a front end application module 304.
[00133] In an embodiment, the Bluetooth low energy stack module 302 is configured to facilitate wireless communication between the IoT-enabled hardware unit and the data visualization and control unit 108, enabling the transmission of the foot pressure data and the inertial data for real-time and offline analysis.
[00134] In an embodiment, the front end application module 304 is configured to provide an interface for visualizing data, managing hardware devices, and interacting with real-time or recorded gait analysis results.
[00135] In an embodiment, the front end application module 304 comprises a cache module 306, an application logic and services module 308, and a user interface module 310.
[00136] In an embodiment, the cache module 306 is configured to temporarily store data for quick access and smooth user interaction during real-time analysis or playback of gait metrics.
[00137] In an embodiment, the application logic and services module 308 is configured to process data from the sensor insole 104 and the IoT enabled hardware unit 106, performing functions such as filtering, data calculation, and managing communication between the control unit and the gait cloud server 112.
[00138] In an embodiment, the user interface module 310 is configured to display visualized foot pressure distribution, inertial data, and real-time feedback, while providing controls for device management and session analysis.
[00139] Figure 5 illustrates the sub-components of a gait cloud server 112.
[00140] In an embodiment, the gait cloud server 112 comprises a gait database 402, a backend services module 404 and a backend logic module 406.
[00141] In an embodiment, the gait database 402 is configured to store historical gait data, including foot pressure metrics, inertial data, and synchronized video footage for future analysis.
[00142] In an embodiment, the backend services module 404 is configured to handle the storage, retrieval, and processing of gait analysis data, supporting real-time and offline mode synchronization between the cloud server and the data visualization unit.
[00143] In an embodiment, the backend logic module 406 is configured to execute algorithms for gait analysis, including metrics estimation and synchronization of sensor data with video for accurate identification of micro-actions.
[00144] In an embodiment, the backend logic module 406 is configured to synchronize video metadata with sensor data and provide processing capabilities for real-time and post-session analysis of dynamic range of motion.
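The metadata-based synchronization of video with sensor data described above can be realized in several ways; one plausible, illustrative approach is timestamp alignment, mapping each sensor sample to the nearest video frame. This is a sketch under assumed inputs (timestamps in seconds, a known frame rate), not the patented implementation:

```python
def frame_for_sample(sample_ts, video_start_ts, fps):
    """Map a sensor-sample timestamp to the nearest video frame index.

    `sample_ts` and `video_start_ts` are in seconds; `fps` is the
    video frame rate. Samples before the video starts map to frame 0.
    """
    if sample_ts <= video_start_ts:
        return 0
    return round((sample_ts - video_start_ts) * fps)
```

For example, a pressure sample recorded 0.5 s after a 30 fps video began would align with frame 15, allowing frame-by-frame review of the corresponding micro-action.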
[00145] Figure 6 depicts the sub-components of a user environment unit 114.
[00146] In an embodiment, the user environment unit 114 comprises a video capturing unit 110 and the footwear-sensor assembly 102.
[00147] In an embodiment, the video capturing unit 110 is configured to capture synchronized video footage of the user's movement during gait analysis, allowing for real-time or post-session review of biomechanical actions. The video data is correlated with sensor data, providing frame-by-frame alignment with foot pressure and inertial measurements for detailed movement analysis.
[00148] In an embodiment, the footwear-sensor assembly 102 comprises the sensor insole 104.
[00149] Figure 7 illustrates the sub-components of a gait video server 116.
[00150] In an embodiment, the gait video server 116 comprises a captured correlated video module 408. In an embodiment, the gait video server 116 is configured to manage the storage and retrieval of the recorded visual data, ensuring that the recorded visual data is synchronized with the foot pressure data and the inertial data for accurate analysis. The gait video server 116 processes the recorded visual data to provide analytics related to gait performance and allows users to access and review recorded sessions through a user-friendly interface.
[00151] In an embodiment, the captured correlated video module 408 is configured to synchronize and display the recorded visual data alongside the corresponding foot pressure data and inertial data, allowing for micro-action analysis in real time or offline.
[00152] Figure 8 depicts data analytics of mean pressure and peak pressure.
[00153] Figure 8 shows a graphical representation of the mean pressure and peak pressure of the left and right insoles distributed over a period of time. The bar graph illustrates pressure versus time, comparing the left and right insole mean pressures, calculated by averaging the pressure sensor values for the left and right insole respectively. Another bar graph represents pressure versus time for the left and right insole peak pressures, calculated as the maximum pressure value among the pressure sensor values for each insole. Comparing left and right insole pressure values provides insight into pressure-related gait asymmetries, whereas comparison within the same insole over time reveals pressure-related gait variabilities.
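The per-insole averaging and maximum described above, and a common left/right asymmetry index, can be sketched as follows (an illustrative sketch only; the array shapes, function names, and the asymmetry formula are assumptions, not the patented implementation):

```python
import numpy as np

def insole_pressure_metrics(samples):
    """Compute per-timestep mean and peak pressure for one insole.

    `samples` is a 2-D array of shape (timesteps, n_sensors) holding
    raw pressure readings from the insole's pressure sensors.
    Returns (mean_pressure, peak_pressure), each of shape (timesteps,).
    """
    samples = np.asarray(samples, dtype=float)
    mean_pressure = samples.mean(axis=1)   # average across all sensors
    peak_pressure = samples.max(axis=1)    # maximum sensor value per timestep
    return mean_pressure, peak_pressure

def asymmetry_index(left_metric, right_metric):
    """One common symmetry measure: |L - R| / ((L + R) / 2), per timestep."""
    left = np.asarray(left_metric, dtype=float)
    right = np.asarray(right_metric, dtype=float)
    return np.abs(left - right) / ((left + right) / 2.0)
```

Comparing `insole_pressure_metrics` outputs for the left and right insoles then quantifies the asymmetries and variabilities the figure visualizes.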
[00154] Figure 9 depicts the data analytics of step count, stride length and stride velocity.
[00155] Figure 9 shows the data analytics of IMU sensor-derived metrics for the user's gait analysis, such as step count, stride length, and stride velocity. The step count is the total number of steps taken up to that instant, and the difference between the left and right step counts provides insight into gait asymmetries. The stride length, derived from IMU data, represents the distance covered per stride for the left and right foot; it is a key indicator of gait efficiency, and comparison between left and right reveals possible asymmetries. The stride velocity, also derived from IMU data, shows the speed at which each stride is taken, and asymmetry between left and right indicates the presence of fatigue.
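A minimal sketch of two of these IMU-derived metrics, assuming threshold-crossing step detection on an acceleration-magnitude signal (the threshold value and function names are illustrative assumptions, not the patented algorithm):

```python
import numpy as np

def count_steps(accel_magnitude, threshold):
    """Count steps as upward crossings of a pre-configured threshold.

    `accel_magnitude` is a 1-D array of acceleration magnitudes from
    one foot's IMU; each rising crossing of `threshold` is one step.
    """
    a = np.asarray(accel_magnitude, dtype=float)
    above = a > threshold
    # A step is registered where the signal transitions from below to above.
    rising = np.logical_and(above[1:], ~above[:-1])
    return int(rising.sum())

def stride_velocity(stride_lengths, stride_times):
    """Per-stride velocity: distance covered per stride / stride duration (s)."""
    return np.asarray(stride_lengths, dtype=float) / np.asarray(stride_times, dtype=float)
```

Running `count_steps` separately on the left and right IMU streams and differencing the results gives the step-count asymmetry the figure depicts.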
[00156] Figure 10 depicts an exemplary embodiment of the system's data analytics showing the measured metrics.
[00157] Figure 10 depicts the data analytics measured for gait analysis of the user. Furthermore, figure 10 depicts a user interface displaying the measured foot metrics such as mean pressure, peak pressure, center of pressure (COP), and IMU metrics. The COP shows the pressure distribution across the left and right foot, with marked points representing the average foot pressure center. The COP is derived by taking a weighted average of the pressure sensor values with the spatial locations of the pressure sensors. The toe and heel pressure values are derived from the pressure sensors located at the toe and heel respectively. Toe-heel transitions and the COP trajectory are critical for assessing balance, stability, and asymmetries. Furthermore, figure 10 depicts the data analytics measuring insole toe and heel pressure, insole mean pressure, insole peak pressure, and insole stride length for accurate gait analysis.
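The COP weighted average described above can be sketched as follows (an illustrative sketch; the (x, y) sensor layout and the handling of zero total pressure are assumptions):

```python
import numpy as np

def center_of_pressure(pressures, sensor_xy):
    """Estimate the centre of pressure (COP) for one insole sample.

    `pressures` is a 1-D array of n pressure sensor readings;
    `sensor_xy` is an (n, 2) array of each sensor's (x, y) position
    on the insole. The COP is the pressure-weighted average of the
    sensor locations.
    """
    p = np.asarray(pressures, dtype=float)
    xy = np.asarray(sensor_xy, dtype=float)
    total = p.sum()
    if total == 0:
        return None  # foot not in contact; COP undefined for this sample
    return (p[:, None] * xy).sum(axis=0) / total
```

Computing the COP for every sample in a session traces the COP trajectory used to assess balance, stability, and toe-heel transitions.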
[00158] Figure 11 illustrates a method 1100 for user gait analysis. The method begins with housing a sensor insole 104 in a footwear-sensor assembly 102 worn by a user, as depicted at step 1102. Subsequently, the method 1100 discloses providing the sensor insole comprising at least one sensor assembly to detect foot pressure data and inertial data, as depicted at step 1104. Additionally, the method 1100 discloses collecting the detected foot pressure data and the inertial data, by an IoT enabled hardware unit 106, from the sensor insole 104, as depicted at step 1106. Furthermore, the method 1100 discloses receiving the collected foot pressure data and the inertial data, by a data visualization and control unit 108, from the IoT enabled hardware unit 106, processing and visualizing pressure distribution across various regions of the foot, and providing real-time feedback to the user for gait analysis, as depicted at step 1108. Subsequently, the method 1100 discloses storing, processing, and analyzing, by a gait cloud server 112, the foot pressure data and the inertial data received from the IoT enabled hardware unit 106, as depicted at step 1110. Additionally, the method 1100 discloses providing a user environment unit 114 comprising a video capturing unit 110 for capturing visual data of the user gait, as depicted at step 1112. Thereafter, the method 1100 discloses synchronizing and storing the recorded visual data, by a gait video server 116, received from the gait cloud server 112 with the foot pressure data and the inertial data, enabling identification and tracking of micro-actions related to specific phases of the user gait, as depicted at step 1114.
[00159] Figure 12 depicts method 1200 of acquiring data for gait analysis.
[00160] The method begins with selecting a user, as depicted at step 1202. Subsequently, the method 1200 discloses selecting an IoT enabled hardware unit 106 and a sensor insole 104 for data acquisition, as depicted at step 1204. Additionally, the method 1200 discloses scanning and connecting the IoT enabled hardware unit 106 using a Bluetooth low energy (BLE) stack unit 206, as depicted at step 1206. Furthermore, the method 1200 discloses selecting a data acquisition mode, as depicted at step 1208. Subsequently, the method 1200 discloses starting the data acquisition and video capturing, as depicted at step 1210. Additionally, the method 1200 discloses stopping the data acquisition and video capturing when completed, as depicted at step 1212. Thereafter, the method 1200 discloses uploading the recorded FSR and IMU data from a micro-SD card 216 to a gait cloud server 112 and the recorded visual data to a gait video server 116 in offline mode, wherein the data acquisition mode comprises near real-time and offline modes, wherein the near real-time mode involves wireless transfer of FSR and IMU data through the BLE stack for real-time analysis, wherein the offline mode involves storing FSR and IMU data on the micro-SD card 216, with video captured for later synchronization and analysis, as depicted at step 1214.
[00161] Figure 13 illustrates a method 1300 for estimating at least one inertial measurement sensor (IMU) parameter during gait analysis. The method begins with extracting and applying a low-pass filter to accelerometer and gyroscope data of a sub-session window from a data session, as depicted at step 1302. Subsequently, the method 1300 discloses detecting step counts when the acceleration data is above a pre-configured threshold, as depicted at step 1304. Additionally, the method 1300 discloses estimating angle information using the gyroscope and the accelerometer data through sensor fusion and a complementary filter, as depicted at step 1306. Thereafter, the method 1300 discloses estimating stride length, stride velocity, variability, and asymmetry using an AHRS (Attitude and Heading Reference System) technique, as depicted at step 1308.
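The sensor-fusion step at 1306 can be illustrated with a textbook complementary filter (a sketch under assumed units and a typical weight of 0.98; the actual filter coefficients and AHRS details are not specified in the disclosure):

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer data into one angle estimate.

    `gyro_rate` (deg/s) and `accel_angle` (deg) are 1-D arrays sampled
    every `dt` seconds. The gyroscope is trusted for short-term changes
    (weight `alpha`); the accelerometer corrects long-term gyro drift.
    """
    gyro_rate = np.asarray(gyro_rate, dtype=float)
    accel_angle = np.asarray(accel_angle, dtype=float)
    angle = np.zeros(len(gyro_rate))
    angle[0] = accel_angle[0]  # initialise from the accelerometer
    for k in range(1, len(gyro_rate)):
        gyro_estimate = angle[k - 1] + gyro_rate[k] * dt
        angle[k] = alpha * gyro_estimate + (1 - alpha) * accel_angle[k]
    return angle
```

The fused angle series then feeds the stride-length and stride-velocity estimation at step 1308.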
[00162] Figure 14 illustrates a method 1400 for viewing dynamic range of motion analysis. The method begins with updating the uploaded recorded visual data stored in the gait video server 116 for the data session, as depicted at step 1402. Subsequently, the method 1400 discloses synchronizing the recorded visual data with the sensor data using metadata, as depicted at step 1404. Additionally, the method 1400 discloses creating a sub-session window of the data session, as depicted at step 1406. Furthermore, the method 1400 discloses estimating foot pressure data for the sub-session window, as depicted at step 1408. Subsequently, the method 1400 discloses estimating inertial data for the sub-session window, as depicted at step 1410. Thereafter, the method 1400 discloses displaying the estimated foot pressure data and the inertial data along with the corresponding synchronized recorded visual data for the sub-session window, as depicted at step 1412.
[00163] Figure 15 illustrates a method 1500 for estimating at least one pressure sensor parameter during gait analysis. The method begins with extracting foot pressure data of a sub-session window from the data session, as depicted at step 1502. Subsequently, the method 1500 discloses estimating toe and heel data by averaging the sensor values located at the toe and heel respectively, as depicted at step 1504. Additionally, the method 1500 discloses estimating the mean pressure by calculating the average of the at least one pressure sensor values, as depicted at step 1506. Thereafter, the method 1500 discloses estimating the center of pressure (COP) using weighted averages of the at least one pressure sensor values and their corresponding spatial locations on the sensor insole 104, as depicted at step 1508.
[00164] The advantages of the current invention include real-time, precise gait analysis in a dynamic range of motion using multiple sensors, allowing for highly accurate data capture of user biomechanics.
[00165] An additional advantage is that the system provides synchronized video and sensor data, offering enhanced visual analysis alongside quantitative measurements.
[00166] An additional advantage is that the device is compact and portable, making it suitable for use in clinical settings, sports environments, and home rehabilitation.
[00167] An additional advantage is that the system can be used for both real-time and offline analysis, enabling flexibility in data review and interpretation.
[00168] An additional advantage is that the sensor integration allows for 3D motion tracking, offering detailed insights into lower limb dynamics.
[00169] An additional advantage is that the system uses force-sensitive resistors (FSRs) to capture ground reaction forces, which adds an additional layer of depth to gait analysis.
[00170] An additional advantage is that the algorithms developed for this system can detect and quantify micro-movements, making it highly sensitive to subtle changes in gait.
[00171] An additional advantage is that the invention is highly adaptable and can be used in different environments, including clinical rehabilitation, athletic training, and injury prevention programs.
[00172] An additional advantage is that the system's use of both inertial measurement units (IMUs) and FSR sensors provides a comprehensive biomechanical profile of the user.
[00173] An additional advantage is that the system's data can be wirelessly transmitted for remote analysis, enabling telehealth applications in rehabilitation and sports science.
[00174] An additional advantage is that the system can alert users to potential gait abnormalities in real time, allowing for immediate corrective actions.
[00175] An additional advantage is that the algorithms can be customized to different user needs, such as adapting to specific sports or medical conditions.
[00176] An additional advantage is that the integration of video analysis enhances user engagement and understanding, making it easier for non-experts to interpret results.
[00177] An additional advantage is that the system supports a wide range of gait-related research, enabling data collection for academic studies and clinical trials.
[00178] An additional advantage is that the invention is scalable, allowing for potential future integration with other health-monitoring systems or wearables.
[00179] An additional advantage is that the system can be used to evaluate different types of gaits, including walking, running, and specialized movements in sports.
[00180] Applications of the current invention include rehabilitation, sports science, clinical diagnosis of gait abnormalities, elderly fall prevention, podiatry, orthopedics, biomechanical research, injury prevention, telehealth and remote healthcare, wearable technology, and military and defense.
[00181] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the scope of the embodiments as described here.
, Claims:We claim:

1. A system for user gait analysis, the system comprising:
a footwear-sensor assembly (102) worn by a user configured to house a sensor insole (104);
the sensor insole (104) comprising at least one sensor assembly to detect foot pressure data and inertial data;
an IoT enabled hardware unit (106) configured to collect the detected foot pressure data and the inertial data from the sensor insole (104);
a data visualization and control unit (108) configured to receive the collected foot pressure data and the inertial data from the IoT enabled hardware unit (106), process and visualize pressure distribution across various regions of the foot, and provide real-time feedback to the user for gait analysis;
a gait cloud server (112) configured to store, process, and analyze the foot pressure data and the inertial data received from the IoT enabled hardware unit (106);
a user environment unit (114) comprising a video capturing unit (110) configured to capture visual data of the user gait, placed at a distance of less than 10 meters; and
a gait video server (116) configured to synchronize and store the recorded visual data received from the gait cloud server (112) with the foot pressure data and the inertial data, enabling identification and tracking of micro-actions related to specific phases of the user gait.

2. The system as claimed in claim 1, wherein the at least one sensor assembly comprises at least one pressure sensor, at least one movement analysis sensor, at least one temperature sensor, at least one humidity sensor, at least one foot elevation variation measurement sensor and at least one feedback actuator, wherein the at least one pressure sensor comprises Force Sensitive Resistor (FSR), wherein the at least one movement analysis sensor comprises at least one of an inertial measurement sensor (IMU), wherein the at least one feedback actuator comprises at least one of a haptic and vibration actuator.

3. The system as claimed in claim 1, wherein the sensor insole (104) is customizable in size and configuration, and a number of the at least one pressure sensor is adjusted based on the user and application, and
wherein the sensor insole (104) with a grid pattern of at least one pressure sensor is strategically positioned to maximize accurate plantar pressure measurements and estimate the Centre of Pressure (COP) and its trajectory, based on the size of the sensor insole (104).

4. The system as claimed in claim 1, wherein the foot pressure data comprises at least one of a mean pressure, peak pressure, toe and heel pressure, and location of center of pressure (COP).

5. The system as claimed in claim 1, wherein the inertial data comprises instantaneous position, rotation angle, stride length, lower limb rotation, movement speed, and stride velocity.

6. The system as claimed in claim 1, wherein the micro actions related to specific phases of the user gait comprise swing, stance phases in normal walking, weight transfer, balancing, foot movement, rotation and timing.

7. The system as claimed in claim 2, wherein the IMU sensor is a 6-axis IMU, the IMU sensor comprises:
a high-performance accelerometer configured to capture linear acceleration along three coordinate axes; and
a gyroscope configured to measure angular velocity around three coordinate axes.

8. The system as claimed in claim 1, wherein the IoT enabled hardware unit (106) is embedded in an IoT layer Module (202),
wherein the IoT enabled hardware unit (106) comprises:
a microcontroller unit (MCU) (204) configured to process the foot pressure data and the inertial data from the sensor insole (104), perform initial signal conditioning, and transmit the processed data wirelessly to the data visualization and control unit (108);
a rechargeable battery (214) configured to supply consistent power to the IoT layer Module (202); and
a micro-SD card (216) configured to permanently store both the foot pressure data and the inertial data for later post-processing, redundancy, backup, and offline data retrieval.

9. The system as claimed in claim 8, wherein the microcontroller unit (MCU) (204) is configured to enhance the accuracy of position, distance, velocity, and angle estimations at the micro-actions level through implementation of sensor fusion techniques, comprising at least one of complementary filtering and an attitude and heading reference system (AHRS).

10. The system as claimed in claim 8, wherein the microcontroller unit (MCU) (204) comprises:
a Bluetooth low energy (BLE) stack unit (206) configured to facilitate wireless communication between the IoT layer Module (202) and the data visualization and control unit (108);
an on-board inertial measurement unit (208) configured to capture the acceleration, orientation, and angular velocity of the foot; and
a battery charge controller unit (210) configured to manage charging of the rechargeable battery (214).

11. The system as claimed in claim 1, wherein the data visualization and control unit (108) comprises a mobile device, laptop, or desktop computer, wherein the data visualization and control unit (108) comprises a cross-platform front-end application for real-time interaction with the gait cloud server (112).

12. The system as claimed in claim 1, wherein the data visualization and control unit (108) comprises:
a Bluetooth low energy (BLE) stack module configured to facilitate wireless communication between the data visualization and control unit (108) and the IoT-enabled hardware unit;
a front-end application module configured to present real-time and historical gait data to the user in a user-friendly interface, wherein the front-end application module (304) comprises:
a cache module (306) configured to temporarily store the foot pressure data and the inertial data for faster access during real-time visualization and to ensure smooth operation even in low connectivity conditions;
an application logic and services module (308) configured to handle data processing, user interaction, and interface controls, and to link to the backend services; and
a user interface module (310) configured to facilitate real-time interaction with pressure distribution visualizations across various regions of the foot based on data received from the IoT layer.

13. The system as claimed in claim 1, wherein the gait cloud server (112) comprises:
a gait database (402) configured to store the recorded visual data, the foot pressure data and the inertial data;
a backend services module (404) configured to provide communication between the front-end application, the gait cloud server (112), and the gait database (402), facilitating data transfer, storage, and retrieval operations; and
a backend logic module (406) configured to perform data analytics, comprising gait pattern recognition, abnormality detection, and personalized feedback generation based on the user's data.

14. The system as claimed in claim 1, wherein the gait video server (116) comprises a captured correlated video module (408) configured to synchronize and display the recorded visual data alongside the corresponding foot pressure data and inertial data, allowing for micro-action analysis in real time or offline.

15. A method for user gait analysis, the method comprising:
housing a sensor insole (104) in a footwear-sensor assembly (102) worn by a user;
providing the sensor insole (104) comprising at least one sensor assembly to detect foot pressure data and inertial data;
collecting the detected foot pressure data and the inertial data, by an IoT enabled hardware unit (106), from the sensor insole (104);
receiving the collected foot pressure data and the inertial data, by a data visualization and control unit (108), from the IoT enabled hardware unit (106), processing and visualizing pressure distribution across various regions of the foot, and providing real-time feedback to the user for gait analysis;
storing, processing, and analyzing, by a gait cloud server (112), the foot pressure data and the inertial data received from the IoT enabled hardware unit (106);
providing a user environment unit (114) comprising a video capturing unit (110) for capturing visual data of the user gait placed at a distance of less than 10 meters; and
synchronizing and storing the recorded visual data, by a gait video server (116), received from the gait cloud server (112) with the foot pressure data and the inertial data, enabling identification and tracking of micro-actions related to specific phases of the user gait.

16. The method as claimed in claim 15, comprising providing the at least one sensor assembly comprises at least one pressure sensor, at least one movement analysis sensor, at least one temperature sensor, at least one humidity sensor, at least one foot elevation variation measurement sensor and at least one feedback actuator, wherein the at least one pressure sensor comprises Force Sensitive Resistor (FSR), wherein the at least one movement analysis sensor comprises at least one of an inertial measurement sensor (IMU), wherein the at least one feedback actuator comprises at least one of a haptic and vibration actuator.

17. The method as claimed in claim 15, comprising providing the sensor insole (104) is customizable in size and configuration, and a number of the at least one pressure sensor can be adjusted based on the user and application, and
strategically positioning, by the sensor insole (104) with a grid pattern of at least one pressure sensor, to maximize accurate plantar pressure measurements and estimate the Centre of Pressure (COP) and its trajectory, based on the size of the sensor insole (104).

18. The method as claimed in claim 15, comprising providing the foot pressure data comprises at least one of a mean pressure, peak pressure, toe and heel pressure, and location of center of pressure (COP).

19. The method as claimed in claim 15, comprising providing the inertial data comprises instantaneous position, rotation angle, stride length, lower limb rotation, movement speed, and stride velocity.

20. The method as claimed in claim 15, comprising providing the micro actions related to specific phases of the user gait comprise swing, stance phases in normal walking, weight transfer, balancing, foot movement, rotation and timing.

21. The method as claimed in claim 15, comprising providing the IMU sensor is a 6-axis IMU, the IMU sensor comprises:
capturing a linear acceleration, by a high-performance accelerometer, along three coordinate axes; and
measuring angular velocity around three coordinate axes by a gyroscope.

22. The method as claimed in claim 15, comprising embedding the IoT enabled hardware unit (106) in an IoT layer Module (202),
providing the IoT enabled hardware unit (106) comprises:
processing, by a microcontroller unit (MCU) (204), the foot pressure data and the inertial data from the sensor insole (104), performing initial signal conditioning, and transmitting the processed data wirelessly to the data visualization and control unit (108);
supplying, by a rechargeable battery (214), consistent power to the IoT layer Module (202); and
storing both the foot pressure data and the inertial data permanently, by a micro-SD card (216), for later post-processing, redundancy, backup, and offline data retrieval.

23. The method as claimed in claim 22, comprising providing the microcontroller unit (MCU) (204) for enhancing the accuracy of position, distance, velocity, and angle estimations at the micro-actions level through implementation of sensor fusion techniques, comprising at least one of complementary filtering and an attitude and heading reference system (AHRS).

24. The method as claimed in claim 22, comprising providing the microcontroller unit (MCU) (204) comprises:
facilitating, by a Bluetooth low energy (BLE) stack unit (206) a wireless communication between the IoT layer Module (202) and the data visualization and control unit (108);
capturing, by an on-board inertial measurement unit (208), the acceleration, orientation and angular velocity of the foot; and
managing, by a battery charge controller unit (210), charging of the rechargeable battery (214).

25. The method as claimed in claim 15, comprising providing the data visualization and control unit (108), wherein the data visualization and control unit (108) comprises a mobile device, laptop, or desktop computer, and wherein the data visualization and control unit (108) comprises a cross-platform front-end application for real-time interaction with the gait cloud server (112).

26. The method as claimed in claim 15, comprising providing the data visualization and control unit (108), comprising:
facilitating wireless communication by a Bluetooth low energy (BLE) stack module between the data visualization and control unit (108) and the IoT-enabled hardware unit;
presenting real-time and historical gait data by a front-end application module to the user in a user-friendly interface, wherein the front-end application module (304) comprises:
storing the foot pressure data and the inertial data by a cache module (306) for faster access during real-time visualization and ensuring smooth operation even in low connectivity conditions;
handling data processing, user interaction, and interface controls, and linking the backend services by an application logic and services module (308); and
facilitating real-time interaction by a user interface module (310) with pressure distribution visualizations across various regions of the foot based on data received from the IoT layer.

27. The method as claimed in claim 15, comprising providing the gait cloud server (112) comprises:
storing the recorded visual data, the foot pressure data and the inertial data by a gait database (402);
providing communication between the front-end application, the gait cloud server (112), and the gait database (402) by a backend services module (404), thereby facilitating data transfer, storage, and retrieval operations; and
performing data analytics by a backend logic module (406), comprising gait pattern recognition, abnormality detection, and personalized feedback generation based on the user's data.



28. The method as claimed in claim 15, comprising providing the gait video server (116), wherein the gait video server (116) comprises a captured correlated video module (408) for synchronizing and displaying the recorded visual data alongside the corresponding foot pressure data and inertial data, allowing for micro-action analysis in real time or offline.

Date: 2nd November, 2024 Signature:
Name of signatory: Nishant Kewalramani
(Patent Agent)
IN/PA number: 1420

Documents

Name | Date
202441084729-Proof of Right [13-11-2024(online)].pdf | 13/11/2024
202441084729-EVIDENCE OF ELIGIBILTY RULE 24C1f [08-11-2024(online)].pdf | 08/11/2024
202441084729-FORM 18A [08-11-2024(online)].pdf | 08/11/2024
202441084729-FORM-8 [07-11-2024(online)].pdf | 07/11/2024
202441084729-COMPLETE SPECIFICATION [05-11-2024(online)].pdf | 05/11/2024
202441084729-DECLARATION OF INVENTORSHIP (FORM 5) [05-11-2024(online)].pdf | 05/11/2024
202441084729-DRAWINGS [05-11-2024(online)].pdf | 05/11/2024
202441084729-EDUCATIONAL INSTITUTION(S) [05-11-2024(online)].pdf | 05/11/2024
202441084729-EVIDENCE FOR REGISTRATION UNDER SSI [05-11-2024(online)].pdf | 05/11/2024
202441084729-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [05-11-2024(online)].pdf | 05/11/2024
202441084729-FORM 1 [05-11-2024(online)].pdf | 05/11/2024
202441084729-FORM FOR SMALL ENTITY(FORM-28) [05-11-2024(online)].pdf | 05/11/2024
202441084729-FORM-9 [05-11-2024(online)].pdf | 05/11/2024
202441084729-POWER OF AUTHORITY [05-11-2024(online)].pdf | 05/11/2024
202441084729-REQUEST FOR EARLY PUBLICATION(FORM-9) [05-11-2024(online)].pdf | 05/11/2024
