An Edge AI-Powered Symbolic Computation System

ORDINARY APPLICATION

Published

Filed on 15 November 2024

Abstract

The present invention relates to an edge AI-powered symbolic computation system. The proposed system is a transformative approach that leverages edge computing to perform complex symbolic calculations in real time, minimizing latency and enhancing privacy by processing data close to the source. Unlike traditional symbolic computation systems that rely on centralized cloud processing, this system utilizes Edge AI, which integrates AI algorithms at the edge of the network (such as on local devices like mobile phones, IoT devices, and embedded systems). The system is designed to enable fast, efficient, and secure computation for use in mathematical, engineering, and scientific applications, especially where connectivity may be limited or real-time processing is crucial.

Patent Information

Application ID: 202421088438
Invention Field: COMPUTER SCIENCE
Date of Application: 15/11/2024
Publication Number: 49/2024

Inventors

Name | Address | Country | Nationality
Dr. Gopal Sakarkar | Associate Professor, Department of Computer Science and Applications, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Dr. Riddhi Panchal | Assistant Professor & Program Head MSc (DS&BDA), Department of Computer Science and Application, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Mr. Swapnil Goje | Assistant Professor, Department of Computer Science and Application, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Mr. Abhijeet Kokare | Assistant Professor, Department of Computer Science and Application, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Mr. Vishal Arun Pawar | Assistant Professor, RCPET's Institute of Management Research and Development, Shirpur, Maharashtra, India | India | India
Dr. Kishor Madhukar Dhole | Assistant Professor, Department of Computer Science, Seth Kesarimal Porwal College of Arts Science and Commerce Kamptee, Nagpur, Maharashtra, India - 441001 | India | India
Dr. Bhaskar Yadao Kathane | Assistant Professor, Department of Computer Science, Bhawabhuti Mahavidyalaya Amgaon, Deori Road Amgaon, Gondia, Maharashtra, India - 441902 | India | India

Applicants

Name | Address | Country | Nationality
Dr. Gopal Sakarkar | Associate Professor, Department of Computer Science and Applications, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Dr. Riddhi Panchal | Assistant Professor & Program Head MSc (DS&BDA), Department of Computer Science and Application, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Mr. Swapnil Goje | Assistant Professor, Department of Computer Science and Application, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Mr. Abhijeet Kokare | Assistant Professor, Department of Computer Science and Application, Dr Vishwanath Karad MIT World Peace University, Kothrud, Pune, Maharashtra, India | India | India
Mr. Vishal Arun Pawar | Assistant Professor, RCPET's Institute of Management Research and Development, Shirpur, Maharashtra, India | India | India
Dr. Kishor Madhukar Dhole | Assistant Professor, Department of Computer Science, Seth Kesarimal Porwal College of Arts Science and Commerce Kamptee, Nagpur, Maharashtra, India - 441001 | India | India
Dr. Bhaskar Yadao Kathane | Assistant Professor, Department of Computer Science, Bhawabhuti Mahavidyalaya Amgaon, Deori Road Amgaon, Gondia, Maharashtra, India - 441902 | India | India

Specification

Description:

TECHNICAL FIELD OF INVENTION

The present invention relates to an edge AI-powered symbolic computation system.

BACKGROUND OF THE INVENTION

The background information herein below relates to the present disclosure but is not necessarily prior art.

Symbolic computation is central to fields like mathematics, engineering, physics, and computer science, where it is used for complex algebraic operations, calculus, and other advanced problem-solving techniques. Traditional symbolic computation systems often rely on centralized cloud processing, requiring robust internet connectivity and leading to latency and privacy concerns. These limitations make it difficult to apply symbolic computation in low-bandwidth or disconnected environments, and they increase the risk of data security issues when sensitive information is processed through cloud servers.

Recent advances in Edge AI have introduced a promising solution by enabling complex computations directly on local devices such as mobile phones, IoT hardware, and other embedded systems. Edge AI leverages compact, optimized machine learning models that run efficiently on low-power devices, opening possibilities for real-time, localized data processing without cloud dependency. By integrating Edge AI with symbolic computation, this system can achieve near-instantaneous response times and maintain data security, supporting real-time use cases in remote, offline, and security-sensitive settings. This shift also reduces bandwidth requirements, enhances scalability, and empowers users with high processing power in personal or isolated setups.

There are various drawbacks in the prior art and existing technology. Hence, there has been a long-felt need in the art for an improved system.

OBJECTIVE OF THE INVENTION

The primary objective of the present invention is to provide an edge AI-powered symbolic computation system.

This and other objects and characteristics of the present invention will become apparent from the further disclosure to be made in the detailed description given below.

SUMMARY OF THE INVENTION

Accordingly, the present invention provides an edge AI-powered symbolic computation system.

The proposed system brings advanced mathematical computations directly to the edge device, reducing dependency on cloud resources and enhancing privacy. Using optimized deep learning models, it allows for algebraic manipulation, equation solving, and other symbolic calculations on low-power devices, ideal for disconnected or low-bandwidth environments. The system is designed to perform symbolic differentiation, integration, simplification, and matrix operations efficiently on local devices.

Key features include low-latency computation, data privacy, real-time processing, and high efficiency on low-power hardware through model compression and optimization. With applications in education, technical design, and healthcare, it enables real-time symbolic computations, scalable deployment, and secure, self-sufficient processing in remote or resource-constrained settings.

DETAILED DESCRIPTION OF THE INVENTION

As used in the description herein and throughout the claims that follow, the meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.

The present invention is related to an edge AI-powered symbolic computation system.

The proposed system brings computation closer to the user, reducing dependency on centralized resources. The system uses deep learning models optimized for low-power devices to execute symbolic computations, which are traditionally resource-intensive. By embedding symbolic computation capabilities on the edge device, users can perform algebraic manipulations, solve equations, and conduct other symbolic calculations locally, even in disconnected or low-bandwidth environments.

This approach includes specialized models and algorithms tailored to execute symbolic mathematics efficiently on edge hardware. For example, symbolic differentiation, integration, algebraic simplification, and matrix operations can be performed directly on devices using optimized AI models that employ quantization, pruning, and other hardware-aware techniques to run efficiently on low-power hardware.
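By way of illustration only, the Python sketch below uses the open-source SymPy library as a stand-in for the optimized on-device model; SymPy is an assumption for this example and is not named in the specification, but it shows the classes of symbolic operations the engine is intended to perform locally on the device.

```python
# Illustrative only: SymPy stands in for the optimized on-device model
# described above; the actual engine would use compressed AI models.
import sympy as sp

x = sp.Symbol('x')
expr = sp.sin(x) * sp.exp(x)

derivative = sp.diff(expr, x)                            # symbolic differentiation
integral = sp.integrate(expr, x)                         # symbolic integration
simplified = sp.simplify(sp.sin(x)**2 + sp.cos(x)**2)    # algebraic simplification

# Matrix operations performed entirely on the local device
M = sp.Matrix([[1, 2], [3, 4]])
inverse = M.inv()
determinant = M.det()

print(derivative, integral, simplified, inverse, determinant, sep='\n')
```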

The proposed system comprises the following key features:

Low-Latency Computation: Reduces the need for cloud communication, making calculations faster.

Data Privacy: Ensures sensitive data stays on the device, reducing data leakage risks.

Real-Time Processing: Optimized for applications that require quick response times without the latency introduced by cloud-based computation.

High Efficiency on Low-Power Devices: Customizable for low-power hardware and optimized through model quantization and pruning.

Methodology:

Symbolic Computation Model Design:

Develop AI-based models capable of symbolic computation tasks (e.g., calculus operations, algebraic simplification) using datasets of symbolic expressions and associated results.

Train models on high-powered systems and deploy lightweight, optimized versions on edge devices.

Implement machine learning techniques like transfer learning to adapt the models to various mathematical disciplines and user-specific tasks.
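As one hedged illustration of how such a dataset of symbolic expressions and associated results might be assembled (the specification does not prescribe particular tooling), expression-result pairs can be generated programmatically, for example with SymPy, and later used to train a sequence-to-sequence model on a high-powered system. The generator and pool of atoms below are assumptions made only for this sketch.

```python
# Hypothetical dataset generation for training a symbolic-computation model.
# SymPy and the random expression generator are assumptions, not part of
# the patent specification.
import random
import sympy as sp

x = sp.Symbol('x')
ATOMS = [x, x**2, sp.sin(x), sp.cos(x), sp.exp(x)]

def random_expression(depth=2):
    """Build a small random expression from the atom pool."""
    expr = random.choice(ATOMS)
    for _ in range(depth):
        expr = expr + random.choice(ATOMS) * random.choice(ATOMS)
    return expr

# Each training pair maps an input expression to its derivative (as strings,
# ready for an off-device sequence-to-sequence training run).
dataset = []
for _ in range(1000):
    expr = random_expression()
    dataset.append((str(expr), str(sp.diff(expr, x))))

print(dataset[0])
```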

Edge Device Model Optimization:

Use model compression techniques, such as pruning and quantization, to reduce model size without compromising accuracy.

Integrate hardware-aware optimizations, ensuring compatibility with the target edge hardware (e.g., ARM Cortex processors, specialized AI chips).

Apply efficient libraries (e.g., TensorFlow Lite, ONNX Runtime) for mobile and embedded devices to support AI model deployment.
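A minimal conversion sketch is given below, assuming a model already trained in TensorFlow/Keras and exported as a SavedModel; the directory name and model are hypothetical, but the TensorFlow Lite post-training quantization API shown is the standard one referenced above.

```python
# Sketch of post-training quantization with TensorFlow Lite.
# "saved_symbolic_model" is a hypothetical SavedModel directory produced
# during off-device training; it is not part of the specification.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_symbolic_model")
# Enable default optimizations (weight quantization), shrinking the model
# for low-power edge hardware.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("symbolic_model.tflite", "wb") as f:
    f.write(tflite_model)
```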

Local Data Processing and Security:

Utilize federated learning if user-specific adaptations are required, ensuring that model adjustments can occur without transferring data to the cloud.

Incorporate secure enclaves for computation to further enhance data protection by isolating sensitive computations.
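As a loose illustration of the federated-learning idea mentioned above (the specification does not fix an algorithm), a coordinating server can average locally trained weight updates without ever receiving the underlying user data; federated averaging is used here only as a representative technique, and the toy weight shapes are assumptions.

```python
# Minimal federated-averaging sketch: only weight arrays leave each device,
# never the raw user data. NumPy and the toy weight shapes are assumptions.
import numpy as np

def federated_average(client_weights):
    """Average per-device model weights into a single global update."""
    return [np.mean(np.stack(layer), axis=0) for layer in zip(*client_weights)]

# Three hypothetical edge devices, each holding its locally adapted weights.
device_updates = [
    [np.random.rand(4, 4), np.random.rand(4)],
    [np.random.rand(4, 4), np.random.rand(4)],
    [np.random.rand(4, 4), np.random.rand(4)],
]
global_weights = federated_average(device_updates)
print([w.shape for w in global_weights])
```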

Dynamic Workload Allocation:

Enable flexible computation offloading that adjusts between edge and cloud based on available resources and network conditions.

Implement dynamic load balancing to distribute computational tasks between edge devices if more than one device is available in a local network.
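A simple offloading policy consistent with this description might weigh measured device load and network latency before deciding where to run a task; the thresholds, names, and data structure below are hypothetical and serve only to make the decision logic concrete.

```python
# Hypothetical offloading policy: run locally unless the device is overloaded
# and the network to the cloud is fast enough. Thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class ResourceSnapshot:
    cpu_load: float            # 0.0-1.0 utilisation of the edge device
    network_latency_ms: float  # round-trip time to the cloud endpoint
    cloud_reachable: bool

def choose_execution_target(snapshot: ResourceSnapshot) -> str:
    """Return 'edge' or 'cloud' for the next symbolic computation task."""
    if not snapshot.cloud_reachable:
        return "edge"
    if snapshot.cpu_load > 0.85 and snapshot.network_latency_ms < 50:
        return "cloud"
    return "edge"

print(choose_execution_target(ResourceSnapshot(0.9, 20.0, True)))  # cloud
print(choose_execution_target(ResourceSnapshot(0.4, 20.0, True)))  # edge
```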

System Architecture

User Interface Layer: Provides a user-friendly interface for inputting mathematical expressions and selecting computation types. Interacts with the computation engine to receive and display results in real time.

Edge AI Computation Engine:

Symbolic Computation Model: This module performs the actual symbolic processing using AI-optimized models. It supports differentiation, integration, simplification, and matrix operations.

Inference Optimization Module: Handles model optimization for the edge device, including dynamic quantization and hardware-accelerated computation.

Security Module: Ensures data remains secure and confined to the device using secure enclaves and encrypted memory.

Edge Device Operating Layer:

Runtime Environment: A lightweight runtime (such as TensorFlow Lite or ONNX Runtime) capable of running AI inference with low power consumption.
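For the runtime environment, a hedged sketch of on-device inference with the TensorFlow Lite interpreter follows; the model file name and the dummy input are placeholders, since the real system would feed an encoded symbolic expression produced by the user interface layer.

```python
# On-device inference with the TensorFlow Lite interpreter.
# "symbolic_model.tflite" and the dummy input are placeholders; the real
# system would feed an encoded symbolic expression.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="symbolic_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching the model's expected shape and dtype.
dummy_input = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)
```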

Data Processing and Storage: Manages temporary storage of computation results, allowing caching and quick retrieval of previous calculations.
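Result caching of the kind described here could be as simple as memoising the computation entry point; the sketch below uses Python's functools.lru_cache purely as an illustration, with SymPy again standing in for the on-device model.

```python
# Illustrative result cache: repeated requests for the same expression are
# served from local memory rather than recomputed.
from functools import lru_cache

import sympy as sp

@lru_cache(maxsize=256)
def differentiate(expression: str, variable: str = "x") -> str:
    """Differentiate an expression string and cache the result locally."""
    var = sp.Symbol(variable)
    return str(sp.diff(sp.sympify(expression), var))

print(differentiate("sin(x)*exp(x)"))  # computed once
print(differentiate("sin(x)*exp(x)"))  # served from the cache
```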

Device Resource Management: Optimizes memory and processing power, allocating resources dynamically based on task complexity.

Optional Cloud Interface Layer: Allows optional cloud connectivity for model updates, federated learning, or workload offloading if local resources are insufficient. Ensures that any data sent to the cloud is anonymized and only shared with user consent for additional processing.

Advantages:

Scalability and Portability: Can be deployed across various edge devices without requiring heavy computational power.

Enhanced Privacy and Security: Local computation minimizes the risk of data breaches, especially important for sensitive applications.

Real-Time Responsiveness: Suitable for applications like automated math tutoring, technical design, and engineering computation where instant feedback is necessary.

Reduced Cloud Dependency: Ideal for remote or connectivity-limited environments, promoting self-sufficient computing.

Potential Applications

Educational Platforms: On-device AI-assisted mathematics tools for students in remote areas.

Technical Design Tools: Assists engineers and scientists in performing complex calculations in real time on local devices.

Healthcare and Biotech Research: Enables on-device computational biology or bioinformatics without exposing sensitive data.

The Edge AI-Powered Symbolic Computation System is positioned to transform on-device computation, enabling users to achieve high accuracy in symbolic computations without relying on cloud resources, thus ensuring efficiency, privacy, and adaptability across diverse fields.

While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.
Claims:

1. An edge AI-powered symbolic computation system, comprising:

an edge AI computation engine configured to perform symbolic calculations including algebraic manipulations, differentiation, integration, algebraic simplification, and matrix operations locally on low-power devices;

a symbolic computation model optimized using machine learning techniques to execute complex symbolic tasks on embedded hardware by leveraging model compression, quantization, pruning, and hardware-aware optimizations;

a data processing and storage module configured to manage local data and cache computational results, thereby enabling low-latency, real-time processing without reliance on cloud resources;

a security module to ensure data privacy by processing computations locally within secure enclaves;

a dynamic workload allocation module to balance computational tasks between edge devices or selectively offload to cloud based on resource availability, further including an optional cloud interface for model updates or federated learning; and

a user interface layer enabling user interaction with the system for inputting symbolic expressions and receiving computed results in real-time,

wherein the system provides efficient and secure symbolic computation on low-power edge devices suitable for remote or bandwidth-limited environments.

Documents

Name | Date
202421088438-COMPLETE SPECIFICATION [15-11-2024(online)].pdf | 15/11/2024
202421088438-DECLARATION OF INVENTORSHIP (FORM 5) [15-11-2024(online)].pdf | 15/11/2024
202421088438-FORM 1 [15-11-2024(online)].pdf | 15/11/2024
202421088438-FORM-9 [15-11-2024(online)].pdf | 15/11/2024
202421088438-REQUEST FOR EARLY PUBLICATION(FORM-9) [15-11-2024(online)].pdf | 15/11/2024
