
COMPARING OPTIMIZATION ALGORITHMS: STOCHASTIC GRADIENT DESCENT AND ADAM ON THE IRIS DATASET

ORDINARY APPLICATION

Published

Filed on 20 November 2024

Abstract

This project proposes a comparative analysis of two popular optimization algorithms, Stochastic Gradient Descent (SGD) and Adam, using the Iris dataset. The study leverages Python and popular machine learning libraries to implement and evaluate these algorithms. The Iris dataset, a well-known benchmark in machine learning, is used to train and test simple neural network models optimized with SGD and Adam. The performance of each algorithm is assessed based on metrics such as convergence speed, final accuracy, and stability across multiple runs. By comparing these optimization techniques on a standard dataset, the project aims to provide insights into their relative strengths and weaknesses in the context of a classic classification problem.

Patent Information

Application ID: 202441089992
Invention Field: COMPUTER SCIENCE
Date of Application: 20/11/2024
Publication Number: 48/2024

Inventors

Name | Address | Country | Nationality
P. S. Rohit | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
Nagaram Kiran Kumar | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
A. Seethalakshmy | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
G Selvi | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
D Iranian | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
R Revathi | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
S Poornavel | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
M Eswara Rao | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India
Ramya Mohan | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India

Applicants

Name | Address | Country | Nationality
SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES | SAVEETHA INSTITUTE OF MEDICAL AND TECHNICAL SCIENCES, SAVEETHA NAGAR, THANDALAM, CHENNAI-602105 | India | India

Specification

PREAMBLE TO THE DESCRIPTION
THE FIELD OF INVENTION (MACHINE LEARNING OPTIMIZATION)
The study pertains to advancements in machine learning optimization techniques, specifically comparing
the performance of the Stochastic Gradient Descent and Adam algorithms on a standard dataset to enhance
model training efficiency and effectiveness.
BACKGROUND OF THE INVENTION
Machine learning models, particularly neural networks, rely heavily on optimization algorithms to minimize the
loss function and improve prediction accuracy. Two popular optimization algorithms, Stochastic Gradient Descent
(SGD) and Adam (Adaptive Moment Estimation), have gained significant attention in recent years due to their
effectiveness in training deep learning models.
Stochastic Gradient Descent, an extension of the traditional gradient descent algorithm, updates model
parameters using a subset of training data in each iteration. This approach reduces computational cost
and often leads to faster convergence, especially for large datasets.
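For reference, the standard update rules for the two optimizers (not reproduced in the filing) can be written as follows, where η is the learning rate, g_t the mini-batch gradient, and β₁, β₂, ε Adam's usual constants:

```latex
% Mini-batch SGD update on batch B_t:
\theta_{t+1} = \theta_t - \eta \,\nabla_\theta L(\theta_t; B_t)

% Adam update (Kingma & Ba, 2015), with g_t = \nabla_\theta L(\theta_t; B_t):
m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \qquad
v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2
\hat{m}_t = m_t / (1 - \beta_1^t) \qquad
\hat{v}_t = v_t / (1 - \beta_2^t)
\theta_{t+1} = \theta_t - \eta \,\hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)
```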
SUMMARY OF THE INVENTION
This study compares the performance of Stochastic Gradient Descent (SGD) and Adam optimization
algorithms using the Iris dataset. By implementing simple neural network models and training them with
both algorithms, the research aims to identify differences in convergence speed, final accuracy, and
stability. The use of a well-known, standardized dataset allows for a controlled comparison, providing
valuable insights into the strengths and weaknesses of each optimization technique in the context of a
classic classification problem.
COMPLETE SPECIFICATION
Optimization Algorithms:
• Stochastic Gradient Descent (SGD)
• Adam (Adaptive Moment Estimation)
Dataset:
• Iris dataset (150 samples, 4 features, 3 classes)
Model Architecture:
• Simple feed-forward neural network
Implementation:
• Python programming language
• TensorFlow or PyTorch for neural network implementation
Performance Metrics:
• Convergence speed (number of epochs to reach a specific accuracy)
• Final accuracy on test set
• Stability (variance in performance across multiple runs)
Visualization:
• Learning curves (loss and accuracy vs. epochs)
• Parameter update trajectories
Hyperparameter Tuning:
• Grid search for optimal learning rates and other hyperparameters
Statistical Analysis:
• T-tests or ANOVA to determine the significance of performance differences
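The specification above lists the components of the study but no code. The following is a minimal PyTorch sketch of the pipeline; the network shape (4-16-3), learning rate, epoch count, number of seeds, and the use of full-batch updates are illustrative assumptions, not values taken from the filing.

```python
# Minimal sketch of the SGD-vs-Adam comparison on Iris. All hyperparameter
# values here are illustrative assumptions; the filing specifies none.
import torch
import torch.nn as nn
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from scipy import stats

def run_once(optimizer_name, seed, epochs=200, lr=0.05):
    """Train one small feed-forward network and return its test accuracy."""
    torch.manual_seed(seed)
    X, y = load_iris(return_X_y=True)  # 150 samples, 4 features, 3 classes
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=seed, stratify=y)
    X_tr = torch.tensor(X_tr, dtype=torch.float32)
    X_te = torch.tensor(X_te, dtype=torch.float32)
    y_tr, y_te = torch.tensor(y_tr), torch.tensor(y_te)

    # Assumed architecture: one hidden layer, 4 -> 16 -> 3.
    model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
    opt = (torch.optim.SGD(model.parameters(), lr=lr)
           if optimizer_name == "sgd"
           else torch.optim.Adam(model.parameters(), lr=lr))
    loss_fn = nn.CrossEntropyLoss()

    # Full-batch updates for brevity; mini-batches would match "stochastic"
    # gradient descent more literally.
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X_tr), y_tr).backward()
        opt.step()

    with torch.no_grad():
        return (model(X_te).argmax(1) == y_te).float().mean().item()

# Stability across runs: repeat over several seeds, then test whether the
# mean accuracies differ significantly (a t-test, as the claims suggest).
sgd_accs = [run_once("sgd", s) for s in range(10)]
adam_accs = [run_once("adam", s) for s in range(10)]
t, p = stats.ttest_ind(sgd_accs, adam_accs)
print(f"SGD  mean acc = {sum(sgd_accs) / 10:.3f}")
print(f"Adam mean acc = {sum(adam_accs) / 10:.3f}")
print(f"t = {t:.2f}, p = {p:.3f}")
```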
We Claim
• A comparative analysis system for optimization algorithms, specifically evaluating Stochastic
Gradient Descent and Adam, using the Iris dataset to provide insights into their relative performance in
a classic classification task.
• The system generates comprehensive performance metrics, including convergence speed, final
accuracy, and stability measures, enabling a thorough comparison of SGD and Adam.
• The analysis includes visualizations of learning curves and parameter update trajectories, offering
intuitive insights into the behaviour of each optimization algorithm during the training process.
• The system incorporates hyperparameter tuning experiments, providing valuable information on how
each algorithm's performance varies with different configurations (a grid-search sketch follows these claims).
• Statistical analyses are performed to determine the significance of observed performance differences,
enhancing the reliability of the comparison results.
• The comparative analysis system can be adapted to evaluate other optimization algorithms or applied
to different datasets, offering a flexible framework for algorithm comparison in machine learning
contexts.
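As a hedged illustration of the grid-search claim above, the loop below reuses run_once() from the earlier sketch and picks the best learning rate per optimizer by mean accuracy over a few seeds; the grid values and seed count are assumptions, not values from the filing.

```python
# Illustrative grid search over learning rates only; reuses run_once() from
# the sketch above. Grid values and seed count are assumptions.
lrs = [0.001, 0.005, 0.01, 0.05, 0.1]
for name in ("sgd", "adam"):
    mean_acc = {lr: sum(run_once(name, s, lr=lr) for s in range(5)) / 5
                for lr in lrs}
    best = max(mean_acc, key=mean_acc.get)
    print(f"{name}: best lr on this grid = {best} (acc {mean_acc[best]:.3f})")
```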

Documents

Name | Date
202441089992-Form 1-201124.pdf | 22/11/2024
202441089992-Form 18-201124.pdf | 22/11/2024
202441089992-Form 2(Title Page)-201124.pdf | 22/11/2024
202441089992-Form 3-201124.pdf | 22/11/2024
202441089992-Form 5-201124.pdf | 22/11/2024
202441089992-Form 9-201124.pdf | 22/11/2024
