September 29 @ 8:00 am – November 30 @ 5:00 pm
Course Description
This course covers everything you need to know about physics-informed neural networks (PINNs) and state-of-the-art neural operators. Learn the theory, implementation, and limitations of scientific machine learning models that bridge the gap between traditional numerical methods and modern deep learning.
In this course, you will engage in hands-on activities and solve real-world partial differential equations while receiving expert guidance from our instructors. By the end of this course, you’ll have the knowledge and confidence to tackle any scientific computing challenge using machine learning. Join us and become a leader in the intersection of AI and scientific computing!
Course At a Glance
Instructional Period: September 29 – October 26, 2025
Time to Complete Badge: 60 days
Last Day to Earn Badge: November 30, 2025
Expertise Level: Intermediate/Advanced
Categories: Scientific Machine Learning, Parallelism in Deep Learning
Prerequisites:
- Basic Python programming
- Basic understanding of numerical methods and deep learning
- Familiarity with partial differential equations (helpful but not required)
Audience: The course is intended for a broad audience across the software and technology industry, including software engineers, data scientists, data engineers, data analysts, research scientists, and software developers. It is designed to give professionals an advanced understanding of scientific machine learning so they are ready to tackle complex computational challenges in science and engineering.
By the end of the course, you should be able to:
- Develop and implement physics-informed neural networks (PINNs)
- Design and apply state-of-the-art neural operators (FNO, DeepONet) for learning PDEs from data
- Understand the theoretical foundations and limitations of scientific machine learning
- Solve partial differential equations using PyTorch-based machine learning approaches
- Bridge traditional numerical methods with modern deep learning techniques
Assessments:
- 1 quiz to help you design and debug SciML models
- 3 coding exercise questions based on the hands-on activities, which involve implementing Python code. These include coding a simple physics-informed neural network and using state-of-the-art neural operators (FNO, DeepONet) to learn PDEs from data (a minimal illustrative sketch follows this list).
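For context only, here is a minimal sketch (not official course material) of the kind of physics-informed neural network the coding exercises involve. It assumes standard PyTorch and trains a small fully connected network to satisfy the 1D Poisson equation u''(x) = -pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0, whose exact solution is u(x) = sin(pi x); the equation, architecture, and hyperparameters are illustrative choices, not the course's.

# Illustrative PINN sketch: fit u(x) so that u''(x) = -pi^2 * sin(pi * x) on [0, 1]
# with u(0) = u(1) = 0 (exact solution: u(x) = sin(pi * x)).
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small fully connected network approximating u(x).
model = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Random interior collocation points and the two boundary points.
x_interior = torch.rand(200, 1, requires_grad=True)
x_boundary = torch.tensor([[0.0], [1.0]])

for step in range(5000):
    optimizer.zero_grad()

    # PDE residual: obtain u'' via automatic differentiation.
    u = model(x_interior)
    du = torch.autograd.grad(u, x_interior, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x_interior, torch.ones_like(du), create_graph=True)[0]
    source = -math.pi**2 * torch.sin(math.pi * x_interior)
    pde_loss = ((d2u - source) ** 2).mean()

    # Boundary condition penalty: u(0) = u(1) = 0.
    bc_loss = (model(x_boundary) ** 2).mean()

    loss = pde_loss + bc_loss
    loss.backward()
    optimizer.step()

# Compare the learned solution with sin(pi * x) at a few test points.
with torch.no_grad():
    x_test = torch.linspace(0, 1, 5).reshape(-1, 1)
    print(torch.cat([model(x_test), torch.sin(math.pi * x_test)], dim=1))

The same autograd pattern, differentiating the network output with respect to its inputs to form a PDE residual, generalizes to the multi-dimensional problems treated in the course modules.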
Module 1: Introduction to Scientific Machine Learning and Physics-Informed Neural Networks
Module 2: Theory and Implementation of PINNs
Module 3: Neural Operators: Fourier Neural Operators (FNO)
Module 4: Advanced Neural Operators: DeepONet and Applications
The course starts on September 29, 2025. All coursework must be completed by November 30, 2025, in order to earn the micro-credential badge. You will continue to have access to the course materials until May 1, 2026. The approximate time to complete this course is 16 hours.
This course has an instructional period from September 29 to October 26, 2025. During this period, course materials will be released weekly and live synchronous sessions will be held; you may complete the course materials at your own pace. Live Zoom meetings will be conducted for interactive coding sessions, with a suitable time determined through a group poll. Recordings of these sessions will be made available shortly after each meeting.
You will receive the Scientific Machine Learning micro-credential badge upon successful completion of the course assessments.
Course materials are provided within the course. No additional purchase is required.
Registration
Register for the course as a 1-credit independent study course; a maximum of three such TrAC courses may be taken per semester.
Students must register for the independent study by emailing benearl@iastate.edu and cc-ing baditya@iastate.edu. Also, fill out this Google form for our records: https://forms.gle/mQRefUJ4qpa29s5w6
Industry Professionals/ISU Staff/Post Docs: $500.00
ISU Professionals/Staff and Government Employees: $300
Raghu Nandan Pratoori is a Postdoctoral Researcher working with Dr. Abhay Ramachandra and Dr. Baskar Ganapathysubramanian.