Overview:
This advanced-level webinar dives into the mathematical foundations behind Machine Learning model optimization. Participants will explore how models learn, how performance metrics are calculated, and how optimization algorithms such as Gradient Descent and Stochastic Gradient Descent enable models to minimize error.
The session combines theoretical derivations with practical implementation in a Jupyter Notebook, where participants will code optimization algorithms from scratch, observe convergence behavior, and visually compare the performance of different optimization strategies. This webinar is designed for learners who want to move beyond using ML libraries and truly understand the mathematics that powers model training.
Why You Should Attend:
Many Machine Learning practitioners rely on pre-built libraries without understanding how models are actually optimized. Without a solid grasp of optimization techniques such as Gradient Descent, model convergence, and training dynamics, you risk being limited to surface-level ML knowledge: unable to debug, tune, or build models beyond standard frameworks.
Areas Covered in the Session:
- How Machine Learning models work at a mathematical level
- Understanding prediction functions and loss functions
- How model accuracy and error are calculated
- Classification metrics (accuracy, cross-entropy loss)
- Regression metrics (MSE, RMSE)
- Concept of optimization in ML
- Gradient Descent:
  - Intuition behind gradients
  - Derivatives of loss functions
  - Learning rate and its impact
- Stochastic Gradient Descent (SGD):
  - Difference between GD and SGD
  - Batch vs. mini-batch learning
- Training dynamics:
  - Epochs
  - Iterations
  - Convergence criteria
  - Stopping conditions
- Implementing Gradient Descent from scratch in a Jupyter Notebook
- Implementing Stochastic Gradient Descent
- Visualizing:
  - Loss curves
  - Convergence speed
  - Parameter updates
- Comparing performance and stability of GD vs SGD
- Practical insights on tuning optimization algorithms
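The hands-on portion described above can be sketched roughly as follows: a minimal from-scratch comparison of full-batch Gradient Descent and mini-batch SGD on a toy linear-regression problem with an MSE loss. The synthetic data, function names, and hyperparameters (learning rates, batch size, epoch count) are illustrative assumptions for this sketch, not material taken from the session itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumed setup): y = 3x + 2 + noise
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=200)

def mse(w, b, X, y):
    """Mean squared error of the linear model y_hat = w*x + b."""
    pred = w * X[:, 0] + b
    return np.mean((pred - y) ** 2)

def grad(w, b, X, y):
    """Analytic gradient of the MSE loss with respect to w and b."""
    err = w * X[:, 0] + b - y
    return 2 * np.mean(err * X[:, 0]), 2 * np.mean(err)

def gradient_descent(X, y, lr=0.1, epochs=200):
    """Full-batch GD: one update per pass over the whole dataset."""
    w, b, losses = 0.0, 0.0, []
    for _ in range(epochs):
        gw, gb = grad(w, b, X, y)
        w -= lr * gw          # step against the gradient
        b -= lr * gb
        losses.append(mse(w, b, X, y))
    return w, b, losses

def sgd(X, y, lr=0.05, epochs=200, batch_size=16):
    """Mini-batch SGD: noisy updates from random subsets of the data."""
    w, b, losses = 0.0, 0.0, []
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)              # reshuffle each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            gw, gb = grad(w, b, X[batch], y[batch])
            w -= lr * gw
            b -= lr * gb
        losses.append(mse(w, b, X, y))        # full-data loss, once per epoch
    return w, b, losses

w_gd, b_gd, loss_gd = gradient_descent(X, y)
w_sgd, b_sgd, loss_sgd = sgd(X, y)
print(f"GD : w={w_gd:.3f}, b={b_gd:.3f}, "
      f"MSE={loss_gd[-1]:.4f}, RMSE={loss_gd[-1] ** 0.5:.4f}")
print(f"SGD: w={w_sgd:.3f}, b={b_sgd:.3f}, "
      f"MSE={loss_sgd[-1]:.4f}, RMSE={loss_sgd[-1] ** 0.5:.4f}")
```

Plotting the two `losses` lists (e.g. with matplotlib) against the epoch index reproduces the loss-curve comparison mentioned above: GD traces a smooth monotone curve, while SGD descends faster per epoch but with visible noise from the random mini-batches.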
Who Will Benefit:
- Advanced ML students
- AI / Data Science postgraduate students
- Machine Learning Engineers
- Data Scientists
- Research-oriented learners
- Professionals preparing for technical ML interviews