- Gradient Descent - A Detailed Walkthrough: An in-depth exploration of gradient descent, covering its convergence properties and step-size considerations.
- Empirical Risk Minimization (ERM): An exploration of empirical risk minimization, balancing approximation, estimation, and optimization errors to build effective supervised learning models.
- Understanding the Supervised Learning Setup: An in-depth exploration of the supervised learning setup, covering key concepts such as prediction functions, loss functions, risk evaluation, and the Bayes optimal predictor.
- Timeline of Machine Learning History: A concise timeline of machine learning's history, showcasing key milestones and breakthroughs that shaped the field.
- Advanced Probability Concepts for Machine Learning: An exploration of key probability theory concepts, from distributions and Bayes' Theorem to covariance and the Central Limit Theorem, emphasizing their application in machine learning and statistical modeling.