-
Loss Functions - Regression and Classification
Exploring regression and classification loss functions, with a deep dive into logistic regression and its role in machine learning.
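As a quick illustration of the two families this entry covers, here is a minimal NumPy sketch of a mean-squared-error loss for regression and the logistic (log) loss behind logistic regression; the function names and toy data are illustrative assumptions, not taken from the article itself.

```python
import numpy as np

def mse_loss(y_true, y_pred):
    """Mean squared error: the standard regression loss."""
    return np.mean((y_true - y_pred) ** 2)

def logistic_loss(y_true, scores):
    """Binary logistic (log) loss for labels in {0, 1} and raw model scores."""
    probs = 1.0 / (1.0 + np.exp(-scores))       # sigmoid turns scores into probabilities
    eps = 1e-12                                  # clip to avoid log(0)
    probs = np.clip(probs, eps, 1 - eps)
    return -np.mean(y_true * np.log(probs) + (1 - y_true) * np.log(1 - probs))

# Tiny usage example (hypothetical data)
print(mse_loss(np.array([1.0, 2.0]), np.array([0.8, 2.3])))
print(logistic_loss(np.array([0, 1, 1]), np.array([-2.0, 0.5, 3.0])))
```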
-
Optimizing Stochastic Gradient Descent - Key Recommendations for Effective Training
A collection of practical recommendations for improving the performance and reliability of Stochastic Gradient Descent, aimed at smoother and faster convergence during training.
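The sketch below is not the article's own list, but it shows how two widely used recommendations, shuffling the examples every epoch and decaying the learning rate, slot into a plain SGD loop for least-squares regression; the function name, decay schedule, and synthetic data are illustrative assumptions.

```python
import numpy as np

def sgd_linear_regression(X, y, lr0=0.1, epochs=20, seed=0):
    """Plain SGD on squared error for a linear model, with two common
    recommendations applied: reshuffle the examples every epoch and
    decay the learning rate as training proceeds."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for epoch in range(epochs):
        lr = lr0 / (1.0 + epoch)                 # simple per-epoch decay
        for i in rng.permutation(len(X)):        # fresh shuffle each epoch
            grad = (X[i] @ w - y[i]) * X[i]      # gradient of 0.5 * (x.w - y)^2
            w -= lr * grad
    return w

# Usage on synthetic data: recover a known weight vector
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)
print(sgd_linear_regression(X, y))   # should land close to true_w
```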
-
Gradient Descent and Second-Order Optimization - A Thorough Comparison
An in-depth exploration of Gradient Descent (GD) and Second-Order Gradient Descent (2GD), focusing on convergence behavior, mathematical derivations, and performance differences.
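For orientation, these are the update rules usually meant by GD and a second-order (Newton-style) step; the notation here (cost C, step size \gamma, Hessian H) is assumed rather than quoted from the article.

```latex
% Gradient Descent: move against the gradient of the cost C with step size \gamma
w_{t+1} = w_t - \gamma \,\nabla C(w_t)

% Second-order (Newton-style) step: rescale the gradient by the inverse Hessian
w_{t+1} = w_t - H_t^{-1} \,\nabla C(w_t), \qquad H_t = \nabla^2 C(w_t)
```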
-
Gradient Descent Convergence - Prerequisites and Detailed Derivation
Understanding the convergence of gradient descent with a fixed step size and proving its rate of convergence for convex, differentiable functions.
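For reference, the result in question is typically stated as follows; note that the standard proof also assumes the gradient is L-Lipschitz, and the notation below is mine rather than the article's.

```latex
% If f is convex and differentiable with an L-Lipschitz gradient, then gradient
% descent with fixed step size t \le 1/L satisfies, after k iterations,
f\!\left(x^{(k)}\right) - f\!\left(x^{\star}\right)
    \;\le\; \frac{\lVert x^{(0)} - x^{\star} \rVert_2^{2}}{2\,t\,k}
```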
-
Understanding Stochastic Gradient Descent (SGD)
A detailed guide to gradient descent variants, highlighting the mechanics and trade-offs of Stochastic Gradient Descent (SGD) along with practical insights into its use.
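As a rough sketch of the mechanics and the trade-off involved, the snippet below contrasts one full-batch gradient step with a pass of per-example SGD steps on a toy least-squares problem; function names, step sizes, and data are illustrative assumptions.

```python
import numpy as np

def batch_gd_step(w, X, y, lr):
    """Batch gradient descent: one step uses the gradient over ALL examples."""
    grad = X.T @ (X @ w - y) / len(X)
    return w - lr * grad

def sgd_step(w, x_i, y_i, lr):
    """Stochastic gradient descent: one step uses a single example's gradient,
    which is cheap to compute but noisy."""
    grad = (x_i @ w - y_i) * x_i
    return w - lr * grad

# One pass over the data with each variant on a toy least-squares problem
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -1.0])

w_batch = batch_gd_step(np.zeros(2), X, y, lr=0.5)   # one expensive, exact step
w_sgd = np.zeros(2)
for i in rng.permutation(len(X)):                    # many cheap, noisy steps
    w_sgd = sgd_step(w_sgd, X[i], y[i], lr=0.05)
print(w_batch, w_sgd)
```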