Gradient Descent
An optimization algorithm used to find the local minimum of a function, foundational to training machine learning models.
Explanation
Gradient Descent is an iterative first-order optimization algorithm used to find a local minimum of a differentiable function (its counterpart, gradient ascent, finds a local maximum). It is widely used in machine learning and deep learning to minimize a cost/loss function by repeatedly stepping in the direction opposite the gradient.
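A single update follows the rule x ← x − learningRate × f′(x). As a minimal sketch, here is one step for the example function f(x) = x², starting from x = 10 with a learning rate of 0.1 (the same values the playground below starts with):

```javascript
// One gradient-descent update for f(x) = x^2, whose derivative is f'(x) = 2x.
const learningRate = 0.1;
let x = 10;

const gradient = 2 * x;          // f'(10) = 20
x = x - learningRate * gradient; // 10 - 0.1 * 20 = 8

console.log(x); // 8
```

Because the gradient points uphill, subtracting it moves x toward the minimum at 0.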
How it Works
Imagine you are standing at the top of a mountain and want to reach the lowest point. You look around, find the direction of steepest descent, and take a step; how far you step is controlled by the learning rate. You repeat this process until you reach the bottom. That's Gradient Descent in a nutshell.
Playground
Interactive demo (learning rate 0.10, starting x = 10.0; initial state: iteration 0, current x = 10.00, gradient = 20.00).
Code
// Minimizes f(x) = x^2 starting from initialX, taking numIterations
// steps of size learningRate against the gradient.
function gradientDescent(initialX, learningRate, numIterations) {
  let x = initialX;
  for (let i = 0; i < numIterations; i++) {
    const gradient = 2 * x;          // derivative of f(x) = x^2
    x = x - learningRate * gradient; // step in the downhill direction
  }
  return x;
}
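A quick sanity check of the function above (redefined here so the snippet runs on its own): starting from the playground's values, x should converge toward the minimum of f(x) = x² at 0.

```javascript
// Same function as in the Code section, for a self-contained run.
function gradientDescent(initialX, learningRate, numIterations) {
  let x = initialX;
  for (let i = 0; i < numIterations; i++) {
    const gradient = 2 * x;          // derivative of f(x) = x^2
    x = x - learningRate * gradient; // step in the downhill direction
  }
  return x;
}

// Each step multiplies x by (1 - 2 * learningRate) = 0.8,
// so after 100 iterations x is vanishingly close to 0.
const result = gradientDescent(10, 0.1, 100);
console.log(result); // prints a value very close to 0
```

Note that convergence depends on the learning rate: for this function, a learning rate of 1.0 or more would cause the iterates to oscillate or diverge instead of settling at the minimum.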