Gradient Descent

Gradient Descent is an optimization algorithm for finding a local minimum of a function. It iteratively adjusts the parameters in the direction of steepest descent, searching for the combination of parameters that minimizes the cost function. For Gradient Descent to work, the cost function must be differentiable; if it is also convex, the local minimum it finds is guaranteed to be the global minimum.

[Figure: Visualization of Gradient Descent. Cost Function vs. Weight (W).]
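The update rule can be sketched in a few lines. Below is a minimal example, assuming a hypothetical one-parameter cost function J(w) = (w - 3)^2, whose minimum is at w = 3; the learning rate and iteration count are illustrative choices, not values from the text.

```python
def grad(w):
    """Derivative of the example cost J(w) = (w - 3)**2 with respect to w."""
    return 2 * (w - 3)

def gradient_descent(w0, lr=0.1, steps=100):
    """Repeatedly step against the gradient from the starting point w0."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # move opposite the gradient (steepest descent)
    return w

w_min = gradient_descent(w0=0.0)
print(round(w_min, 4))  # converges toward the minimizer w = 3
```

Because this example cost is convex, any starting point converges to the same minimum; on a non-convex cost, the result would depend on the initialization.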