An optimization algorithm that minimizes a function by iteratively moving in the direction of steepest descent, which is given by the negative of the gradient.
"Gradient descent is widely used in machine learning to find the parameters that minimize a loss function."