The gradient of a function is a vector that points in the direction of the steepest increase of the function at a given point. Gradient descent is an optimization algorithm that iteratively moves towards a minimum of the function by taking steps proportional to the negative of the gradient (the antigradient) at the current point. It is used in fields such as machine learning, deep learning, computer vision, and physics simulations.
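Concretely, starting from a point (x, y), each iteration updates the point as (x, y) ← (x, y) − γ · ∇f(x, y), where γ > 0 is the step size (the symbol γ is used here only for illustration; the step rule in the code differs between the two calculators described below).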
Gradient descent visually:
There are more efficient and effective ways to find a local minimum of a function, such as Stochastic Gradient Descent, Mini-batch Gradient Descent, Newton's Method, and the Conjugate Gradient Method, so this project is intended only as an illustration of the basic concept of gradient descent.
Gradient.c is a simple program for finding a local minimum of a two-argument function z = f(x, y), starting from a given point (x, y). MinimumCalculator uses step vectors of constant length, while MinimumCalculator2 steps by the antigradient vector itself. The second approach works better (in my example): it exits the loop after fewer iterations, and the closer it gets to the minimum, the more precise the estimate becomes, because the steps shrink along with the gradient. With a constant step size, oscillation around the minimum is likely to occur. A sketch of both strategies follows below.
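For comparison, here is a minimal sketch of the two step strategies. This is not the code from Gradient.c: the quadratic f, the derivative helpers, the tolerances, and the function names are placeholders chosen for illustration.

```c
#include <stdio.h>
#include <math.h>

/* Placeholder example function z = f(x, y) and its partial derivatives. */
static double f(double x, double y)    { return x * x + y * y; }
static double dfdx(double x, double y) { return 2.0 * x; }
static double dfdy(double x, double y) { return 2.0 * y; }

/* Constant-length step along the antigradient (MinimumCalculator-style). */
void minimum_constant_step(double x, double y, double step, int max_iter)
{
    for (int i = 0; i < max_iter; i++) {
        double gx = dfdx(x, y), gy = dfdy(x, y);
        double len = sqrt(gx * gx + gy * gy);
        if (len < 1e-9) break;            /* gradient ~ 0: stationary point */
        x -= step * gx / len;             /* normalized -> fixed step length */
        y -= step * gy / len;
    }
    printf("constant step: (%.6f, %.6f)\n", x, y);
}

/* Step proportional to the antigradient itself (MinimumCalculator2-style):
   steps shrink automatically as the gradient flattens near the minimum. */
void minimum_antigradient(double x, double y, double rate, int max_iter)
{
    for (int i = 0; i < max_iter; i++) {
        double gx = dfdx(x, y), gy = dfdy(x, y);
        if (sqrt(gx * gx + gy * gy) < 1e-9) break;
        x -= rate * gx;
        y -= rate * gy;
    }
    printf("antigradient:  (%.6f, %.6f)\n", x, y);
}
```

With the fixed-length variant, once the point is closer to the minimum than one step length, successive iterations keep overshooting and jumping back and forth, which is the oscillation mentioned above.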
Strictly speaking, this script finds stationary points, which are only candidates for a local minimum. To confirm an actual local minimum, the determinant of the Hessian matrix should be calculated and the possible cases examined (a sketch of this test is given below). There are also special cases where the function is flat or undefined, for which the test is inconclusive. Finally, this script will only find the local minimum closest to the given starting point.
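As a rough illustration of the second-partial-derivative test, here is a sketch that approximates the Hessian with central finite differences; it is not part of Gradient.c, and the example f and the step h are placeholders.

```c
#include <stdio.h>

/* Placeholder example function z = f(x, y). */
static double f(double x, double y) { return x * x + y * y; }

/* Classify a stationary point (x, y) using the Hessian determinant. */
void classify_stationary_point(double x, double y)
{
    const double h = 1e-4;
    double fxx = (f(x + h, y) - 2.0 * f(x, y) + f(x - h, y)) / (h * h);
    double fyy = (f(x, y + h) - 2.0 * f(x, y) + f(x, y - h)) / (h * h);
    double fxy = (f(x + h, y + h) - f(x + h, y - h)
                - f(x - h, y + h) + f(x - h, y - h)) / (4.0 * h * h);
    double det = fxx * fyy - fxy * fxy;   /* determinant of the Hessian */

    if (det > 0.0 && fxx > 0.0)      printf("local minimum\n");
    else if (det > 0.0 && fxx < 0.0) printf("local maximum\n");
    else if (det < 0.0)              printf("saddle point\n");
    else                             printf("inconclusive (det = 0)\n");
}
```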