Gradient-Descent-in-Linear-Regression

Gradient descent minimizes a function by following the gradients of the cost function. This requires knowing the form of the cost as well as its derivative, so that from a given point you know the gradient and can step in the opposite direction, i.e. downhill towards the minimum value.
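The update rule described above can be sketched in NumPy for simple linear regression; this is a minimal illustration, not the repository's notebook code, and the data, learning rate, and iteration count are chosen arbitrarily:

```python
import numpy as np

# Toy data: y is roughly 2x + 1 with a little noise (illustrative values)
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.05, size=x.shape)

# Parameters of the fitted line y_hat = m*x + b, initialized at zero
m, b = 0.0, 0.0
lr = 0.5          # learning rate (step size), picked by hand
n = len(x)

for _ in range(1000):
    y_hat = m * x + b
    error = y_hat - y
    # Gradients of the mean squared error cost with respect to m and b
    grad_m = (2.0 / n) * np.dot(error, x)
    grad_b = (2.0 / n) * error.sum()
    # Step downhill, against the gradient
    m -= lr * grad_m
    b -= lr * grad_b

print(m, b)  # should end up close to the true slope 2 and intercept 1
```

Each iteration evaluates the cost's derivative at the current parameters and moves a small step in the negative gradient direction, which is exactly the "downhill" walk the paragraph above describes.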

First install these libraries (`pip install numpy matplotlib`), then import them:

```python
import numpy as np
import matplotlib.pyplot as plt
```

You will need Jupyter Notebook or Google Colab to open this notebook.
