
@bgreatfit
Created October 7, 2018 04:45
Gradient Descent
# Gradient descent: vectorized batch update of theta.
# X is (m x n) with a bias column, theta is a row vector (1 x n),
# y is a column vector (m x 1).
import numpy as np

def gradientDescent(X, y, theta, iters, alpha):
    cost = np.zeros(iters)
    for i in range(iters):
        # Update all parameters at once using the full batch of m examples
        theta = theta - (alpha / len(X)) * np.sum(X * (X @ theta.T - y), axis=0)
        cost[i] = computeCost(X, y, theta)
    return theta, cost
# Run gradient descent, then evaluate the cost at the learned parameters
g, cost = gradientDescent(X, y, theta, iters, alpha)
print(g)

finalCost = computeCost(X, y, g)
print(finalCost)
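The gist calls `computeCost` but never defines it, and `X`, `y`, `theta`, `iters`, and `alpha` are assumed to exist. Below is a minimal self-contained sketch that fills those gaps: `computeCost` is written as the standard mean-squared-error cost consistent with the update rule above (an assumption, since the original definition is not shown), and toy data is fabricated for illustration (fitting y = 1 + 2x).

```python
import numpy as np

# Assumed definition: MSE cost matching the gradient used in the update rule
def computeCost(X, y, theta):
    return np.sum(np.power(X @ theta.T - y, 2)) / (2 * len(X))

def gradientDescent(X, y, theta, iters, alpha):
    cost = np.zeros(iters)
    for i in range(iters):
        theta = theta - (alpha / len(X)) * np.sum(X * (X @ theta.T - y), axis=0)
        cost[i] = computeCost(X, y, theta)
    return theta, cost

# Hypothetical toy data: y = 1 + 2*x, with a bias column prepended to X
X = np.c_[np.ones(5), np.arange(5)]          # shape (5, 2)
y = (1 + 2 * np.arange(5.0)).reshape(-1, 1)  # shape (5, 1)
theta = np.zeros((1, 2))                     # row vector, as the gist assumes

g, cost = gradientDescent(X, y, theta, iters=1000, alpha=0.1)
print(g)                     # should approach [[1., 2.]]
print(computeCost(X, y, g))  # should be near zero
```

Note the shapes: `theta` must be a row vector so that `X @ theta.T` yields an (m, 1) column of predictions, and the (m, 1) residual broadcasts across the (m, n) matrix `X` in the elementwise product.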