This article explains how to find the minimum of a function with gradient descent in Python using numpy. The walkthrough is detailed and should be a useful reference; read it through to the end!
Problem statement: find the point (x, y) where y1 = x*x - 2*x + 3 + 0.01*(random value in [-1, 1]) comes closest to y2 = 0.
The given range of x is (0, 3).
Solve it without a machine-learning framework, writing the gradient-descent update by hand. Hint: x = x - alp * d(y1 - y2)/dx, where alp is the learning rate.
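The update rule in the hint can be sketched in isolation before looking at the full script. The derivative of the noiseless loss x^2 - 2x + 3 is 2x - 2, so each step pulls x toward the analytic minimum at x = 1 (where y = 2). This is a minimal sketch; the learning rate 0.1 and starting point 2.5 are arbitrary choices for illustration, not the values used later in the article.

```python
def grad(x):
    # derivative of y1 - y2 = x**2 - 2*x + 3 (noise ignored) with respect to x
    return 2 * x - 2

alp = 0.1   # learning rate (illustrative value)
x = 2.5     # arbitrary starting point inside (0, 3)
for _ in range(100):
    x = x - alp * grad(x)  # gradient-descent update from the hint

print(round(x, 4))  # converges toward the analytic minimum x = 1
```

Each step multiplies the error (x - 1) by (1 - 2*alp), so with alp = 0.1 the error shrinks geometrically by a factor of 0.8 per step.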
函數(shù)圖像為:
代碼內(nèi)容:
import numpy as np
import matplotlib.pyplot as plt

def get_loss(x):
    c, r = x.shape
    # y1 = x^2 - 2x + 3 plus uniform noise in [-0.01, 0.01]
    loss = (x**2 - 2*x + 3) + (0.01 * (2*np.random.rand(c, r) - 1))
    return loss

x = np.arange(0, 3, 0.01).reshape(-1, 1)
"""plt.title("loss")
plt.plot(get_loss(np.array(x)))
plt.show()"""

def get_grad(x):
    # derivative of the noiseless loss: d/dx (x^2 - 2x + 3) = 2x - 2
    grad = 2*x - 2
    return grad

np.random.seed(31415)
x_ = np.random.rand(1) * 3   # random starting point in (0, 3)
x_s = []
alp = 0.001                  # learning rate
print("X0", x_)
for e in range(2000):
    x_ = x_ - alp * get_grad(x_)   # gradient-descent update
    x_s.append(x_)
    if e % 100 == 0:
        print(e, "steps,x_ = ", x_)

plt.title("loss")
plt.plot(get_loss(np.array(x_s)))
plt.show()
運行結(jié)果:
X0 [1.93745582]
0 steps,x_ =  [1.93558091]
100 steps,x_ =  [1.76583547]
200 steps,x_ =  [1.6268875]
300 steps,x_ =  [1.51314929]
400 steps,x_ =  [1.42004698]
500 steps,x_ =  [1.34383651]
600 steps,x_ =  [1.28145316]
700 steps,x_ =  [1.23038821]
800 steps,x_ =  [1.18858814]
900 steps,x_ =  [1.15437199]
1000 steps,x_ =  [1.12636379]
1100 steps,x_ =  [1.1034372]
1200 steps,x_ =  [1.08467026]
1300 steps,x_ =  [1.06930826]
1400 steps,x_ =  [1.05673344]
1500 steps,x_ =  [1.04644011]
1600 steps,x_ =  [1.03801434]
1700 steps,x_ =  [1.03111727]
1800 steps,x_ =  [1.02547157]
1900 steps,x_ =  [1.02085018]
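The output above can be checked against a closed form: because the noise term is not part of the gradient, the update x = x - alp*(2x - 2) gives x_k - 1 = (x0 - 1)*(1 - 2*alp)**k after k updates. A quick sketch (note that the value printed at iteration e follows the update, so it corresponds to k = e + 1):

```python
x0, alp = 1.93745582, 0.001

# closed form of the iterate after k updates of x <- x - alp*(2x - 2)
def x_after(k):
    return 1 + (x0 - 1) * (1 - 2 * alp) ** k

for e in (0, 100, 1900):
    print(e, round(x_after(e + 1), 8))
```

The values reproduce the printed iterates (e.g. about 1.0208502 at e = 1900), and they show why convergence is slow here: with alp = 0.001 the error shrinks by only a factor of 0.998 per step, so the iterate is still noticeably above the true minimum x = 1 after 2000 steps.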
(Plot of the loss values along the descent trajectory, omitted in the source.)
That is everything on finding a minimum with gradient descent in Python using numpy. Thanks for reading! I hope this content was helpful; for more on related topics, follow the Yisu Cloud industry news channel.
免責聲明:本站發(fā)布的內(nèi)容(圖片、視頻和文字)以原創(chuàng)、轉(zhuǎn)載和分享為主,文章觀點不代表本網(wǎng)站立場,如果涉及侵權(quán)請聯(lián)系站長郵箱:is@yisu.com進行舉報,并提供相關證據(jù),一經(jīng)查實,將立刻刪除涉嫌侵權(quán)內(nèi)容。