This article explains how to understand and implement logistic regression in Python. The walkthrough below builds a minimal logistic-regression classifier from scratch with NumPy, in three steps: a sigmoid function, a training loop that fits the weights by gradient ascent, and a prediction function.
Define the sigmoid function, which maps any real-valued score to a probability in (0, 1):

    import numpy as np

    def sigmoid(x):
        return 1 / (1 + np.exp(-x))
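A quick sanity check of the sigmoid, a minimal sketch: its output can be read as the probability of the positive class, with 0.5 at a score of zero (the decision boundary) and values saturating toward 0 or 1 for large negative or positive scores.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

print(sigmoid(0))    # 0.5 -- exactly on the decision boundary
print(sigmoid(10))   # close to 1: confidently positive
print(sigmoid(-10))  # close to 0: confidently negative
```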
Set up the logistic-regression parameters and iterate. Note two fixes to the original listing: the function should use its own argument `x` rather than the global `x_train`, and the gradient must be accumulated with `+=` across all samples (otherwise only the last sample would drive each update):

    def weights(x, y, alpha, threshold):
        # Initialize the parameters
        m, n = x.shape
        theta = np.random.rand(n)  # weight vector
        cnt = 0                    # iteration counter
        max_iter = 50000
        # Start iterating (batch gradient ascent on the log-likelihood)
        while cnt < max_iter:
            cnt += 1
            diff = np.zeros(n)
            # Accumulate the gradient over all training samples
            for i in range(m):
                diff += (y[i] - sigmoid(theta.T @ x[i])) * x[i]
            theta = theta + alpha * diff
            # Stop once every gradient component is below the threshold
            if (np.abs(diff) < threshold).all():
                break
        return theta
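In symbols, the update the training loop performs is gradient ascent on the log-likelihood of the logistic model. With training samples (x_i, y_i) and sigmoid σ, each iteration does:

\[
\theta \leftarrow \theta + \alpha \sum_{i=1}^{m} \bigl(y_i - \sigma(\theta^{\mathsf{T}} x_i)\bigr)\, x_i
\]

The term \(y_i - \sigma(\theta^{\mathsf{T}} x_i)\) is the prediction error on sample i, so samples the model already classifies confidently contribute little to the update.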
Prediction function: classify as 1 when the predicted probability exceeds 0.5, and as 0 otherwise:

    def predict(x_test, theta):
        if sigmoid(theta.T @ x_test) > 0.5:
            return 1
        else:
            return 0
Call the functions. The leading 1 in each training row is the bias (intercept) term:

    x_train = np.array([[1, 2.697, 6.254],
                        [1, 1.872, 2.014],
                        [1, 2.312, 0.812],
                        [1, 1.983, 4.990],
                        [1, 0.932, 3.920],
                        [1, 1.321, 5.583],
                        [1, 2.215, 1.560],
                        [1, 1.659, 2.932],
                        [1, 0.865, 7.362],
                        [1, 1.685, 4.763],
                        [1, 1.786, 2.523]])
    y_train = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1])
    alpha = 0.001     # learning rate
    threshold = 0.01  # convergence threshold on the gradient
    print(weights(x_train, y_train, alpha, threshold))
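Once trained, the weights can be passed to `predict` to classify a new point. The sketch below restates the article's functions end to end (with the gradient accumulated over all samples via `+=`, so every sample contributes to each update) and then classifies `x_new`, a hypothetical test point chosen here for illustration; it is not from the original article.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def weights(x, y, alpha, threshold):
    m, n = x.shape
    theta = np.random.rand(n)      # random initial weights
    for _ in range(50000):
        diff = np.zeros(n)
        for i in range(m):         # accumulate gradient over all samples
            diff += (y[i] - sigmoid(theta.T @ x[i])) * x[i]
        theta = theta + alpha * diff
        if (np.abs(diff) < threshold).all():
            break
    return theta

def predict(x_test, theta):
    return 1 if sigmoid(theta.T @ x_test) > 0.5 else 0

x_train = np.array([[1, 2.697, 6.254], [1, 1.872, 2.014],
                    [1, 2.312, 0.812], [1, 1.983, 4.990],
                    [1, 0.932, 3.920], [1, 1.321, 5.583],
                    [1, 2.215, 1.560], [1, 1.659, 2.932],
                    [1, 0.865, 7.362], [1, 1.685, 4.763],
                    [1, 1.786, 2.523]])
y_train = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1])

theta = weights(x_train, y_train, alpha=0.001, threshold=0.01)

# Hypothetical new sample: bias term 1 plus two feature values.
x_new = np.array([1, 2.0, 6.0])
print(predict(x_new, theta))  # 0 or 1, depending on the fitted weights
```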
Thank you for reading. That concludes this walkthrough of understanding and implementing logistic regression in Python; to get a real feel for it, run the code yourself and verify the results in practice.