This article shows how to implement the linear regression algorithm in Python, comparing a closed-form least-squares solution, batch gradient descent, and scikit-learn's LinearRegression. Interested readers may find it a useful reference; I hope you come away with something after reading it.
Implementing linear regression in Python
Code:
# encoding: utf-8
"""
Author: njulpy
Version: 1.0
Date: 2018/04/09
Project: Implementing the linear regression algorithm in Python
"""
import numpy as np
import pandas as pd
from numpy import dot
from numpy.linalg import inv
import matplotlib.pyplot as plt
from sklearn import linear_model
from sklearn.model_selection import train_test_split


# Ordinary least squares: closed-form solution of the normal equation
def lms(x_train, y_train, x_test, y_test):
    theta_n = dot(dot(inv(dot(x_train.T, x_train)), x_train.T), y_train)  # theta = (X'X)^(-1)X'Y
    y_pre = dot(x_test, theta_n)             # predictions on the test set
    mse = np.average((y_test - y_pre) ** 2)  # mean squared error
    return theta_n, y_pre, mse


# Batch gradient descent
def train(x_train, y_train, num, alpha, m, n):
    beta = np.ones(n)                        # initialize every parameter to 1
    for i in range(num):
        h = np.dot(x_train, beta)            # current predictions
        error = h - y_train.T                # residuals against the training targets
        delt = 2 * alpha * np.dot(error, x_train) / m  # gradient of the squared loss
        beta = beta - delt                   # step opposite the gradient
    return beta


if __name__ == "__main__":
    iris = pd.read_csv('iris.csv')
    iris['Bias'] = 1.0  # constant column so the intercept is learned as an ordinary weight
    x = iris[['Sepal.Width', 'Petal.Length', 'Petal.Width', 'Bias']]
    y = iris['Sepal.Length']
    x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=5)
    t = np.arange(len(x_test))
    m, n = np.shape(x_train)

    # Least squares (closed form)
    theta_n, y_pre, mse = lms(x_train, y_train, x_test, y_test)

    # Gradient descent: 1000 iterations, learning rate 0.001
    beta = train(x_train, y_train, 1000, 0.001, m, n)
    y_predict = np.dot(x_test, beta.T)

    # sklearn, for comparison
    regr = linear_model.LinearRegression()
    regr.fit(x_train, y_train)
    y_p = regr.predict(x_test)

    print(regr.coef_, theta_n, beta)
    l1, = plt.plot(t, y_predict)
    l2, = plt.plot(t, y_p)
    l3, = plt.plot(t, y_pre)
    l4, = plt.plot(t, y_test)
    plt.legend(handles=[l1, l2, l3, l4],
               labels=['GradientDescent', 'sklearn', 'Leastsquare', 'True'],
               loc='best')
    plt.show()
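The closed-form step above computes theta = (X'X)^(-1)X'Y by inverting X'X directly, which breaks down when that matrix is singular or badly conditioned. As a side note (this snippet is not part of the original script; the synthetic data and names are illustrative), numpy's built-in least-squares solver reaches the same answer without forming an explicit inverse:

# Minimal sketch: normal equation vs. np.linalg.lstsq on synthetic data.
import numpy as np

rng = np.random.default_rng(0)                             # stand-in for iris.csv
X = np.column_stack([rng.random((100, 3)), np.ones(100)])  # last column plays the role of 'Bias'
true_theta = np.array([0.65, 0.71, -0.54, 1.85])
y = X @ true_theta + 0.05 * rng.standard_normal(100)

theta_ne = np.linalg.inv(X.T @ X) @ X.T @ y                # normal equation, as in lms()
theta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)           # solver, no explicit inverse

print(np.allclose(theta_ne, theta_ls))                     # True: both recover the same fit

For larger feature sets, or when columns are nearly collinear, np.linalg.lstsq (or np.linalg.pinv) is the safer choice.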
Output:
sklearn: [ 0.65368836 0.70955523 -0.54193454 0. ]
LeastSquare: [ 0.65368836 0.70955523 -0.54193454 1.84603897]
GradientDescent: [ 0.98359285 0.29325906 0.60084232 1.006859 ]

sklearn reports 0 for the Bias column because LinearRegression fits the intercept separately (it is exposed as regr.intercept_, which should match the least-squares Bias term of about 1.846). The gradient descent coefficients still differ from the closed-form solution because 1000 iterations at a learning rate of 0.001 are not enough to converge on these unscaled features.
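One quick way to check is simply to keep iterating. A minimal sketch, assuming the script above has already run so that train(), x_train, y_train, m, n and theta_n are still in scope (the iteration count of 100000 is an illustrative guess, not from the original):

# Sketch: with more iterations, batch gradient descent should move toward the
# closed-form least-squares solution; how fast depends on the conditioning of
# the feature matrix. Reuses train() and the variables defined above.
beta_long = train(x_train, y_train, 100000, 0.001, m, n)
print(np.max(np.abs(beta_long - theta_n)))  # gap should be clearly smaller than after 1000 iterations

Standardizing the features before training (for example with sklearn.preprocessing.StandardScaler) would let a larger learning rate converge in far fewer steps.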
Thank you for reading this article carefully. I hope this walkthrough of implementing the linear regression algorithm in Python was helpful. Please continue to support 創(chuàng)新互聯(lián) and follow its industry news channel, where more related material awaits you!