This article shows how to implement a simple neural network for classification without relying on high-level APIs such as TensorFlow; all of the code is given below. The network is validated on synthetic data (generated by a program), and after about one hour of training the classification accuracy reaches 97%.
Let's start with the principle.
Network structure
Take a simple three-layer network as an example: the input layer contains nodes X1 and X2, the hidden layer contains H1 and H2, and the output layer contains a single node O1.
The number of input nodes must equal the number of input variables.
The number of hidden-layer nodes is usually chosen empirically.
For a plain classification task, a single output node is generally enough.
The forward pass: from input to output
1. An input node simply passes its input through: when node X1 receives x1, its output is still x1.
2. For a hidden or output node, the input I is the weighted sum of the previous layer's outputs plus a bias, and its output is f(I), where f is the activation function (sigmoid is used here). For example, the output of H1 is sigmoid(w1*x1 + w2*x2 + b).
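To make the forward pass concrete, here is a minimal stand-alone sketch of the 2-2-1 network described above. The weight and bias values are made-up numbers chosen only for illustration; the implementation below initialises them at random.

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# forward pass of the 2-2-1 network: X1, X2 -> H1, H2 -> O1
x1, x2 = 0.5, 0.8                          # inputs fed to X1 and X2
h1 = sigmoid(0.1 * x1 + 0.4 * x2 + 0.2)    # H1 = f(w1*x1 + w2*x2 + b)
h2 = sigmoid(0.3 * x1 - 0.2 * x2 + 0.1)    # H2, with its own weights and bias
o1 = sigmoid(0.6 * h1 + 0.5 * h2 - 0.3)    # O1 takes the hidden outputs as its inputs
print(o1)                                  # a value in (0, 1)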
Error backpropagation
The prediction error at the output node is propagated back through the network layer by layer: each node scales the incoming error by the sigmoid derivative f(I)*(1 - f(I)), hands the weighted error down to the nodes that feed it, and then nudges its own weights and bias by that scaled error times the learning rate.
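Here is a minimal, self-contained sketch of that per-node update (it mirrors what refreshParas does in the implementation further down; the name update_node and its arguments are illustrative, not part of the article's code):

def update_node(weights, bias, inputs, output, err, learn_rate):
    # fold the incoming error into the sigmoid derivative f(I) * (1 - f(I))
    delta = output * (1 - output) * err
    # the error handed down to each node feeding this one (applied recursively)
    child_errs = [w * delta for w in weights]
    # adjust the weights and the bias in the direction that reduces the error
    new_weights = [w + learn_rate * delta * x for w, x in zip(weights, inputs)]
    new_bias = bias + learn_rate * delta
    return new_weights, new_bias, child_errs

# example: one update for a node with two inputs and target 1
print(update_node([0.6, 0.5], -0.3, [0.7, 0.8], 0.64, 1 - 0.64, 0.5))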
The Python implementation
Constructing the test data
# GenData.py
# -*- coding: utf-8 -*-
import numpy as np
from random import random as rdn

'''
We construct 1000 records, each with three attributes (a1, a2, a3):
    a1  discrete,   values 1 to 10,  uniformly distributed
    a2  discrete,   values 1 to 10,  uniformly distributed
    a3  continuous, values 1 to 100, normally distributed
The attributes are mutually independent.
There are 2 classes (0, 1); the mapping from attribute values to class is:
    0 : a1 in [1 , 3]  and a2 in [4 , 10] and a3 <= 50
    1 : a1 in [1 , 3]  and a2 in [4 , 10] and a3 > 50
    0 : a1 in [1 , 3]  and a2 in [1 , 3]  and a3 > 30
    1 : a1 in [1 , 3]  and a2 in [1 , 3]  and a3 <= 30
    0 : a1 in [4 , 10] and a2 in [4 , 10] and a3 <= 50
    1 : a1 in [4 , 10] and a2 in [4 , 10] and a3 > 50
    0 : a1 in [4 , 10] and a2 in [1 , 3]  and a3 > 30
    1 : a1 in [4 , 10] and a2 in [1 , 3]  and a3 <= 30
'''

def genData():
    # generate normally distributed values for a3
    a3_data = np.random.randn(1000) * 30 + 50
    data = []
    for i in range(1000):
        # generate a1
        a1 = int(rdn() * 10) + 1
        if a1 > 10:
            a1 = 10
        # generate a2
        a2 = int(rdn() * 10) + 1
        if a2 > 10:
            a2 = 10
        # take a3
        a3 = a3_data[i]
        # work out the class of this record
        c_id = 0
        if a1 <= 3 and a2 >= 4 and a3 <= 50:
            c_id = 0
        elif a1 <= 3 and a2 >= 4 and a3 > 50:
            c_id = 1
        elif a1 <= 3 and a2 < 4 and a3 > 30:
            c_id = 0
        elif a1 <= 3 and a2 < 4 and a3 <= 30:
            c_id = 1
        elif a1 > 3 and a2 >= 4 and a3 <= 50:
            c_id = 0
        elif a1 > 3 and a2 >= 4 and a3 > 50:
            c_id = 1
        elif a1 > 3 and a2 < 4 and a3 > 30:
            c_id = 0
        elif a1 > 3 and a2 < 4 and a3 <= 30:
            c_id = 1
        else:
            print('error')
        # join the fields into a comma-separated line
        str_line = str(i) + ',' + str(a1) + ',' + str(a2) + ',' + str(a3) + ',' + str(c_id)
        data.append(str_line)
    return '\n'.join(data)
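A quick way to sanity-check the generated data is to look at one line and count the records per class. This is only a usage sketch, not part of the original files; it assumes the code above is saved as GenData.py (the module name the driver script below also imports).

from GenData import genData

lines = genData().split('\n')
print(lines[0])                                     # format: rec_id,a1,a2,a3,class_id
labels = [line.split(',')[-1] for line in lines]
print('class 0:', labels.count('0'), ' class 1:', labels.count('1'))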
The activation function
# activation_funcs.py
# -*- coding: utf-8 -*-
"""
Created on Sun Dec 2 14:49:31 2018

@author: congpeiqing
"""
import numpy as np

# the derivative of sigmoid is f(x) * (1 - f(x))
def sigmoid(x):
    return 1 / (1 + np.exp(-x))
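The comment about the derivative is what the backward pass relies on: because f'(x) = f(x)*(1 - f(x)), refreshParas below can compute the gradient from a node's own output without a separate derivative function. A small numerical check of this identity (illustrative only, with sigmoid redefined locally so the snippet is self-contained):

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x = 0.7
eps = 1e-6
numeric = (sigmoid(x + eps) - sigmoid(x - eps)) / (2 * eps)   # finite-difference derivative
analytic = sigmoid(x) * (1 - sigmoid(x))                      # f(x) * (1 - f(x))
print(numeric, analytic)                                      # the two agree to several decimals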
The network implementation
# SimpleBP.py
# -*- coding: utf-8 -*-
"""
Created on Sun Dec 2 14:49:31 2018

@author: congpeiqing
"""
from activation_funcs import sigmoid
from random import random


class InputNode(object):
    # an input node simply passes its input through unchanged
    def __init__(self, idx):
        self.idx = idx
        self.output = None

    def setInput(self, value):
        self.output = value

    def getOutput(self):
        return self.output

    def refreshParas(self, p1, p2):
        # input nodes have no parameters to adjust
        pass


class Neurode(object):
    # a hidden or output node: weighted sum of its inputs plus a bias, through the activation
    def __init__(self, layer_name, idx, input_nodes, activation_func=None, powers=None, bias=None):
        self.idx = idx
        self.layer_name = layer_name
        self.input_nodes = input_nodes
        if activation_func is not None:
            self.activation_func = activation_func
        else:
            # sigmoid by default
            self.activation_func = sigmoid
        if powers is not None:
            self.powers = powers
        else:
            self.powers = [random() for i in range(len(self.input_nodes))]
        if bias is not None:
            self.bias = bias
        else:
            self.bias = random()
        self.output = None

    def getOutput(self):
        self.output = self.activation_func(
            sum(map(lambda x: x[0].getOutput() * x[1], zip(self.input_nodes, self.powers))) + self.bias)
        return self.output

    def refreshParas(self, err, learn_rate):
        err_add = self.output * (1 - self.output) * err
        for i in range(len(self.input_nodes)):
            # recurse into the child nodes
            self.input_nodes[i].refreshParas(self.powers[i] * err_add, learn_rate)
            # adjust the weight of this connection
            power_delta = learn_rate * err_add * self.input_nodes[i].output
            self.powers[i] += power_delta
        # adjust the bias
        bias_delta = learn_rate * err_add
        self.bias += bias_delta


class SimpleBP(object):
    def __init__(self, input_node_num, hidden_layer_node_num, trainning_data, test_data):
        self.input_node_num = input_node_num
        self.input_nodes = [InputNode(i) for i in range(input_node_num)]
        self.hidden_layer_nodes = [Neurode('H', i, self.input_nodes) for i in range(hidden_layer_node_num)]
        self.output_node = Neurode('O', 0, self.hidden_layer_nodes)
        self.trainning_data = trainning_data
        self.test_data = test_data

    # train on one record at a time
    def trainByItem(self):
        cnt = 0
        while True:
            cnt += 1
            learn_rate = 1.0 / cnt
            sum_diff = 0.0
            # one training step per training record
            for item in self.trainning_data:
                for i in range(self.input_node_num):
                    self.input_nodes[i].setInput(item[i])
                item_output = item[-1]
                nn_output = self.output_node.getOutput()
                # print('nn_output:', nn_output)
                diff = (item_output - nn_output)
                sum_diff += abs(diff)
                self.output_node.refreshParas(diff, learn_rate)
                # print('refreshedParas')
            # stopping condition: mean absolute error below 0.1
            print(round(sum_diff / len(self.trainning_data), 4))
            if sum_diff / len(self.trainning_data) < 0.1:
                break

    def getAccuracy(self):
        cnt = 0
        for item in self.test_data:
            for i in range(self.input_node_num):
                self.input_nodes[i].setInput(item[i])
            item_output = item[-1]
            nn_output = self.output_node.getOutput()
            if (nn_output > 0.5 and item_output > 0.5) or (nn_output < 0.5 and item_output < 0.5):
                cnt += 1
        return cnt / (len(self.test_data) + 0.0)
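Before running the full pipeline, the classes can be exercised on a couple of handcrafted records. This is only a usage sketch with made-up data that drives the forward pass of an untrained network; it assumes the code above is saved as SimpleBP.py, as the driver script below also does.

from SimpleBP import SimpleBP

# two toy records in the (a1, a2, a3, class_id) format used by the driver script
toy = [(1.0, 5.0, 20.0, 0.0), (1.0, 5.0, 80.0, 1.0)]
net = SimpleBP(3, 4, toy, toy)              # 3 input nodes, 4 hidden nodes, 1 output node
for i, value in enumerate(toy[0][:3]):
    net.input_nodes[i].setInput(value)      # feed the first record into the input nodes
print(net.output_node.getOutput())          # output of the untrained network, in (0, 1)
print(net.getAccuracy())                    # accuracy of the untrained network on the toy data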
The main driver script
# -*- coding: utf-8 -*-
"""
Created on Sun Dec 2 14:49:31 2018

@author: congpeiqing
"""
import os

from SimpleBP import SimpleBP
from GenData import genData

if not os.path.exists('data'):
    os.makedirs('data')

# generate the training and test data
data_file = open('data/trainning_data.dat', 'w')
data_file.write(genData())
data_file.close()

data_file = open('data/test_data.dat', 'w')
data_file.write(genData())
data_file.close()

# file format: rec_id,attr1_value,attr2_value,attr3_value,class_id

# read and parse the training data
trainning_data_file = open('data/trainning_data.dat')
trainning_data = []
for line in trainning_data_file:
    line = line.strip()
    fld_list = line.split(',')
    trainning_data.append(tuple([float(field) for field in fld_list[1:]]))
trainning_data_file.close()

# read and parse the test data
test_data_file = open('data/test_data.dat')
test_data = []
for line in test_data_file:
    line = line.strip()
    fld_list = line.split(',')
    test_data.append(tuple([float(field) for field in fld_list[1:]]))
test_data_file.close()

# build a binary classification network: 3 input nodes, 10 hidden nodes, 1 output node
simple_bp = SimpleBP(3, 10, trainning_data, test_data)

# train the network
simple_bp.trainByItem()

# measure the classification accuracy
print('Accuracy : ', simple_bp.getAccuracy())
# training takes a fairly long time; the accuracy can reach 97%