This article explains how to build and use a simple artificial neural network in Python. The content is detailed but easy to follow, and the steps are straightforward, so it should serve as a useful reference. Let's take a look together.
An Artificial Neural Network (ANN) is a mathematical model that mimics the structure and function of biological neural networks. Through learning and training, it can capture complex nonlinear mappings and make adaptive decisions on input data it has never seen before. ANNs are among the most fundamental and central algorithms in artificial intelligence.
The basic structure of an ANN consists of an input layer, hidden layers, and an output layer. The input layer receives the input data, the hidden layers transform and process that data across multiple levels and dimensions, and the output layer emits the processed result. Training an ANN is an iterative process: the weights in each layer are adjusted over many passes until the network predicts and classifies its inputs correctly.
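To make the layered structure concrete, here is a minimal sketch of how a single layer maps its input to its activations. The weights and input below are made-up illustration values, not taken from the example that follows:

import numpy as np

# Hypothetical 2-input, 3-neuron layer: a = sigmoid(W x + b).
W = np.array([[0.5, -0.2],
              [0.1,  0.8],
              [-0.3, 0.4]])    # one row of weights per neuron
b = np.zeros((3, 1))           # one bias per neuron
x = np.array([[1.0], [0.0]])   # one input sample as a column vector

z = np.dot(W, x) + b           # weighted sums
a = 1 / (1 + np.exp(-z))       # sigmoid squashes each sum into (0, 1)
print(a)                       # the layer's activations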
Next, let's look at a simple artificial neural network implementation:
import numpy as np

class NeuralNetwork:
    def __init__(self, layers):
        """
        layers: list with the number of neurons in each layer, e.g. [2, 3, 1]
                means a 3-layer network with 2 neurons in the first layer,
                3 in the second, and 1 in the third.
        weights: list of weight matrices, one per connection, randomly initialized.
        biases: list of bias vectors, one per layer, initialized to 0.
        """
        self.layers = layers
        self.weights = [np.random.randn(a, b) for a, b in zip(layers[1:], layers[:-1])]
        self.biases = [np.zeros((a, 1)) for a in layers[1:]]

    def sigmoid(self, z):
        """Sigmoid activation function."""
        return 1 / (1 + np.exp(-z))

    def forward_propagation(self, a):
        """Forward pass: feed the column vector a through every layer."""
        for w, b in zip(self.weights, self.biases):
            z = np.dot(w, a) + b
            a = self.sigmoid(z)
        return a

    def backward_propagation(self, x, y):
        """Backpropagation: return the cost gradients for one (x, y) sample."""
        nabla_w = [np.zeros(w.shape) for w in self.weights]
        nabla_b = [np.zeros(b.shape) for b in self.biases]
        # Forward pass, storing every weighted input z and activation.
        a = x
        activations = [x]
        zs = []
        for w, b in zip(self.weights, self.biases):
            z = np.dot(w, a) + b
            zs.append(z)
            a = self.sigmoid(z)
            activations.append(a)
        # Output-layer error, then propagate it backwards layer by layer.
        delta = self.cost_derivative(activations[-1], y) * self.sigmoid_prime(zs[-1])
        nabla_b[-1] = delta
        nabla_w[-1] = np.dot(delta, activations[-2].transpose())
        for l in range(2, len(self.layers)):
            z = zs[-l]
            sp = self.sigmoid_prime(z)
            delta = np.dot(self.weights[-l + 1].transpose(), delta) * sp
            nabla_b[-l] = delta
            nabla_w[-l] = np.dot(delta, activations[-l - 1].transpose())
        return (nabla_w, nabla_b)

    def train(self, x_train, y_train, epochs, learning_rate):
        """Train with full-batch gradient descent for the given number of epochs."""
        for epoch in range(epochs):
            nabla_w = [np.zeros(w.shape) for w in self.weights]
            nabla_b = [np.zeros(b.shape) for b in self.biases]
            for x, y in zip(x_train, y_train):
                delta_nabla_w, delta_nabla_b = self.backward_propagation(
                    np.array([x]).transpose(), np.array([y]).transpose())
                nabla_w = [nw + dnw for nw, dnw in zip(nabla_w, delta_nabla_w)]
                nabla_b = [nb + dnb for nb, dnb in zip(nabla_b, delta_nabla_b)]
            self.weights = [w - (learning_rate / len(x_train)) * nw
                            for w, nw in zip(self.weights, nabla_w)]
            self.biases = [b - (learning_rate / len(x_train)) * nb
                           for b, nb in zip(self.biases, nabla_b)]

    def predict(self, x_test):
        """Predict: run each test sample through the network."""
        y_predictions = []
        for x in x_test:
            y_predictions.append(
                self.forward_propagation(np.array([x]).transpose())[0][0])
        return y_predictions

    def cost_derivative(self, output_activations, y):
        """Derivative of the quadratic cost."""
        return output_activations - y

    def sigmoid_prime(self, z):
        """Derivative of the sigmoid function."""
        return self.sigmoid(z) * (1 - self.sigmoid(z))
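Backpropagation implementations are easy to get subtly wrong, so it is worth sanity-checking the gradients. The following sketch is my own addition rather than part of the original example: it compares one weight gradient from backward_propagation against a central finite difference of the quadratic cost C = 0.5 * (a - y)**2 that cost_derivative implies. The two numbers should agree to several decimal places.

# Sanity-check sketch (not from the original article): compare an analytic
# gradient from backward_propagation with a numerical finite difference.
import numpy as np

np.random.seed(0)                      # fixed seed so the check is repeatable
net = NeuralNetwork([2, 3, 1])
x = np.array([[1.0], [0.0]])           # one input sample as a column vector
y = np.array([[1.0]])                  # its target output

def quadratic_cost(net, x, y):
    """The cost whose derivative is cost_derivative: C = 0.5 * (a - y)^2."""
    a = net.forward_propagation(x)
    return 0.5 * np.sum((a - y) ** 2)

nabla_w, _ = net.backward_propagation(x, y)   # analytic gradient

eps = 1e-6                             # perturb one weight by +/- eps
w_orig = net.weights[0][0, 0]
net.weights[0][0, 0] = w_orig + eps
c_plus = quadratic_cost(net, x, y)
net.weights[0][0, 0] = w_orig - eps
c_minus = quadratic_cost(net, x, y)
net.weights[0][0, 0] = w_orig          # restore the weight

numeric = (c_plus - c_minus) / (2 * eps)
print("backprop:", nabla_w[0][0, 0], "numeric:", numeric)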
The following code instantiates and exercises this simple neural network class:
# XOR truth table as training data
x_train = [[0, 0], [1, 0], [0, 1], [1, 1]]
y_train = [0, 1, 1, 0]

# Create the network: 2 inputs, 3 hidden neurons, 1 output
nn = NeuralNetwork([2, 3, 1])

# Train the network
nn.train(x_train, y_train, 10000, 0.1)

# Test the network
x_test = [[0, 0], [1, 0], [0, 1], [1, 1]]
y_test = [0, 1, 1, 0]
y_predictions = nn.predict(x_test)

print("Predictions:", y_predictions)
print("Actual:", y_test)
Output (the exact values vary from run to run because the weights are randomly initialized):
Predictions: [0.011602156431658403, 0.9852717774725432, 0.9839448924887225, 0.020026540429992387]
Actual: [0, 1, 1, 0]
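The predictions are raw sigmoid activations in (0, 1), not hard class labels. A natural follow-up, which I am adding here and which is not in the original article, is to threshold them at 0.5:

labels = [1 if p >= 0.5 else 0 for p in y_predictions]   # round at 0.5
print("Labels:", labels)   # matches y_test: [0, 1, 1, 0]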
That wraps up this article on how to use an artificial neural network in Python. Thanks for reading! Hopefully you now have a solid grasp of the topic; if you would like to learn more, follow the 億速云 industry news channel.