Python Program to Implement the Backpropagation Algorithm for an Artificial Neural Network
Exp. No. 4: Build an Artificial Neural Network by implementing the Backpropagation algorithm, and test it using appropriate data sets.
Python Program to Implement and Demonstrate the Backpropagation Algorithm
import numpy as np

# Training data: hours slept, hours studied -> expected exam percentage
X = np.array(([2, 9], [1, 5], [3, 6]), dtype=float)
y = np.array(([92], [86], [89]), dtype=float)
X = X / np.amax(X, axis=0)  # normalize each input column by its maximum
y = y / 100                 # scale outputs to [0, 1]

# Sigmoid activation function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Derivative of the sigmoid, written in terms of the sigmoid's output
def derivatives_sigmoid(x):
    return x * (1 - x)

# Variable initialization
epoch = 5                  # number of training iterations
lr = 0.1                   # learning rate
inputlayer_neurons = 2     # number of features in the data set
hiddenlayer_neurons = 3    # number of neurons in the hidden layer
output_neurons = 1         # number of neurons in the output layer

# Weight and bias initialization: uniform random values of the required shapes
wh = np.random.uniform(size=(inputlayer_neurons, hiddenlayer_neurons))
bh = np.random.uniform(size=(1, hiddenlayer_neurons))
wout = np.random.uniform(size=(hiddenlayer_neurons, output_neurons))
bout = np.random.uniform(size=(1, output_neurons))

for i in range(epoch):
    # Forward propagation
    hinp = np.dot(X, wh) + bh
    hlayer_act = sigmoid(hinp)
    outinp = np.dot(hlayer_act, wout) + bout
    output = sigmoid(outinp)

    # Backpropagation
    EO = y - output                               # error at the output layer
    outgrad = derivatives_sigmoid(output)
    d_output = EO * outgrad
    EH = d_output.dot(wout.T)                     # error propagated back to the hidden layer
    hiddengrad = derivatives_sigmoid(hlayer_act)  # how much the hidden-layer activations contributed to the error
    d_hiddenlayer = EH * hiddengrad

    # Weight updates: dot product of next-layer error and current-layer output
    wout += hlayer_act.T.dot(d_output) * lr
    wh += X.T.dot(d_hiddenlayer) * lr

    print("-----------Epoch-", i + 1, "Starts----------")
    print("Input: \n" + str(X))
    print("Actual Output: \n" + str(y))
    print("Predicted Output: \n", output)
    print("-----------Epoch-", i + 1, "Ends----------\n")

print("Input: \n" + str(X))
print("Actual Output: \n" + str(y))
print("Predicted Output: \n", output)
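One detail worth noting: the loop above updates only the weight matrices wh and wout, so the bias vectors bh and bout never move from their random initial values. A common variant of backpropagation also trains the biases by summing the deltas over the batch. Below is a minimal, self-contained sketch of that variant, using the same data and variable names as above; the bias-update lines and the 5000-epoch count are additions for illustration, not part of the original program, so its printed numbers will differ from the output shown later.

import numpy as np

X = np.array(([2, 9], [1, 5], [3, 6]), dtype=float)
y = np.array(([92], [86], [89]), dtype=float)
X = X / np.amax(X, axis=0)
y = y / 100

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def derivatives_sigmoid(x):
    return x * (1 - x)

lr = 0.1
wh = np.random.uniform(size=(2, 3))
bh = np.random.uniform(size=(1, 3))
wout = np.random.uniform(size=(3, 1))
bout = np.random.uniform(size=(1, 1))

for i in range(5000):
    # Forward pass
    hlayer_act = sigmoid(np.dot(X, wh) + bh)
    output = sigmoid(np.dot(hlayer_act, wout) + bout)

    # Backward pass
    d_output = (y - output) * derivatives_sigmoid(output)
    d_hiddenlayer = d_output.dot(wout.T) * derivatives_sigmoid(hlayer_act)

    # Update weights and, in this variant, the biases as well
    wout += hlayer_act.T.dot(d_output) * lr
    wh += X.T.dot(d_hiddenlayer) * lr
    bout += np.sum(d_output, axis=0, keepdims=True) * lr  # sum deltas over the batch
    bh += np.sum(d_hiddenlayer, axis=0, keepdims=True) * lr

print("Predicted Output:\n", output)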
Training Examples:
Example | Sleep | Study | Expected % in Exams |
1 | 2 | 9 | 92 |
2 | 1 | 5 | 86 |
3 | 3 | 6 | 89 |
Normalized input (each value divided by the maximum of its column; outputs divided by 100):
Example | Sleep | Study | Expected % in Exams |
1 | 2/3 = 0.66666667 | 9/9 = 1 | 0.92 |
2 | 1/3 = 0.33333333 | 5/9 = 0.55555556 | 0.86 |
3 | 3/3 = 1 | 6/9 = 0.66666667 | 0.89 |
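As a quick check, these normalized values come straight from the column-wise division in the program (X / np.amax(X, axis=0)); a minimal sketch:

import numpy as np

X = np.array([[2, 9], [1, 5], [3, 6]], dtype=float)
print(X / np.amax(X, axis=0))  # column maxima are [3, 9]
# [[0.66666667 1.        ]
#  [0.33333333 0.55555556]
#  [1.         0.66666667]]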
Output
-----------Epoch- 1 Starts----------
Input:
[[0.66666667 1. ]
[0.33333333 0.55555556]
[1. 0.66666667]]
Actual Output:
[[0.92]
[0.86]
[0.89]]
Predicted Output:
[[0.81951208]
[0.8007242 ]
[0.82485744]]
-----------Epoch- 1 Ends----------
-----------Epoch- 2 Starts----------
Input:
[[0.66666667 1. ]
[0.33333333 0.55555556]
[1. 0.66666667]]
Actual Output:
[[0.92]
[0.86]
[0.89]]
Predicted Output:
[[0.82033938]
[0.80153634]
[0.82568134]]
-----------Epoch- 2 Ends----------
-----------Epoch- 3 Starts----------
Input:
[[0.66666667 1. ]
[0.33333333 0.55555556]
[1. 0.66666667]]
Actual Output:
[[0.92]
[0.86]
[0.89]]
Predicted Output:
[[0.82115226]
[0.80233463]
[0.82649072]]
-----------Epoch- 3 Ends----------
-----------Epoch- 4 Starts----------
Input:
[[0.66666667 1. ]
[0.33333333 0.55555556]
[1. 0.66666667]]
Actual Output:
[[0.92]
[0.86]
[0.89]]
Predicted Output:
[[0.82195108]
[0.80311943]
[0.82728598]]
-----------Epoch- 4 Ends----------
-----------Epoch- 5 Starts----------
Input:
[[0.66666667 1. ]
[0.33333333 0.55555556]
[1. 0.66666667]]
Actual Output:
[[0.92]
[0.86]
[0.89]]
Predicted Output:
[[0.8227362 ]
[0.80389106]
[0.82806747]]
-----------Epoch- 5 Ends----------
Input:
[[0.66666667 1. ]
[0.33333333 0.55555556]
[1. 0.66666667]]
Actual Output:
[[0.92]
[0.86]
[0.89]]
Predicted Output:
[[0.8227362 ]
[0.80389106]
[0.82806747]]
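Because the program runs only 5 epochs with a learning rate of 0.1, the predictions move only slightly toward the targets from one epoch to the next (compare epoch 1 with epoch 5 above). Once trained, the learned weights can also be used to score an unseen example. A minimal sketch, meant to be appended at the end of the program above so that np, sigmoid, and the trained wh, bh, wout, and bout are in scope; the sleep/study values 2 and 7 are an illustrative choice:

# Predict the exam percentage for an unseen example
x_new = np.array([[2, 7]], dtype=float) / np.array([3.0, 9.0])  # normalize with the training maxima
h_new = sigmoid(np.dot(x_new, wh) + bh)
pred = sigmoid(np.dot(h_new, wout) + bout)
print("Predicted exam percentage:", pred[0, 0] * 100)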
Summary
This tutorial showed how to implement and demonstrate the backpropagation algorithm in Python, training a small two-input neural network on a normalized sleep/study data set.