Implementation of Perceptron Algorithm for OR Logic Gate with 2-bit Binary Input
In the field of Machine Learning, the Perceptron is a Supervised Learning Algorithm for binary classifiers. The Perceptron Model implements the following function:

$$\hat{y} = \Theta(w \cdot x + b), \qquad \Theta(v) = \begin{cases} 1 & \text{if } v \geq 0 \\ 0 & \text{otherwise} \end{cases}$$

For a particular choice of the weight vector $w$ and bias parameter $b$, the model predicts output $\hat{y}$ for the corresponding input vector $x$.
The OR logical function truth table for 2-bit binary variables, i.e. the input vector $x = (x_1, x_2)$ and the corresponding output $y$, is:

x1 | x2 | y
 0 |  0 | 0
 0 |  1 | 1
 1 |  0 | 1
 1 |  1 | 1
Now for the corresponding weight vector $w = (w_1, w_2)$ of the input vector $x = (x_1, x_2)$, the associated Perceptron Function can be defined as:

$$\hat{y} = \Theta(w_1 x_1 + w_2 x_2 + b)$$

For the implementation, the considered weight parameters are $w_1 = 1, w_2 = 1$ and the bias parameter is $b = -0.5$.
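As a quick check of this parameter choice, evaluating $w_1 x_1 + w_2 x_2 + b$ for each input gives:

$$
\begin{aligned}
(0, 0):&\; 0 + 0 - 0.5 = -0.5 < 0 &&\Rightarrow \hat{y} = 0\\
(0, 1):&\; 0 + 1 - 0.5 = \phantom{-}0.5 \geq 0 &&\Rightarrow \hat{y} = 1\\
(1, 0):&\; 1 + 0 - 0.5 = \phantom{-}0.5 \geq 0 &&\Rightarrow \hat{y} = 1\\
(1, 1):&\; 1 + 1 - 0.5 = \phantom{-}1.5 \geq 0 &&\Rightarrow \hat{y} = 1
\end{aligned}
$$

which matches the OR truth table above.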
Python Implementation:
import numpy as np

# Unit step (Heaviside) activation function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0

# Perceptron model: weighted sum of inputs plus bias, passed through the unit step
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y

# OR logic function using weights w1 = 1, w2 = 1 and bias b = -0.5
def OR_logicFunction(x):
    w = np.array([1, 1])
    b = -0.5
    return perceptronModel(x, w, b)

# Test the perceptron on the four 2-bit binary inputs
test1 = np.array([0, 1])
test2 = np.array([1, 1])
test3 = np.array([0, 0])
test4 = np.array([1, 0])

print("OR({}, {}) = {}".format(0, 1, OR_logicFunction(test1)))
print("OR({}, {}) = {}".format(1, 1, OR_logicFunction(test2)))
print("OR({}, {}) = {}".format(0, 0, OR_logicFunction(test3)))
print("OR({}, {}) = {}".format(1, 0, OR_logicFunction(test4)))
Output:
OR(0, 1) = 1
OR(1, 1) = 1
OR(0, 0) = 0
OR(1, 0) = 1
Here, the model's predicted output ($\hat{y}$) for each of the test inputs exactly matches the conventional output ($y$) of the OR logic gate according to the truth table for 2-bit binary input.
Hence, it is verified that the perceptron algorithm for the OR logic gate is correctly implemented.
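As an optional extension (not part of the original listing), the same verification can be run programmatically over all four input combinations. The loop below is a minimal sketch that compares the perceptron's prediction with Python's built-in or operator; it assumes the OR_logicFunction defined above is in scope.

from itertools import product

import numpy as np

# Compare the perceptron's output with the expected OR value for every 2-bit input.
for x1, x2 in product([0, 1], repeat=2):
    predicted = OR_logicFunction(np.array([x1, x2]))
    expected = int(x1 or x2)
    assert predicted == expected, f"Mismatch at ({x1}, {x2})"
    print("OR({}, {}) = {}".format(x1, x2, predicted))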