Implementation of Perceptron Algorithm for AND Logic Gate with 2-bit Binary Input

Last Updated : 08 Jun, 2020
In the field of Machine Learning, the Perceptron is a Supervised Learning Algorithm for binary classifiers. The Perceptron Model implements the following function:

    \[ \begin{array}{c} \hat{y}=\Theta\left(w_{1} x_{1}+w_{2} x_{2}+\ldots+w_{n} x_{n}+b\right) \\ =\Theta(\mathbf{w} \cdot \mathbf{x}+b) \\ \text { where } \Theta(v)=\left\{\begin{array}{cc} 1 & \text { if } v \geqslant 0 \\ 0 & \text { otherwise } \end{array}\right. \end{array} \]

For a particular choice of the weight vector $\boldsymbol{w}$ and bias parameter $\boldsymbol{b}$, the model predicts output $\boldsymbol{\hat{y}}$ for the corresponding input vector $\boldsymbol{x}$. The truth table of the AND logic function for 2-bit binary variables, i.e., the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}})$ and the corresponding output $\boldsymbol{y}$, is given below:
$\boldsymbol{x_{1}}$ $\boldsymbol{x_{2}}$ $\boldsymbol{y}$
0 0 0
0 1 0
1 0 0
1 1 1
Now for the corresponding weight vector $\boldsymbol{w} : (\boldsymbol{w_{1}}, \boldsymbol{w_{2}})$ of the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}})$, the associated Perceptron Function can be defined as:

    \[$\boldsymbol{\hat{y}} = \Theta\left(w_{1} x_{1}+w_{2} x_{2}+b\right)$\]

For this implementation, the weight parameters are chosen as $\boldsymbol{w_{1}} = 1$, $\boldsymbol{w_{2}} = 1$ and the bias parameter as $\boldsymbol{b} = -1.5$, so that the weighted sum reaches the threshold only when both inputs are 1. Python Implementation:
# importing Python library
import numpy as np
  
# define Unit Step Function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0
  
# design Perceptron Model
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y
  
# AND Logic Function
# w1 = 1, w2 = 1, b = -1.5
def AND_logicFunction(x):
    w = np.array([1, 1])
    b = -1.5
    return perceptronModel(x, w, b)
  
# testing the Perceptron Model
test1 = np.array([0, 1])
test2 = np.array([1, 1])
test3 = np.array([0, 0])
test4 = np.array([1, 0])
  
print("AND({}, {}) = {}".format(0, 1, AND_logicFunction(test1)))
print("AND({}, {}) = {}".format(1, 1, AND_logicFunction(test2)))
print("AND({}, {}) = {}".format(0, 0, AND_logicFunction(test3)))
print("AND({}, {}) = {}".format(1, 0, AND_logicFunction(test4)))

                    
Output:
AND(0, 1) = 0
AND(1, 1) = 1
AND(0, 0) = 0
AND(1, 0) = 0
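These outputs can be verified by evaluating the perceptron function by hand for each input pair:

    \[ \begin{aligned} \hat{y}(0,0) &= \Theta(1 \cdot 0 + 1 \cdot 0 - 1.5) = \Theta(-1.5) = 0 \\ \hat{y}(0,1) &= \Theta(1 \cdot 0 + 1 \cdot 1 - 1.5) = \Theta(-0.5) = 0 \\ \hat{y}(1,0) &= \Theta(1 \cdot 1 + 1 \cdot 0 - 1.5) = \Theta(-0.5) = 0 \\ \hat{y}(1,1) &= \Theta(1 \cdot 1 + 1 \cdot 1 - 1.5) = \Theta(0.5) = 1 \end{aligned} \]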
Here, the model's predicted output ($\boldsymbol{\hat{y}}$) for each test input exactly matches the conventional output ($\boldsymbol{y}$) of the AND logic gate according to the truth table for 2-bit binary input. Hence, it is verified that the perceptron algorithm for the AND logic gate is correctly implemented.
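The parameters above were fixed by hand rather than learned. Since the perceptron is a supervised learning algorithm, suitable weights can also be learned from the truth table itself using the classic perceptron learning rule. The following is a minimal sketch under assumed choices (zero initialization, learning rate 0.1, an epoch cap of 100), not part of the original implementation:

# a minimal sketch of learning the AND gate parameters with the
# perceptron learning rule; the learning rate, epoch cap and zero
# initialization are assumed choices, not from the article
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # all 2-bit inputs
y = np.array([0, 0, 0, 1])                      # AND truth table outputs

w = np.zeros(2)  # weight vector, initialized to zero
b = 0.0          # bias, initialized to zero
lr = 0.1         # learning rate (assumed value)

for _ in range(100):  # epoch cap; stops early once all samples are correct
    errors = 0
    for xi, target in zip(X, y):
        y_hat = 1 if np.dot(w, xi) + b >= 0 else 0  # unit step activation
        update = lr * (target - y_hat)              # perceptron update rule
        w = w + update * xi
        b = b + update
        errors += int(update != 0)
    if errors == 0:
        break

print("learned w =", w, ", b =", b)
# check the learned parameters, reusing perceptronModel defined above
for xi in X:
    print("AND({}, {}) = {}".format(xi[0], xi[1], perceptronModel(xi, w, b)))

Because the AND function is linearly separable, the perceptron convergence theorem guarantees that this loop terminates with parameters that reproduce the truth table, though they will generally differ from the hand-picked $\boldsymbol{w_{1}} = \boldsymbol{w_{2}} = 1$, $\boldsymbol{b} = -1.5$.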
