
Perceptron Algorithm for Logic Gate with 3-bit Binary Input

In the field of Machine Learning, the Perceptron is a Supervised Learning Algorithm for binary classification. The Perceptron Model implements the following function:

    \[\begin{array}{c}\hat{y}=\Theta\left(w_{1} x_{1}+w_{2} x_{2}+\ldots+w_{n} x_{n}+b\right) \\ =\Theta(\mathbf{w} \cdot \mathbf{x}+b) \\  \text { where } \Theta(v)=\left\{\begin{array}{cc} 1 & \text { if } v \geqslant 0 \\ 0 & \text { otherwise } \end{array}\right. \end{array}\]
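As a quick illustration, here is a minimal NumPy sketch of this prediction rule (the helper name predict and the 2-input example weights are illustrative, not part of the implementation later in this article):

import numpy as np

# minimal sketch of the prediction rule above
def predict(w, x, b):
    # step activation: 1 if w . x + b >= 0, otherwise 0
    return 1 if np.dot(w, x) + b >= 0 else 0

# e.g., a 2-input AND-like perceptron: fires only when both inputs are 1
print(predict([1, 1], [1, 1], -1.5))  # 1
print(predict([1, 1], [1, 0], -1.5))  # 0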

For a particular choice of the weight vector $\boldsymbol{w}$ and bias parameter $\boldsymbol{b}$, the model predicts output $\boldsymbol{\hat{y}}$ for the corresponding input vector $\boldsymbol{x}$. The truth table of the AND, OR, NAND, and NOR gates for 3-bit binary variables, i.e., the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}}, \boldsymbol{x_{3}})$ and the corresponding outputs $\boldsymbol{y_{AND}}, \boldsymbol{y_{OR}}, \boldsymbol{y_{NAND}}, \boldsymbol{y_{NOR}}$, is given below:
| $\boldsymbol{x_{1}}$ | $\boldsymbol{x_{2}}$ | $\boldsymbol{x_{3}}$ | $\boldsymbol{y_{AND}}$ | $\boldsymbol{y_{OR}}$ | $\boldsymbol{y_{NAND}}$ | $\boldsymbol{y_{NOR}}$ |
|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 1 | 1 |
| 0 | 0 | 1 | 0 | 1 | 1 | 0 |
| 0 | 1 | 0 | 0 | 1 | 1 | 0 |
| 0 | 1 | 1 | 0 | 1 | 1 | 0 |
| 1 | 0 | 0 | 0 | 1 | 1 | 0 |
| 1 | 0 | 1 | 0 | 1 | 1 | 0 |
| 1 | 1 | 0 | 0 | 1 | 1 | 0 |
| 1 | 1 | 1 | 1 | 1 | 0 | 0 |
Now for the corresponding weight vector $\boldsymbol{w} : (\boldsymbol{w_{1}}, \boldsymbol{w_{2}}, \boldsymbol{w_{3}})$ of the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}}, \boldsymbol{x_{3}})$, the associated Perceptron Function can be defined as:

    \[\hat{y} = \Theta\left(w_{1} x_{1}+w_{2} x_{2}+w_{3} x_{3}+b\right)\]

For the implementation, the considered weight parameters are $\boldsymbol{w_{1}}, \boldsymbol{w_{2}}, \boldsymbol{w_{3}}$ and the bias parameter is $\boldsymbol{b}$ for each logic gate. The bias values are chosen so that no input vector lands exactly on the decision boundary $v = 0$:
| $\boldsymbol{Parameters}$ | $\boldsymbol{AND}$ | $\boldsymbol{OR}$ | $\boldsymbol{NAND}$ | $\boldsymbol{NOR}$ |
|---|---|---|---|---|
| $\boldsymbol{w_{1}}$ | 1 | 1 | -1 | -1 |
| $\boldsymbol{w_{2}}$ | 1 | 1 | -1 | -1 |
| $\boldsymbol{w_{3}}$ | 1 | 1 | -1 | -1 |
| $\boldsymbol{b}$ | -2.5 | -0.9 | 2.5 | 0.5 |
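For example, with the AND-gate parameters, the inputs $(1, 1, 1)$ and $(1, 1, 0)$ give:

    \[\hat{y}=\Theta(1 \cdot 1+1 \cdot 1+1 \cdot 1-2.5)=\Theta(0.5)=1\]

    \[\hat{y}=\Theta(1 \cdot 1+1 \cdot 1+1 \cdot 0-2.5)=\Theta(-0.5)=0\]

which match $\boldsymbol{y_{AND}}$ in the truth table above.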
Python Implementation:
Note that the code below applies a sigmoid activation and rounds the result to 0 or 1; since no input falls exactly on the decision boundary for the parameters above, this behaves exactly like the step function $\Theta$.
# importing python library
import numpy as np

# sigmoid activation function
def activationFunction(model):
    return 1 / (1 + np.exp(-model))

# designing perceptron model: weighted sum plus bias, passed
# through the sigmoid and rounded to 0 or 1
def perceptronModel(weights, inputs, bias):
    model = np.dot(inputs, weights) + bias
    logic = activationFunction(model)
    return np.round(logic)

# computation model: applies the perceptron to every row of the data
def compute(data, weights, bias):
    weights = np.array(weights)
    output = np.array([perceptronModel(weights, datum, bias)
                       for datum in data])
    return output
  
# Print Output as a truth table
def printOutput(dataset, name, data):
    print("Logic Function: {}".format(name.upper()))
    print("X1\tX2\tX3\tY")
    for datum, output in zip(dataset, data):
        print("{1}\t{2}\t{3}\t{0}".format(output, *datum))
  
# main function
def main():
    # all 8 combinations of 3-bit binary input
    dataset = np.array([
        [0, 0, 0],
        [0, 0, 1],
        [0, 1, 0],
        [0, 1, 1],
        [1, 0, 0],
        [1, 0, 1],
        [1, 1, 0],
        [1, 1, 1]
    ])

    # Parameters of every Logic Gate
    # weight parameters: w1, w2, w3
    # bias parameter: b
    logicGate = {
        "and": compute(dataset, [1, 1, 1], -2.5),
        "or": compute(dataset, [1, 1, 1], -0.9),
        "nand": compute(dataset, [-1, -1, -1], 2.5),
        "nor": compute(dataset, [-1, -1, -1], 0.5)
    }
    for gate in logicGate:
        printOutput(dataset, gate, logicGate[gate])

if __name__ == '__main__':
    main()

Output:
Logic Function: AND
X1    X2    X3    Y
0    0    0    0.0
0    0    1    0.0
0    1    0    0.0
0    1    1    0.0
1    0    0    0.0
1    0    1    0.0
1    1    0    0.0
1    1    1    1.0
Logic Function: OR
X1    X2    X3    Y
0    0    0    0.0
0    0    1    1.0
0    1    0    1.0
0    1    1    1.0
1    0    0    1.0
1    0    1    1.0
1    1    0    1.0
1    1    1    1.0
Logic Function: NAND
X1    X2    X3    Y
0    0    0    1.0
0    0    1    1.0
0    1    0    1.0
0    1    1    1.0
1    0    0    1.0
1    0    1    1.0
1    1    0    1.0
1    1    1    0.0
Logic Function: NOR
X1    X2    X3    Y
0    0    0    1.0
0    0    1    0.0
0    1    0    0.0
0    1    1    0.0
1    0    0    0.0
1    0    1    0.0
1    1    0    0.0
1    1    1    0.0
Here, the model's predicted output ($\boldsymbol{\hat{y}}$) for each test input exactly matches the conventional output ($\boldsymbol{y}$) of the AND, OR, NAND, and NOR gates according to the truth table for 3-bit binary input. Hence, it is verified that the perceptron algorithm for all of these logic gates is correctly implemented.
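As an optional sanity check, one could also compare the perceptron predictions against NumPy's exact logical operations. The sketch below is illustrative and assumes the compute() function from the listing above is in scope:

# hypothetical self-check: compare perceptron predictions with
# NumPy's exact logical operations on the same 3-bit inputs
dataset = np.array([[x1, x2, x3] for x1 in (0, 1)
                    for x2 in (0, 1) for x3 in (0, 1)])
truth = {
    "and": np.logical_and.reduce(dataset, axis=1).astype(float),
    "or": np.logical_or.reduce(dataset, axis=1).astype(float),
    "nand": 1.0 - np.logical_and.reduce(dataset, axis=1),
    "nor": 1.0 - np.logical_or.reduce(dataset, axis=1),
}
params = {
    "and": ([1, 1, 1], -2.5),
    "or": ([1, 1, 1], -0.9),
    "nand": ([-1, -1, -1], 2.5),
    "nor": ([-1, -1, -1], 0.5),
}
for name, (w, b) in params.items():
    assert np.array_equal(compute(dataset, w, b), truth[name]), name
print("All four gates match the truth table.")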

Last Updated : 18 Aug, 2020