Implementation of Perceptron Algorithm for NOT Logic Gate

In the field of Machine Learning, the Perceptron is a Supervised Learning Algorithm for binary classifiers. The Perceptron Model implements the following function:

    \[ \begin{array}{c} \hat{y}=\Theta\left(w_{1} x_{1}+w_{2} x_{2}+\ldots+w_{n} x_{n}+b\right) \\ =\Theta(\mathbf{w} \cdot \mathbf{x}+b) \\ \text { where } \Theta(v)=\left\{\begin{array}{cc} 1 & \text { if } v \geqslant 0 \\ 0 & \text { otherwise } \end{array}\right. \end{array} \]

For a particular choice of the weight vector $\boldsymbol{w}$ and bias parameter $\boldsymbol{b}$, the model predicts output $\boldsymbol{\hat{y}}$ for the corresponding input vector $\boldsymbol{x}$.

The truth table of the NOT logic function has only a 1-bit binary input (0 or 1), i.e., the input vector $\boldsymbol{x}$ and the corresponding output $\boldsymbol{y}$ are:

    \[ \begin{array}{c|c} \boldsymbol{x} & \boldsymbol{y} \\ \hline 0 & 1 \\ 1 & 0 \end{array} \]

Now, for the weight parameter $\boldsymbol{w}$ corresponding to the 1-bit input $\boldsymbol{x}$, the associated Perceptron Function can be defined as:

    \[ \hat{y}=\Theta(w x+b) \]


For the implementation, the chosen weight parameter is $\boldsymbol{w} = -1$ and the bias parameter is $\boldsymbol{b} = 0.5$.
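Substituting both possible inputs confirms that this choice of parameters reproduces the NOT truth table:

    \[ \begin{array}{l} x=0: \quad \Theta(-1 \cdot 0+0.5)=\Theta(0.5)=1 \\ x=1: \quad \Theta(-1 \cdot 1+0.5)=\Theta(-0.5)=0 \end{array} \]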

Python Implementation:

# importing Python library
import numpy as np

# define Unit Step Function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0

# design Perceptron Model
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y

# NOT Logic Function
# w = -1, b = 0.5
def NOT_logicFunction(x):
    w = -1
    b = 0.5
    return perceptronModel(x, w, b)

# testing the Perceptron Model
test1 = np.array(1)
test2 = np.array(0)

print("NOT({}) = {}".format(1, NOT_logicFunction(test1)))
print("NOT({}) = {}".format(0, NOT_logicFunction(test2)))


Output:

NOT(1) = 0
NOT(0) = 1

Here, the model's predicted output ($\boldsymbol{\hat{y}}$) for each test input exactly matches the conventional NOT logic gate output ($\boldsymbol{y}$) given in the truth table.
Hence, it is verified that the perceptron model for the NOT logic gate is correctly implemented.
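
Note that the weight and bias above were fixed by hand. As a minimal sketch (not part of the original implementation), the perceptron learning rule could also find suitable parameters automatically from the truth table; the learning rate, zero initialization, and epoch count below are illustrative assumptions:

# importing Python library
import numpy as np

# unit step activation, same as above
def unitStep(v):
    return 1 if v >= 0 else 0

# train a perceptron on the NOT truth table using the
# perceptron learning rule; lr, initial values, and epochs
# are assumptions chosen for illustration
def train_NOT_perceptron(lr=0.1, epochs=10):
    X = np.array([0, 1])   # inputs from the truth table
    Y = np.array([1, 0])   # desired NOT outputs
    w, b = 0.0, 0.0        # start from zero weight and bias
    for _ in range(epochs):
        for x, y in zip(X, Y):
            y_hat = unitStep(w * x + b)
            error = y - y_hat
            # update rule: w <- w + lr*error*x, b <- b + lr*error
            w += lr * error * x
            b += lr * error
    return w, b

w, b = train_NOT_perceptron()
print("learned w = {}, b = {}".format(w, b))
print("NOT(0) = {}".format(unitStep(w * 0 + b)))
print("NOT(1) = {}".format(unitStep(w * 1 + b)))

With these settings the rule converges to a small negative weight and a non-negative bias (e.g. $w = -0.1$, $b = 0.0$), which classifies both rows of the truth table correctly, just as the hand-picked $w = -1$, $b = 0.5$ does.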


Last Updated : 08 Jun, 2020