In the field of machine learning, the perceptron is a supervised learning algorithm for binary classifiers. The perceptron model implements the following function:
\[
\hat{y} = \Theta\left(w_{1} x_{1} + w_{2} x_{2} + \ldots + w_{n} x_{n} + b\right) = \Theta(\mathbf{w} \cdot \mathbf{x} + b),
\quad \text{where } \Theta(v) = \begin{cases} 1 & \text{if } v \geq 0 \\ 0 & \text{otherwise} \end{cases}
\]
For a particular choice of the weight vector $\mathbf{w}$ and bias parameter $b$, the model predicts the output $\hat{y}$ for the corresponding input vector $\mathbf{x}$.
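As a minimal sketch of this definition (the names predict, w, and b here are illustrative, not part of the implementation given later), the prediction rule translates directly into NumPy for an n-dimensional input:

import numpy as np

# Illustrative transcription of y_hat = theta(w . x + b)
# with the unit step as the activation theta.
def predict(x, w, b):
    v = np.dot(w, x) + b          # weighted sum plus bias
    return 1 if v >= 0 else 0     # unit step activation

The concrete two-input version used for the OR gate appears in the implementation below.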
The OR logical function truth table for 2-bit binary variables, i.e., the input vector $\mathbf{x} = (x_1, x_2)$ and the corresponding output $y$, is:

$x_1$   $x_2$   $y$
0       0       0
0       1       1
1       0       1
1       1       1
Now, for the corresponding weight vector $\mathbf{w} = (w_1, w_2)$ of the input vector $\mathbf{x} = (x_1, x_2)$, the associated perceptron function can be defined as:
\[
\hat{y} = \Theta\left(w_{1} x_{1} + w_{2} x_{2} + b\right)
\]

For the implementation, the considered weight parameters are $w_1 = 1, w_2 = 1$ and the bias parameter is $b = -0.5$.
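As a quick sanity check of these parameters, evaluating the perceptron function by hand for the inputs $(x_1, x_2) = (0, 1)$ and $(0, 0)$ gives:

\[
\hat{y} = \Theta(1 \cdot 0 + 1 \cdot 1 - 0.5) = \Theta(0.5) = 1,
\qquad
\hat{y} = \Theta(1 \cdot 0 + 1 \cdot 0 - 0.5) = \Theta(-0.5) = 0
\]

which agrees with the OR truth table above.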
Python Implementation:
import numpy as np

# Unit step (Heaviside) activation function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0

# Perceptron model: y = unitStep(w . x + b)
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y

# OR logic gate realized with weights w = [1, 1] and bias b = -0.5
def OR_logicFunction(x):
    w = np.array([1, 1])
    b = -0.5
    return perceptronModel(x, w, b)

# Test the perceptron on all four 2-bit inputs
test1 = np.array([0, 1])
test2 = np.array([1, 1])
test3 = np.array([0, 0])
test4 = np.array([1, 0])

print("OR({}, {}) = {}".format(0, 1, OR_logicFunction(test1)))
print("OR({}, {}) = {}".format(1, 1, OR_logicFunction(test2)))
print("OR({}, {}) = {}".format(0, 0, OR_logicFunction(test3)))
print("OR({}, {}) = {}".format(1, 0, OR_logicFunction(test4)))
Output:
OR(0, 1) = 1
OR(1, 1) = 1
OR(0, 0) = 0
OR(1, 0) = 1
Here, the model's predicted output ($\hat{y}$) for each of the test inputs exactly matches the conventional OR logic gate output ($y$) given by the truth table for 2-bit binary input.
Hence, it is verified that the perceptron algorithm for the OR logic gate is correctly implemented.
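The same verification can be done programmatically. The following is a minimal sketch (the inputs and expected lists and the loop are illustrative additions, assuming the implementation above has already been run) that compares the perceptron's prediction with the truth-table output for every 2-bit input:

import numpy as np

# Illustrative check: compare OR_logicFunction (defined above) against the
# expected OR truth-table outputs for all four 2-bit inputs.
inputs = [np.array([0, 0]), np.array([0, 1]), np.array([1, 0]), np.array([1, 1])]
expected = [0, 1, 1, 1]

for x, y in zip(inputs, expected):
    assert OR_logicFunction(x) == y, "mismatch for input {}".format(x)
print("All four OR inputs verified")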