In the field of machine learning, the perceptron is a supervised learning algorithm for binary classifiers. The perceptron model implements the following function:

ŷ = Θ(w · x + b), where Θ(v) = 1 if v ≥ 0, and 0 otherwise (the unit step function).

For a particular choice of the weight vector **w** and bias parameter **b**, the model predicts output **ŷ** for the corresponding input vector **x**.

**NOR** logical function truth table for **2-bit binary variables**, i.e., the input vector **x** = (x1, x2) and the corresponding output **y**:

| x1 | x2 | y |
|----|----|---|
| 0  | 0  | 1 |
| 0  | 1  | 0 |
| 1  | 0  | 0 |
| 1  | 1  | 0 |

We can observe that NOR(x1, x2) = NOT(OR(x1, x2)), so the NOR gate can be built by feeding the output of an OR perceptron into a NOT perceptron.
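This identity can be sanity-checked directly with plain Boolean operators, before any perceptron is involved (a quick sketch; the helper name `nor` is just for illustration):

```python
def nor(x1, x2):
    # plain Boolean NOR via the NOT(OR(...)) identity, no perceptron involved
    return int(not (x1 or x2))

# enumerate all four 2-bit inputs and compare against the truth table
for x1 in (0, 1):
    for x2 in (0, 1):
        print("NOR({}, {}) = {}".format(x1, x2, nor(x1, x2)))
```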

Now, for the corresponding weight vector **w** = (w1, w2) of the input vector **x** = (x1, x2) to the OR node, the associated perceptron function can be defined as:

ŷ1 = Θ(w1·x1 + w2·x2 + bOR)

Later on, the output ŷ1 of the OR node is the input to the NOT node with weight wNOT. The corresponding output ŷ2 is then the final output of the NOR logic function, and the associated perceptron function can be defined as:

ŷ2 = Θ(wNOT·ŷ1 + bNOT)

For the implementation, the considered weight parameters are w1 = 1, w2 = 1, wNOT = -1, and the bias parameters are bOR = -0.5, bNOT = 0.5.

**Python Implementation:**

```python
# importing Python library
import numpy as np

# define Unit Step Function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0

# design Perceptron Model
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y

# NOT Logic Function
# wNOT = -1, bNOT = 0.5
def NOT_logicFunction(x):
    wNOT = -1
    bNOT = 0.5
    return perceptronModel(x, wNOT, bNOT)

# OR Logic Function
# w1 = 1, w2 = 1, bOR = -0.5
def OR_logicFunction(x):
    w = np.array([1, 1])
    bOR = -0.5
    return perceptronModel(x, w, bOR)

# NOR Logic Function
# with OR and NOT
# function calls in sequence
def NOR_logicFunction(x):
    output_OR = OR_logicFunction(x)
    output_NOT = NOT_logicFunction(output_OR)
    return output_NOT

# testing the Perceptron Model
test1 = np.array([0, 1])
test2 = np.array([1, 1])
test3 = np.array([0, 0])
test4 = np.array([1, 0])

print("NOR({}, {}) = {}".format(0, 1, NOR_logicFunction(test1)))
print("NOR({}, {}) = {}".format(1, 1, NOR_logicFunction(test2)))
print("NOR({}, {}) = {}".format(0, 0, NOR_logicFunction(test3)))
print("NOR({}, {}) = {}".format(1, 0, NOR_logicFunction(test4)))
```

**Output:**

```
NOR(0, 1) = 0
NOR(1, 1) = 0
NOR(0, 0) = 1
NOR(1, 0) = 0
```

Here, the model-predicted outputs (ŷ) for each of the test inputs exactly match the conventional NOR logic gate outputs (y) according to the truth table for 2-bit binary input.

Hence, it is verified that the perceptron algorithm for the NOR logic gate is correctly implemented.
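As a side note, NOR is itself linearly separable, so a single perceptron can compute it directly, without the OR-to-NOT composition used above. A minimal sketch (the weights w = (-1, -1) and bias b = 0.5 are one valid choice, not the only one; the function names here are illustrative):

```python
import numpy as np

def unit_step(v):
    # 1 if the weighted sum is non-negative, else 0
    return 1 if v >= 0 else 0

def NOR_single_perceptron(x):
    # one perceptron: w = (-1, -1), b = 0.5 separates (0, 0) from the other inputs
    w = np.array([-1, -1])
    b = 0.5
    return unit_step(np.dot(w, x) + b)

# check all four rows of the truth table
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print("NOR({}, {}) = {}".format(x[0], x[1], NOR_single_perceptron(np.array(x))))
```

Only for (0, 0) is the weighted sum non-negative (0.5), so only that input produces 1, matching the truth table with one node instead of two.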