What is SoftmaxLayer in PyBrain?

Last Updated : 21 Feb, 2022

SoftmaxLayer applies the softmax distribution to its input. We can build a network with input, hidden, and output layers using the buildNetwork() function, passing SoftmaxLayer as the hidden class, and then use that network to learn values from the AND and NOR truth tables. Below is the syntax to import SoftmaxLayer and use it in code.

Syntax: 

Import SoftmaxLayer: from pybrain.structure import SoftmaxLayer

Usage in Python code: net = buildNetwork(1, 2, 1, bias=True, hiddenclass=SoftmaxLayer)
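To see what SoftmaxLayer computes, recall that the softmax function turns a vector of raw activations into a probability distribution: each output is the exponential of its input divided by the sum of all the exponentials. A minimal plain-Python sketch of this computation (independent of PyBrain, for illustration only):

```python
import math

def softmax(xs):
    # subtract the max for numerical stability (a standard trick)
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
# the outputs sum to 1, and larger inputs get larger shares
print(probs)
```

Because the outputs always sum to 1, a layer of softmax units behaves like a probability distribution over its units, which is what distinguishes it from a plain sigmoid hidden layer.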

Example 1: 

  • In this example, we import SoftmaxLayer using the import command and create the network using buildNetwork() with input, hidden, and output layers.
  • We use SoftmaxLayer as the hidden class, then declare the input and output sizes of the datasets using SupervisedDataSet().
  • We add sample data for the AND table and the NOR table.
  • We then train this network using BackpropTrainer().
  • Training runs for 1500 iterations; testing then starts, and we can see the average error, max error, median error, etc.
  • In this case, the samples we have taken for the AND table are ((0, 0), (1,)) and ((0, 1), (0,)), and for the NOR table are ((0, 0), (1,)) and ((0, 1), (0,)).

Python3
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import SoftmaxLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
  
# two inputs, three hidden units, and a single output,
# with SoftmaxLayer as the hidden class
net = buildNetwork(2, 3, 1, bias=True, hiddenclass=SoftmaxLayer)
  
# size of inputs and outputs
gate_set = SupervisedDataSet(2, 1)
test_dataset = SupervisedDataSet(2, 1)
  
# AND truth table
gate_set.addSample((0, 0), (1,))
gate_set.addSample((0, 1), (0,))
  
# NOR truth table
test_dataset.addSample((0, 0), (1,))
test_dataset.addSample((0, 1), (0,))
  
# train the network on net and gate_set
backpr_tr = BackpropTrainer(net, gate_set)
  
# train for 1500 iterations
for i in range(1500):
    backpr_tr.train()
      
# test on the NOR dataset and print per-sample results
backpr_tr.testOnData(dataset=test_dataset, verbose=True)
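testOnData() prints each sample's output alongside summary error statistics. Those statistics are averages of per-sample errors; a minimal plain-Python sketch of one common such metric, mean squared error, using hypothetical prediction values (this is an illustration, not PyBrain's internal code):

```python
def mean_squared_error(predictions, targets):
    # average squared difference between network outputs and targets
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

# hypothetical network outputs against the test targets (1, 0)
preds = [0.9, 0.1]
targets = [1.0, 0.0]
err = mean_squared_error(preds, targets)  # approximately 0.01
print(err)
```

The closer the trained network's outputs are to the targets, the smaller this average error becomes, which is why it is useful as the summary number to watch during testing.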


Output:

Example 2:

In this example, the samples we have taken for the AND table are ((0, 0), (0,)) and ((0, 1), (1,)), and for the NOR table are ((0, 0), (1,)) and ((0, 1), (1,)). Training then runs for 1500 iterations, and finally testing starts. We can see the testing output with the average error, max error, median error, etc.

Python3
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import SoftmaxLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
  
# two inputs, three hidden units, and a single output,
# with SoftmaxLayer as the hidden class
net = buildNetwork(2, 3, 1, bias=True, hiddenclass=SoftmaxLayer)
  
# size of inputs and outputs
gate_set = SupervisedDataSet(2, 1)
test_dataset = SupervisedDataSet(2, 1)
  
# AND truth table
gate_set.addSample((0, 0), (0,))
gate_set.addSample((0, 1), (1,))
  
# NOR truth table
test_dataset.addSample((0, 0), (1,))
test_dataset.addSample((0, 1), (1,))
  
# train the network on net and gate_set
backpr_tr = BackpropTrainer(net, gate_set)
  
# train for 1500 iterations
for i in range(1500):
    backpr_tr.train()
      
# test on the NOR dataset and print per-sample results
backpr_tr.testOnData(dataset=test_dataset, verbose=True)


Output:


