A neural network is conceptually modeled on the neurons of the brain. Neurons are the basic units of a larger neural network: a single neuron produces a single output from the inputs it receives.
In a neural network, an artificial neuron receives several inputs, and a weight is associated with each input. The weight controls the steepness of the activation function; that is, the weights decide how quickly the activation function will trigger, whereas the bias is used to shift (delay) the point at which the activation function triggers.
For a typical neuron with inputs x1, x2, and x3, the synaptic weights applied to them are denoted w1, w2, and w3, and the neuron computes
y = f(x) = Σ xi*wi
where i runs from 1 to the number of inputs.
The weight reflects the effectiveness of a particular input: the larger an input's weight, the greater its impact on the network.
The bias, on the other hand, is like the intercept in a linear equation. It is an additional parameter in the neural network that adjusts the output alongside the weighted sum of the inputs to the neuron. The bias is therefore a constant that helps the model fit the given data as well as possible.
The processing done by a neuron is thus denoted as:
output = sum (weights * inputs) + bias
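As a sketch, the computation above can be written directly in Python (the function name `neuron_output` is illustrative, not a library function):

```python
# Minimal artificial neuron: weighted sum of the inputs plus a bias.
def neuron_output(inputs, weights, bias):
    return sum(x * w for x, w in zip(inputs, weights)) + bias

# Two inputs with weights 2 and 2, no bias:
print(neuron_output([1, 2], [2, 2], 0))  # 6
```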
Need for bias
In the figure above (a straight line),
y = mx + c
where m is the weight and c is the bias.
Now suppose c were absent; the line would then be forced to pass through the origin.
In the absence of a bias, the model can only fit lines passing through the origin, which does not match real-world data. Introducing a bias makes the model more flexible.
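A minimal sketch of this point, treating the slope m as the weight and the intercept c as the bias:

```python
# y = m*x + c: m plays the role of the weight, c of the bias.
def line(x, m, c=0.0):
    return m * x + c

print(line(0, 2))     # 0 -- without c, the line is pinned to the origin
print(line(0, 2, 3))  # 3 -- the bias shifts the line away from the origin
```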
Suppose an activation function act() that triggers when its input is greater than 0, and let
input1 = 1
weight1 = 2
input2 = 2
weight2 = 2
output = input1*weight1 + input2*weight2
output = 6
so act(output) = act(6) = 1.
Now introduce a bias into the output:
bias = -6
output + bias = 0
act(0) = 0
so the activation function will not trigger.
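The worked example above can be run as a short sketch, with act() written as a step function:

```python
# Step activation: triggers (returns 1) only for inputs greater than 0.
def act(x):
    return 1 if x > 0 else 0

output = 1 * 2 + 2 * 2     # weighted sum = 6
print(act(output))         # 1 -> triggers
print(act(output + (-6)))  # 0 -> bias of -6 prevents triggering
```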
Change in weight
In the graph, the weights were changed as follows:
- weight W1 changed from 1.0 to 4.0
- weight W2 changed from -0.5 to 1.5
As the weight increases, the steepness of the activation curve increases.
Therefore it can be inferred that
the larger the weight, the earlier the activation function will trigger.
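The graphs themselves are not reproduced here, but the effect can be sketched with a sigmoid activation (an assumption; the article does not name the activation used in its plots):

```python
import math

def sigmoid(x):
    # Standard logistic sigmoid activation.
    return 1 / (1 + math.exp(-x))

# A larger weight makes the curve steeper around x = 0:
for w in (1.0, 4.0):
    print(w, [round(sigmoid(w * x), 3) for x in (-1, 0, 1)])
# w = 1.0 -> [0.269, 0.5, 0.731]
# w = 4.0 -> [0.018, 0.5, 0.982]
```

With w = 4.0 the output is already near 0 or 1 just one unit away from the origin, i.e. the function "triggers" much earlier.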
Change in bias
In the graph below, the bias was changed from -1.0 to -5.0.
The change in bias increases the input value at which the activation function triggers.
Therefore it can be inferred from the graph that
the bias helps control the input value at which the activation function will trigger.
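This shift can likewise be sketched with a sigmoid activation (again an assumption about the plotted function), holding the weight fixed and varying only the bias:

```python
import math

def sigmoid(x):
    # Standard logistic sigmoid activation.
    return 1 / (1 + math.exp(-x))

# With the weight fixed at 1.0, making the bias more negative shifts the
# curve to the right: a larger input is needed before the output
# crosses 0.5 (the "trigger" point).
for b in (-1.0, -5.0):
    print(b, [round(sigmoid(1.0 * x + b), 3) for x in (1, 5)])
# b = -1.0 -> [0.5, 0.982]  (output reaches 0.5 already at x = 1)
# b = -5.0 -> [0.018, 0.5]  (output reaches 0.5 only at x = 5)
```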