# Implementation of Logic Gates using McCulloch-Pitts Model | Neural Networks

Hello everyone, and welcome back to the Neural Network lectures! In this lecture, we are going to discuss how to implement various logic gates using the McCulloch-Pitts model. Before going on to that, I hope everyone is familiar with the nonlinear model of a neuron, so let us take a quick glance at it.

In the nonlinear model of a neuron, we have the input nodes X1, X2, …, Xn, each connected through a weight Wk1, Wk2, …, Wkn (I hope everyone is familiar with the notation). A linear combiner then produces

Vk = Σ_{j=1}^{n} Xj·Wkj,

and this goes into an activation function to give the output Yk = φ(Vk). Now, this model becomes a McCulloch-Pitts model where we

give it a bias. Let the bias be θk. Then Vk ultimately becomes

Vk = Σ_{j=1}^{n} Xj·Wkj + θk,

and therefore the output will be

Yk = φ(Vk) = φ(Σ_{j=1}^{n} Xj·Wkj + θk).

Instead of leaving θk as a separate term, I can supply it as an input: add another source node, fix its value at X0 = 1, connect it to the summing junction, and set the weight of that connection to θk. Another condition for this to be a McCulloch-Pitts model is that the activation function must be a threshold function. That is, φ(Vk) should be 1 when Vk is greater than or equal to 0, and 0 when Vk is less than 0. I hope everyone is familiar with this.

Now we will try to implement the logic AND gate. First we need to draw the truth table for the AND gate:

| X1 | X2 | Y1 |
|----|----|----|
| 0  | 0  | 0  |
| 0  | 1  | 0  |
| 1  | 0  | 0  |
| 1  | 1  | 1  |
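The neuron defined above can be sketched as a small Python helper (the function name `mcp_neuron` is my own, not from the lecture):

```python
# Sketch of the McCulloch-Pitts neuron described above.
# The activation is the threshold function: phi(V) = 1 if V >= 0, else 0.

def mcp_neuron(inputs, weights, theta):
    """Return Yk = phi(Vk), where Vk = sum_j Xj*Wkj + theta_k.

    Equivalently, theta could be folded in as the weight of an
    extra constant input X0 = 1, as described in the lecture.
    """
    vk = sum(x * w for x, w in zip(inputs, weights)) + theta
    return 1 if vk >= 0 else 0
```

For example, `mcp_neuron([x1, x2], [w11, w12], theta1)` evaluates one row of a two-input gate; choosing the weights and bias is exactly what the lecture does next.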

Right, so we know that Vk = Σ_{j=1}^{n} Xj·Wkj + θk. In this particular case we have only two inputs, so n = 2 and the sum becomes

V1 = X1·W11 + X2·W12 + θ1.

We also have only one output neuron, therefore k = 1: the inputs are X1 and X2 and the output is Y1. According to the threshold function, Yk = φ(Vk) will be 1 when Vk is greater than or equal to 0, and 0 when Vk is less than 0. So what we need to do is adjust the bias θ1 in such a way that only in the last row of the truth table (X1 = 1, X2 = 1) will Vk be greater than or equal to 0, while in all the other rows Vk will be less than 0. So let us try to

implement that. Let us start with row 1 of the truth table. Before that, for initialization, take W11 = 1, W12 = 1, and θ1 = 1. This is a random assumption, and you can take whatever values you like, but it will be easy if we take W11 = 1 and W12 = 1, as you will see. Now, for row 1,

V1 = X1·W11 + X2·W12 + θ1 = 0×1 + 0×1 + 1 = 1,

which is greater than or equal to 0. Therefore we got V1 ≥ 0, which means the output will be 1. But we need the output to be 0, therefore we need to update the value of θ1. Let us take θ1 = −1. Then, in similar fashion, V1 = 0×1 + 0×1 + (−1) = −1, which is less than 0. Therefore V1 < 0 and the output Y1 = 0, as required by row 1. The same check is then repeated for the remaining rows, updating θ1 whenever a row gives the wrong output, until every row of the truth table is satisfied.
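The full check over all four rows can be sketched in Python. The transcript stops before the remaining rows are verified, so the bias below is my own choice, not the lecture's final answer: with W11 = W12 = 1 and the φ(V) = 1 when V ≥ 0 convention, θ1 = −2 makes V1 non-negative only for the (1, 1) row.

```python
# Sketch: McCulloch-Pitts AND gate checked against the full truth table.
# Assumption (the lecture transcript cuts off before settling the bias):
# theta1 = -2, which with W11 = W12 = 1 gives V1 >= 0 only when X1 = X2 = 1.

def and_gate(x1, x2, w11=1, w12=1, theta1=-2):
    """Y1 = phi(V1), where V1 = X1*W11 + X2*W12 + theta1."""
    v1 = x1 * w11 + x2 * w12 + theta1
    return 1 if v1 >= 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"{x1} AND {x2} -> {and_gate(x1, x2)}")
# prints outputs 0, 0, 0, 1 for the four rows, matching the AND truth table
```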
