Disclaimer: I am not qualified to teach you. But while you're here, let's learn something together.
What is XOR?
XOR, or exclusive-OR, is a logical operation that returns true when exactly one of its inputs is true (considering only the two-input case here).
https://en.wikipedia.org/wiki/Exclusive_or
Behold my bad handwriting : )
It's difficult to teach a computer XOR because its outputs are not linearly separable: no single straight line through the input space can split the 1s from the 0s. However, now that we have the power of Machine Learning, we are going to do exactly that.
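Just in case, here is the same truth table as a tiny plain-JavaScript snippet (nothing TensorFlow-specific, only for reference):

// XOR truth table: the output is 1 only when exactly one input is 1
[[0, 0], [0, 1], [1, 0], [1, 1]].forEach(([a, b]) => {
  console.log(`${a} XOR ${b} = ${a ^ b}`);   // prints 0, 1, 1, 0
});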
Visualize the Neural Network using TensorFlow Playground
I really enjoy trying out different Neural Networks (NNs) on https://playground.tensorflow.org
Here is the NN I use for XOR: playground.tensorflow.org
Bring out the power of Tensorflow.js
(Find the full code on github)
Here we are building a 2-layer NN: a hidden layer with 4 units and an output layer with 1 unit (the output should be a number between 0 and 1).
Get TensorFlow.js
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.11.6"></script>
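A minimal page sketch to load it, assuming the rest of the code lives in a file called xor.js and that a canvas with id "canvas" is used for the picture at the end (both names are just examples; the CDN script exposes the library as a global tf object):

<!DOCTYPE html>
<html>
  <head>
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.11.6"></script>
  </head>
  <body>
    <!-- the canvas used later to visualize predictions -->
    <canvas id="canvas" width="200" height="200"></canvas>
    <!-- all of the TensorFlow.js code below goes in here -->
    <script src="xor.js"></script>
  </body>
</html>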
Input:
const train_xs = tf.tensor2d([
  [0, 0],
  [0, 1],
  [1, 0],
  [1, 1]
]);
Output:
const train_ys = tf.tensor2d([0, 1, 1, 0], [4, 1]);
These will stay the same :) We'll use a sequential model, which means the output of one layer is fed into the input of the next layer.
const model = tf.sequential();
Add a fully connected hidden layer. It is mandatory to give the input shape for the first layer. Our inputs will be two numbers between 0 and 1. (Fully connected: each input unit is connected to each output unit.) I'll probably write about neural networks, activations, weights, and biases soon (or not); till then, follow the references.
model.add(
  tf.layers.dense({
    inputShape: [2],
    units: 4,
    activation: "sigmoid",
    useBias: true
  })
);
The activation function sigmoid was giving much better results than relu :D
Add a fully connected output layer.
model.add(
  tf.layers.dense({
    units: 1,
    activation: "sigmoid"
  })
);
Our model is ready! Time to compile.
The learning rate should be small; you can experiment with different values. 0.1 worked for me.
const LEARNING_RATE = 0.1;
const optimizer = tf.train.adam(LEARNING_RATE);
model.compile({
  optimizer: optimizer,
  loss: tf.losses.meanSquaredError,
  metrics: ["accuracy"]
});
I was trying to make things work with the sgd (stochastic gradient descent) optimizer, but adam worked so much better for XOR. A lot of it is trial and error.
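If you want to compare for yourself, swapping the optimizer is a one-line change (tf.train.sgd takes the same learning-rate argument as tf.train.adam):

// try stochastic gradient descent instead of adam and compare the loss curves
const optimizer = tf.train.sgd(LEARNING_RATE);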
To fit the training data, we train the model on our input multiple times. Ask the fit function to shuffle the input (a sketch of that loop follows below). You'll notice the loss decreasing and the accuracy increasing after each round. That's it! You're done!
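This is roughly what the loop can look like; the 20 rounds match the 20 epochs mentioned further down, but the count is something to play with:

async function train() {
  for (let epoch = 0; epoch < 20; epoch++) {
    // one epoch per call so we can log loss and accuracy after each round
    const result = await model.fit(train_xs, train_ys, {
      epochs: 1,
      shuffle: true
    });
    // tfjs reports the accuracy metric under the "acc" key in history
    console.log(`epoch ${epoch}: loss=${result.history.loss[0]} acc=${result.history.acc[0]}`);
  }
}
train();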
Make predictions
Input can be as simple as
var input = tf.tensor2d([[0, 1]]);
Use the model's predict function to make predictions.
var output = model.predict(input);
console.log(output.dataSync());
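To sanity-check all four combinations at once, you can reuse the train_xs tensor from above (a quick sketch):

// predict every XOR input in one call and print the raw outputs;
// they should be close to 0 for [0,0] and [1,1], and close to 1 for [0,1] and [1,0]
model.predict(train_xs).print();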
I am using a canvas to depict my predictions for different inputs.
(I should write a blog on that alone because man, is that difficult!)
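Until that post exists, here is a rough sketch of the idea, assuming the <canvas id="canvas"> element from the page sketch above and a model that has already been trained; the grid resolution and grayscale colouring are just one way to do it:

// colour each cell of a grid by the model's prediction for the point (x, y)
const canvas = document.getElementById("canvas");
const ctx = canvas.getContext("2d");
const cols = 20, rows = 20;
const cellW = canvas.width / cols, cellH = canvas.height / rows;

function drawPredictions() {
  // build one big batch of grid points so predict() is called only once
  const points = [];
  for (let i = 0; i < cols; i++) {
    for (let j = 0; j < rows; j++) {
      points.push([i / cols, j / rows]);
    }
  }
  const preds = model.predict(tf.tensor2d(points)).dataSync();
  for (let k = 0; k < points.length; k++) {
    const i = Math.floor(k / rows), j = k % rows;
    const shade = Math.floor(preds[k] * 255);   // 0 -> black, 1 -> white
    ctx.fillStyle = `rgb(${shade}, ${shade}, ${shade})`;
    ctx.fillRect(i * cellW, j * cellH, cellW, cellH);
  }
}

// call drawPredictions() once training has finished, e.g. at the end of train()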
Find the full code on github
My canvas looked like this after 20 epochs
Inspiration
I was inspired to build a model for XOR after watching a coding challenge on 'The Coding Train' by Daniel Shiffman (https://twitter.com/shiffman).
While you're on twitter, here's my handle @iShivangiDas
Did not understand everything?
If you want to learn about TensorFlow APIs go to https://js.tensorflow.org/api/0.11.6/
Neural Networks can be learned from Coursera - https://www.coursera.org/learn/neural-networks-deep-learning (I loved these courses and would highly recommend them to everyone)