```python
from nanograd.nn import MLP, plot_preds
import numpy as np
import math
```
# nanograd

This is a mock library implementing [micrograd](https://github.com/karpathy/micrograd) by Andrej Karpathy.
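For context, micrograd (and hence nanograd) is built around a scalar `Value` that records the operations applied to it and backpropagates gradients with the chain rule. The following is a minimal, self-contained sketch of that idea — an illustration, not nanograd's actual implementation:

```python
# Minimal scalar autograd in the spirit of micrograd: each Value remembers
# its inputs and a local backward rule; backward() applies the chain rule.
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(-3.0)
c = a * b + a   # c = a*b + a, so dc/da = b + 1 = -2 and dc/db = a = 2
c.backward()
```

Calling `c.backward()` seeds `c.grad = 1` and propagates gradients back to `a` and `b`, which is exactly what `loss.backward()` does in the training loop further down.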
## Install

```shell
git clone https://github.com/vempaliakhil96/nanograd.git && cd nanograd
pip install .
```
## How to use

Let's generate a toy dataset:

```python
# generate a few samples of a sine wave, labelled 1 where sin(x) > 0 and -1 otherwise
samples = [(i, 1 if math.sin(i) > 0 else -1) for i in np.arange(-5, 5, 0.1)]
xs, ys = zip(*samples)
xs, ys = list(xs), list(ys)
```
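The `zip(*samples)` line transposes the list of `(input, label)` pairs into separate `xs` and `ys` lists. A tiny standalone illustration of the idiom (with hand-picked sample points):

```python
# zip(*pairs) "unzips" a list of 2-tuples into two parallel sequences.
samples = [(0.0, -1), (1.6, 1), (3.2, -1)]
xs, ys = map(list, zip(*samples))
```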
### Initialise MLP

```python
model = MLP(1, [4, 4, 1])
model.draw_nn()
```
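Assuming `MLP(1, [4, 4, 1])` follows micrograd's signature — one input feature feeding layers of 4, 4, and 1 neurons — the parameter count can be tallied by hand, since each neuron holds one weight per input plus a bias:

```python
# Parameter count for a fully connected 1 -> 4 -> 4 -> 1 network:
# each layer contributes fan_in * fan_out weights plus fan_out biases.
sizes = [1, 4, 4, 1]
n_params = sum(i * o + o for i, o in zip(sizes, sizes[1:]))
# (1*4 + 4) + (4*4 + 4) + (4*1 + 1) = 8 + 20 + 5 = 33
```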
### Predictions before training

```python
y_preds = [model([x])[0] for x in xs]
plot_preds(xs, ys, [i.data for i in y_preds])
```
### Training loop

```python
def calc_loss_and_accuracy(model, X, Y):
    y_preds = [model([x])[0] for x in X]
    loss = sum([(y_preds[i] - Y[i]) ** 2 for i in range(len(Y))]) / len(Y)  # MSE
    y_preds = [1 if y_preds[i].data > 0.5 else 0 for i in range(len(Y))]
    accuracy = sum([1 if y_preds[i] == Y[i] else 0 for i in range(len(Y))]) / len(Y)
    return loss, accuracy

calc_loss_and_accuracy(model, xs, ys)
```

```
(Value(data=1.23e+00, grad=0.00e+00, label=), 0.0)
```

(Note: the accuracy calculation thresholds predictions into {0, 1} while the labels are {-1, 1}, so samples labelled -1 are never counted as correct and accuracy saturates near 0.5, as the training log below shows.)
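The MSE returned above can be sanity-checked with plain floats (a standalone sketch using made-up numbers, no nanograd `Value` objects):

```python
# Mean squared error over three (prediction, target) pairs.
preds = [0.2, -0.5, 0.9]
targets = [1, -1, 1]
mse = sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(targets)
# ((-0.8)**2 + (0.5)**2 + (-0.1)**2) / 3 = (0.64 + 0.25 + 0.01) / 3 = 0.30
```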
```python
for i in range(1000):
    # forward pass
    loss, accuracy = calc_loss_and_accuracy(model, xs, ys)
    # backward pass
    model.zero_grad()
    loss.backward()
    # update weights (gradient descent with learning rate 0.1)
    for p in model.parameters():
        p.data += -0.1 * p.grad
    if i % 100 == 0:
        print(f"Loss at epoch {i}: {loss.data:.3f} | Accuracy: {accuracy:.3f}")
```

```
Loss at epoch 0: 1.231 | Accuracy: 0.000
Loss at epoch 100: 0.554 | Accuracy: 0.220
Loss at epoch 200: 0.260 | Accuracy: 0.370
Loss at epoch 300: 0.147 | Accuracy: 0.420
Loss at epoch 400: 0.117 | Accuracy: 0.440
Loss at epoch 500: 0.099 | Accuracy: 0.450
Loss at epoch 600: 0.087 | Accuracy: 0.450
Loss at epoch 700: 0.079 | Accuracy: 0.470
Loss at epoch 800: 0.072 | Accuracy: 0.470
Loss at epoch 900: 0.067 | Accuracy: 0.470
```
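The update `p.data += -0.1 * p.grad` is plain gradient descent. As a standalone illustration (nothing nanograd-specific), the same rule drives a single parameter to the minimum of a simple quadratic:

```python
# Minimise L(p) = (p - 3)^2 with the loop's update rule p += -lr * grad.
p, lr = 0.0, 0.1
for _ in range(100):
    grad = 2 * (p - 3)   # dL/dp, computed analytically here
    p += -lr * grad
# each step multiplies the error (p - 3) by 0.8, so p converges towards 3
```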
### Predictions after training

```python
y_preds = [model([x])[0] for x in xs]
plot_preds(xs, ys, [i.data for i in y_preds])
```
Notice that the strengths of the connections have changed:

```python
model.draw_nn()
```