micrograd

What is micrograd?

micrograd is a simple implementation of a PyTorch-like autograd engine, built by Andrej Karpathy.

Test code


from micrograd.engine import Value

a = Value(4.0)
b = Value(3.0)
c = a * b
print(f'{c.data:.4f}')

c.backward()
print(f'{a.grad:.4f}')
print(f'{b.grad:.4f}')
print(f'{c.grad:.4f}')

The output is quite straightforward: since c = a * b, backpropagation gives a.grad = dc/da = b = 3, b.grad = dc/db = a = 4, and c.grad = dc/dc = 1.

12.0000
3.0000
4.0000
1.0000

However, gradients accumulate in a Value if they are not reset to zero. In the example below, a and b are reused after the first backward() call, so the second call adds to their existing gradients.

from micrograd.engine import Value

a = Value(4.0)
b = Value(3.0)
c = a * b
print(f'{c.data:.4f}')

c.backward()
print(f'{a.grad:.4f}')
print(f'{b.grad:.4f}')
print(f'{c.grad:.4f}')


# a and b are reused from above (not re-created),
# so their grads still hold 3.0 and 4.0
d = a + b
d.backward()
print(f'{a.grad:.4f}')
print(f'{b.grad:.4f}')


Output:

12.0000
3.0000
4.0000
1.0000
4.0000
5.0000
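
Because the gradients were not reset between the two backward() calls, a.grad ends up as 3 + 1 = 4 and b.grad as 4 + 1 = 5. Since grad is just a Python attribute on each Value, one way to avoid the accumulation is to reset it manually before the next backward pass; a minimal sketch:

from micrograd.engine import Value

a = Value(4.0)
b = Value(3.0)
c = a * b
c.backward()            # a.grad = 3.0, b.grad = 4.0

# reset the accumulated gradients before the next backward pass
a.grad = 0.0
b.grad = 0.0

d = a + b
d.backward()
print(f'{a.grad:.4f}')  # 1.0000 instead of 4.0000
print(f'{b.grad:.4f}')  # 1.0000 instead of 5.0000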

Internal implementation

Please check the code in this notebook: micrograd notebook
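
For a rough idea of what the engine does, here is a simplified sketch (not the actual micrograd source; only add, mul, and backward are shown). Each Value remembers its data, its grad, the child values that produced it, and a small closure that pushes the output gradient back to those children:

class Value:
    # minimal scalar value that records how it was produced
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # pushes out.grad to the children
        self._prev = set(_children)

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

The += inside each _backward closure is exactly why gradients accumulate across repeated backward() calls, as seen in the example above.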

Some vector-based tensor implementations built on the ideas of micrograd: deeplex, ugrad



