(Because it's from scratch)
This project, inspired by Andrej Karpathy's micrograd and YouTube series, features an autograd implementation in plain TypeScript with no dependencies.
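The core idea can be sketched as a scalar `Value` node that records its inputs and a local backward rule, then runs reverse-mode backpropagation in topological order. This is a minimal illustration in the spirit of micrograd, not this project's actual API; all names here are assumptions.

```typescript
// Minimal reverse-mode autograd sketch (hypothetical names, not this repo's API).
class Value {
  data: number;
  grad = 0;
  private backwardFn: () => void = () => {};
  private prev: Value[];

  constructor(data: number, prev: Value[] = []) {
    this.data = data;
    this.prev = prev;
  }

  add(other: Value): Value {
    const out = new Value(this.data + other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += out.grad;  // d(a+b)/da = 1
      other.grad += out.grad; // d(a+b)/db = 1
    };
    return out;
  }

  mul(other: Value): Value {
    const out = new Value(this.data * other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += other.data * out.grad; // d(a*b)/da = b
      other.grad += this.data * out.grad; // d(a*b)/db = a
    };
    return out;
  }

  backward(): void {
    // Topological sort so each node's grad is complete before it propagates.
    const topo: Value[] = [];
    const visited = new Set<Value>();
    const build = (v: Value) => {
      if (visited.has(v)) return;
      visited.add(v);
      for (const p of v.prev) build(p);
      topo.push(v);
    };
    build(this);
    this.grad = 1;
    for (const v of topo.reverse()) v.backwardFn();
  }
}
```

For example, for `c = a*b + a`, calling `c.backward()` accumulates `a.grad = b + 1` and `b.grad = a`.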
- [x] Refactor the model API towards pytorch's (explicitly passing in layers, optimizer object...)
- [x] Export autograd + MLP wrapper as a package
- [ ] Fix stacking more than one layer: currently multiple layers result in exploding gradients
- [x] Implement Batch Normalization
- [ ] Verify gradient health (magnitudes and distributions across layers)
- [ ] Add Embedding layer
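As a point of reference for the batch-normalization item above, the training-mode forward pass normalizes each feature's activations across the batch to zero mean and unit variance, then applies a learnable scale and shift. This is a hedged standalone sketch with assumed names (`gamma`, `beta`, `eps`), not the implementation in this repo.

```typescript
// Hypothetical 1D batch-norm forward pass (training mode).
// gamma, beta, and eps are illustrative names, not this project's API.
function batchNorm1d(
  batch: number[], // one feature's activations across the batch
  gamma = 1,       // learnable scale
  beta = 0,        // learnable shift
  eps = 1e-5       // small constant for numerical stability
): number[] {
  const n = batch.length;
  const mean = batch.reduce((s, x) => s + x, 0) / n;
  const variance = batch.reduce((s, x) => s + (x - mean) ** 2, 0) / n;
  const std = Math.sqrt(variance + eps);
  // Normalize to zero mean / unit variance, then scale and shift.
  return batch.map(x => gamma * ((x - mean) / std) + beta);
}
```

Normalizing activations this way is one standard mitigation for the exploding-gradient behavior noted in the checklist, since it keeps each layer's inputs in a consistent range.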