Neural networks and optimizers implemented from scratch in NumPy, featuring newer optimizers such as DemonAdam and QHAdam.
Topics: numpy, adadelta, neural-networks, mnist-dataset, backpropagation, rmsprop, nesterov-accelerated-sgd, neural-networks-from-scratch, optimizers, qhadam, demon-adam
Updated Feb 10, 2021 - Jupyter Notebook
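As a flavor of what a from-scratch NumPy optimizer looks like, here is a minimal sketch of a single QHAdam update step (following the published update rule of Ma & Yarats, 2019). This is an illustrative standalone function, not code taken from this repository; the function name, hyperparameter defaults, and the quadratic smoke test below are assumptions for demonstration.

```python
import numpy as np

def qhadam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                nu1=0.7, nu2=1.0, eps=1e-8):
    """One QHAdam step: a quasi-hyperbolic blend of the raw gradient
    and Adam's bias-corrected moment estimates.
    (Hypothetical helper for illustration, not the repo's API.)"""
    m = beta1 * m + (1 - beta1) * grad        # EMA of gradients (first moment)
    v = beta2 * v + (1 - beta2) * grad ** 2   # EMA of squared gradients (second moment)
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    numer = (1 - nu1) * grad + nu1 * m_hat    # blend immediate gradient with momentum
    denom = np.sqrt((1 - nu2) * grad ** 2 + nu2 * v_hat) + eps
    return theta - lr * numer / denom, m, v

# Smoke test: minimize f(x) = x^2 starting from x = 5.
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta                          # gradient of x^2
    theta, m, v = qhadam_step(theta, grad, m, v, t, lr=0.05)
print(theta)                                  # converges toward 0
```

Note that with nu1 = nu2 = 1 the update reduces to plain Adam, which is what makes QHAdam a strict generalization of it.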