This is an implementation of deep neural networks using nothing but Python and NumPy. I took up this project to complement the Deep Learning Specialization, taught by Andrew Ng and offered on Coursera.
Currently, the following components are supported (a short NumPy sketch of a few of them follows the list):
- Layers:
Dense
Conv2D
DepthwiseConv2D
SeparableConv2D
Conv2DTranspose
MaxPooling2D
AveragePooling2D
BatchNorm
Dropout
Flatten
Add
Concatenate
- Activations:
Linear
Sigmoid
Tanh
ReLU
LeakyReLU
ELU
Softmax
- Losses:
BinaryCrossEntropy
CategoricalCrossEntropy
MeanSquaredError
- Optimizers:
Vanilla SGD
SGD with momentum
RMSProp
Vanilla Adam
Adam with AMSGrad
- Learning Rate Decay
TimeDecay
ExponentialDecay
CosineDecay
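
To make the math behind a few of these pieces concrete, here is a minimal, self-contained NumPy sketch: a single Dense layer trained with softmax and categorical cross-entropy, updated by SGD with momentum under an exponential learning-rate decay. It illustrates the underlying equations only; none of the names below come from this library's API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8 samples, 4 features, 3 classes.
X = rng.normal(size=(8, 4))
y = rng.integers(0, 3, size=8)

# Dense layer parameters (4 -> 3).
W = rng.normal(scale=0.1, size=(4, 3))
b = np.zeros(3)

# SGD-with-momentum state and hyperparameters.
vW, vb = np.zeros_like(W), np.zeros_like(b)
momentum, lr0, decay_rate = 0.9, 0.1, 0.01

for step in range(100):
    # Exponential learning-rate decay: lr = lr0 * exp(-decay_rate * step).
    lr = lr0 * np.exp(-decay_rate * step)

    # Forward pass: Dense -> softmax.
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # Categorical cross-entropy on integer labels.
    loss = -np.log(probs[np.arange(len(y)), y]).mean()

    # Backward pass: the gradient of softmax + cross-entropy w.r.t. the
    # logits is (probs - one_hot(y)) / N, derived by hand (no autodiff).
    dlogits = probs.copy()
    dlogits[np.arange(len(y)), y] -= 1.0
    dlogits /= len(y)
    dW = X.T @ dlogits
    db = dlogits.sum(axis=0)

    # SGD with momentum: accumulate velocity, then step.
    vW = momentum * vW - lr * dW
    vb = momentum * vb - lr * db
    W += vW
    b += vb

print(f"final loss: {loss:.4f}")
```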
It is also easy to add custom layers, activations, losses, optimizers, and decay schedules.
Note: there is no automatic differentiation, so when extending the library you must define the derivatives needed for backpropagation yourself, as in the sketch below.
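
As a purely illustrative example, here is what a hand-written activation could look like. The class shape and the `forward`/`backward` method names are assumptions made for this sketch, not this library's actual interface; the point is that both the function and its derivative are written by hand.

```python
import numpy as np

class Swish:
    """Swish/SiLU activation: f(x) = x * sigmoid(x).

    Hypothetical extension sketch -- the method names are illustrative,
    not the library's real API. Since there is no autodiff, backward()
    implements the analytic derivative explicitly.
    """

    def forward(self, x):
        self.x = x                          # cache input for the backward pass
        self.s = 1.0 / (1.0 + np.exp(-x))   # sigmoid(x)
        return x * self.s

    def backward(self, grad_out):
        # f'(x) = sigmoid(x) + x * sigmoid(x) * (1 - sigmoid(x)),
        # applied to the upstream gradient via the chain rule.
        local = self.s + self.x * self.s * (1.0 - self.s)
        return grad_out * local
```

Caching the forward-pass input (and here, the sigmoid value) is the usual design choice when there is no autodiff: the backward pass reuses it instead of recomputing.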
Hope you like it! Happy learning!