A simple library for feed-forward neural networks, written as an exercise for the Coursera Deep Learning course.
Usage examples can be found in example1.py and example2.py.
Currently supports:

Layers
- Fully Connected
- Dropout
- Softmax
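As an illustration of what the Dropout layer computes, here is a minimal sketch of an inverted-dropout forward pass in NumPy. This is a generic reference implementation, not this library's actual API; the function name and `keep_prob` parameter are assumptions.

```python
import numpy as np

def dropout_forward(x, keep_prob=0.8, rng=None):
    # Inverted dropout (illustrative sketch, not this library's API):
    # zero out units at random and rescale the survivors by 1/keep_prob,
    # so the expected activation is unchanged at training time.
    rng = rng or np.random.default_rng(0)
    mask = (rng.random(x.shape) < keep_prob).astype(x.dtype)
    return x * mask / keep_prob, mask
```

At test time the layer is a no-op: the rescaling during training removes the need to scale activations at inference.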

Activations
- Sigmoid
- Tanh
- ReLU
- Leaky ReLU
- Linear
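The activations above are all standard elementwise functions; a NumPy sketch of their usual definitions (generic reference code, not this library's API):

```python
import numpy as np

def sigmoid(z):
    # squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # max(0, z): passes positives through, zeroes negatives
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # like ReLU, but lets a small gradient (alpha) through for z < 0
    return np.where(z > 0, z, alpha * z)

# Tanh and Linear come for free: np.tanh and the identity function.
```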

Loss Functions
- Cross Entropy
- Binary Cross Entropy
- MSE
- Softmax + Cross Entropy
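The fused Softmax + Cross Entropy variant exists for numerical stability: computing log-softmax from shifted logits avoids overflow in `exp()`. A NumPy sketch of these losses (standard textbook definitions, not necessarily this library's exact signatures):

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-12):
    # mean negative log-likelihood of the true class;
    # probs: (n, k) predicted probabilities, labels: (n, k) one-hot
    return -np.mean(np.sum(labels * np.log(probs + eps), axis=1))

def mse(pred, target):
    # mean squared error for regression targets
    return np.mean((pred - target) ** 2)

def softmax_cross_entropy(logits, labels):
    # fused form: subtracting the row max before exponentiating
    # keeps exp() in a safe range without changing the result
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -np.mean(np.sum(labels * log_probs, axis=1))
```

For uniform logits over k classes, the fused loss evaluates to log(k), which is a quick sanity check.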

Optimizers
- Gradient Descent
- Adam
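Where plain gradient descent applies `w -= lr * grad`, Adam (Kingma & Ba, 2015) keeps running moment estimates of the gradient and rescales each step. A sketch of a single Adam update in NumPy; the function name and calling convention are illustrative, not this library's API:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # m, v: exponential moving averages of the gradient and its square;
    # t: 1-based step count, used for bias correction
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)          # correct the zero-initialization bias
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```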

Other
- L2 Regularization
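L2 regularization adds a weight-decay term to the loss; its gradient contribution is simply `lam * W` per weight matrix. A minimal sketch (generic definition, not this library's API):

```python
import numpy as np

def l2_penalty(weights, lam):
    # adds (lam / 2) * sum of squared weights to the loss;
    # the 1/2 makes the gradient contribution exactly lam * W
    return 0.5 * lam * sum(np.sum(W ** 2) for W in weights)
```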
Future Work:
- Add Convolutional Layer