This repository contains a collection of digital hardware IP cores that perform neural computations. The IP cores have been generated with HLS techniques and are based on Keras-generated neural networks.
For every model there is an include/ folder containing header files: one header for each of the layers, weights, and biases of the model, as well as a sample input (or inputs). These files have been used to verify the HW implementations.
So far, the uploaded models have one or two inputs of 32x32 floats and 100 outputs. Operations are performed with 32-bit floats in the digital HW. The model layers are of the types "GaussianNoise", "Dropout", "Activation", "Dense", "Flatten", "Concatenate", "MaxPooling2D", "Conv2D" and "InputLayer". Since the "GaussianNoise" and "Dropout" layers are only used during training, they are not included in the HW.
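For reference, a model along these lines could be written in Keras as sketched below; the exact layer sizes and topology are illustrative only, not a model shipped with this repository.

    from keras.layers import (Input, Conv2D, MaxPooling2D, Flatten,
                              Concatenate, Dense, Dropout, Activation)
    from keras.models import Model

    # Two 32x32 inputs, 100 outputs; Dropout/GaussianNoise only matter for
    # training and are left out of the HW implementation.
    in_a = Input(shape=(32, 32, 1))
    in_b = Input(shape=(32, 32, 1))

    def branch(x):
        x = Conv2D(8, (3, 3), padding="same")(x)
        x = Activation("relu")(x)
        x = MaxPooling2D((2, 2))(x)
        return Flatten()(x)

    x = Concatenate()([branch(in_a), branch(in_b)])
    x = Dropout(0.5)(x)
    x = Dense(100)(x)
    outputs = Activation("softmax")(x)

    model = Model(inputs=[in_a, in_b], outputs=outputs)
    model.compile(optimizer="adam", loss="categorical_crossentropy")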
The IPs expose a few digital ports, the most important being the control port (AXI4-Lite) and the data port maxi_commns (AXI4 master).
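As a rough illustration of how these ports can be driven from software, the sketch below uses the PYNQ framework (one of the examples listed further down): the AXI4-Lite control port is written through the IP's register map, and the AXI4 master port fetches its data from physically contiguous buffers. The bitstream name, IP instance name, register offsets and file names are assumptions for illustration, not values defined by this repository.

    import numpy as np
    from pynq import Overlay, allocate

    overlay = Overlay("nn_design.bit")            # hypothetical bitstream name
    nn_ip = overlay.nn_accel_0                    # hypothetical IP instance name

    # Buffers reachable by the AXI4 master data port (maxi_commns)
    in_buf = allocate(shape=(32, 32), dtype=np.float32)
    out_buf = allocate(shape=(100,), dtype=np.float32)
    in_buf[:] = np.loadtxt("input_sample.txt").reshape(32, 32)   # hypothetical file
    in_buf.flush()

    # AXI4-Lite control registers; the offsets follow the usual HLS convention,
    # check the address map generated for your IP.
    nn_ip.write(0x10, in_buf.physical_address)    # input base address
    nn_ip.write(0x18, out_buf.physical_address)   # output base address
    nn_ip.write(0x00, 1)                          # ap_start
    while (nn_ip.read(0x00) & 0x2) == 0:          # wait for ap_done
        pass

    out_buf.invalidate()
    print(out_buf)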
- With Python, load and train the Keras model on your dataset:
    from keras.models import load_model

    model = load_model("myModel.h5")
    history = model.fit(x_train, y_train, batch_size=batch_size,
                        epochs=epochs, verbose=1,
                        validation_data=(x_test, y_test))
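If you want to cross-check the generated headers in include/ against the trained model, one option is to dump the weights and a reference prediction from Python, continuing from the snippet above; the output file names here are hypothetical, not files used by this repository.

    import numpy as np

    # Dump each layer's weights/biases for comparison with the headers in include/
    for layer in model.layers:
        for i, w in enumerate(layer.get_weights()):
            np.savetxt(f"{layer.name}_param{i}.txt", w.reshape(-1))

    # Reference output for one sample (for multi-input models pass one slice per input)
    y_ref = model.predict(x_test[:1])
    np.savetxt("y_ref.txt", y_ref.reshape(-1))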
- With Vivado:
- Create the project
- Import the Neural Network IP
- Connect buses
  - Configure addresses
- Synthesize
Examples:
- Keras model implementation and training
- Vivado project to include a Neural Network
- User Space App to configure the NN
- Neural Network access through the PYNQ framework (no driver, no DMA)
Compare the output of the digital circuit against the values calculated in Python (given in the headers in include/). If the last layer has a softmax activation function, the raw output of the HW will not match the value in the header; it has to be normalized first.
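A minimal sketch of that comparison, assuming the HW output only needs to be rescaled so that it sums to 1 and that both the HW output and the header reference values have been copied into plain text files (the file names are hypothetical):

    import numpy as np

    hw_out = np.loadtxt("hw_output.txt", dtype=np.float32)          # read back from the HW
    ref_out = np.loadtxt("reference_output.txt", dtype=np.float32)  # values from the header

    # Normalize the HW output so it sums to 1, like the softmax reference
    hw_norm = hw_out / hw_out.sum()

    print("max abs error:", np.abs(hw_norm - ref_out).max())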
- Sergio Rivera - Initial work - srivera1
This project is licensed under the GNU General Public License v3.0 - see the LICENSE file for details
- The template for this Readme comes from PurpleBooth