What's next?
We completed this section by converting and optimizing a TensorFlow neural network for mobile devices using the TensorFlow Lite converter.
Neural networks can be enormous and computationally expensive, so most architectures cannot be executed on a mobile device as-is. To overcome this problem, we have TensorFlow Lite, a library that optimizes models in multiple ways so we can execute them on a mobile device or any other embedded device.
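As a quick recap of the conversion step described above, the sketch below converts a Keras model to the TensorFlow Lite format with the default optimizations enabled (which applies dynamic-range quantization). The small two-layer model here is a hypothetical stand-in for whatever network you trained in this section.

```python
import tensorflow as tf

# A tiny stand-in for the trained model from this section (hypothetical).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(2),
])

# Convert the model to TensorFlow Lite, letting the converter apply
# its default optimizations (dynamic-range quantization of weights).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The result is a byte buffer ready to be bundled with a mobile app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `model.tflite` file is what gets shipped inside the Android or iOS app and loaded by the TensorFlow Lite interpreter on the device.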
In this section, we have completed the part that any deep learning engineer would do: write a model, train it, and optimize it for a mobile device. What remains is building the mobile app and creating the serving infrastructure on the device side.
To give you the full perspective, I am providing a few additional links that cover the mobile side as well. After going through this section and these links, you will have a full picture of how to put a neural network on a mobile device, either Android or iOS.
- Putting a trained model on a mobile device: https://medium.com/@rdeep/tensorflow-lite-tutorial-easy-implementation-in-android-145443ec3775
- Putting a trained model on a mobile device, advanced tutorial: https://heartbeat.fritz.ai/neural-networks-on-mobile-devices-with-tensorflow-lite-a-tutorial-85b41f53230c
- TensorFlow Lite pre-trained models with applications: https://www.tensorflow.org/lite/examples/
- TensorFlow Lite tutorial and guide: https://www.tensorflow.org/lite/guide