It is no secret: TensorFlow Lite is a deep learning software stack for running inference on edge devices.
What does it do?
- First, choose a TensorFlow model. According to TensorFlow for mobile and IoT, "a TensorFlow model is a data structure that contains the logic and knowledge of a machine learning network trained to solve a particular problem." You can train your own or use a pre-trained one. Some pre-trained options are image classification, object detection, smart reply, pose estimation, and segmentation. You also have the option to retrain a model.
- Edge devices are constrained in memory, processing power, energy consumption, and storage, but by converting your model to TensorFlow Lite you can run it efficiently on an embedded device. In other words, every model must be converted before it goes to the edge. The converter works with Keras models and can be driven from the command line or from Python; in fact, the Python API is recommended. Read more on this here. If you want to optimize the model further, don't forget the Model Optimization Toolkit. Before reaching for those tools, check whether your model is among the pre-trained options listed above; otherwise, start with the post-training quantization tool.
- Next, run inference on the device. The good news: if your device has a GPU, things get even better; rainbows and sunshine.
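The three steps above can be sketched end to end in a few lines of Python. This is a minimal sketch, not a production recipe: the tiny Keras classifier below is a hypothetical stand-in for whatever model you choose or pre-train, and the random input stands in for real data.

```python
import numpy as np
import tensorflow as tf

# Step 1: choose a model. Here, a hypothetical tiny Keras classifier
# stands in for a real trained or pre-trained TensorFlow model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# Step 2: convert the model to the TensorFlow Lite flat-buffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # bytes, ready to ship to the device

# Step 3: run inference with the TFLite interpreter (the same API the
# on-device runtime exposes).
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

sample = np.random.rand(1, 4).astype(np.float32)  # stand-in input
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
probs = interpreter.get_tensor(output_details[0]["index"])
print(probs.shape)  # one row of class probabilities
```

On a real device you would load the saved `.tflite` file (via `model_path=` instead of `model_content=`) rather than converting on the spot.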
What happens inside TensorFlow Lite? It has two parts: a converter and an interpreter. The conversion happens on a desktop using TOCO (the TensorFlow Lite Optimizing Converter). The interpreter is designed and optimized for mobile and edge devices.
If you are wondering whether TensorFlow Lite can be built for a custom CPU (for example, compiling TensorFlow Lite with optimizations for an Intel Atom CPU), I found this discussion on Stack Overflow.
Also, there is a good, short discussion here on whether TensorFlow Lite is the best way to run machine learning models on Android. But what was really eye-opening is this video.
Leave a comment if you see any competitions, such as those on Kaggle, involving TensorFlow Lite applications. I found this short discussion there. The Dev Summit 2019 link is below:
Farzan Jafeh, PhD – Aug 2019