Traffic signs classification with retrained MobileNet model

TensorFlow Lite classification model for GTSRB dataset

This post is part of a series about building Machine Learning solutions in mobile apps. In the previous article, we started by building a simple MNIST classification model on top of TensorFlow Lite. That post is also a good place to start if you are looking for hints on how to set up your very first environment (local with Docker or remote with Colaboratory).

Let’s continue with the basics. If you have spent some time exploring the Internet for Machine Learning <-> mobile solutions, you have surely found the “TensorFlow for Poets” code labs. If not, they are the place to start your journey toward building a more complex vision-intelligence solution for apps.

Those code labs are focused on building a first working solution that can be launched directly on your mobile device. Here, we’ll build something very similar, with some additional explanation that can help you understand TensorFlow Lite a little bit better.

MobileNet

So what are the code labs and this article about? They all show how to build a convolutional neural network that is optimized for mobile devices, with little effort required to define the structure of the Machine Learning model. Instead of building it from scratch, we’ll use a technique called Transfer Learning and retrain MobileNet for our needs.

MobileNet itself is a lightweight neural network used for vision applications on mobile devices. For more technical details and a great visual explanation, take a look at Matthijs Hollemans’s blog post: Google’s MobileNets on the iPhone (it says “iPhone” 😱, but the first part of the post is fully dedicated to the MobileNet architecture). And if you want even more technical details, the paper titled MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications will be your friend.
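To make the transfer-learning idea above more concrete, here is a minimal Keras sketch (this is an illustration, not the exact code from the code labs): MobileNet is loaded without its original classification head and frozen as a feature extractor, and a new head is attached for the 43 GTSRB traffic-sign classes. Note that `weights=None` is used here only to keep the sketch self-contained; in practice you would pass `weights="imagenet"` to start from the pre-trained weights.

```python
import tensorflow as tf

# Load MobileNet without its ImageNet classification head and freeze it,
# so only the new head below is trained (transfer learning).
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3),
    include_top=False,
    weights=None,  # use weights="imagenet" in practice
)
base.trainable = False

# Attach a new classification head for the 43 GTSRB classes.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(43, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```

With the base frozen, training (`model.fit(...)`) updates only the small new head, which is why retraining takes minutes instead of the days needed to train the full network from scratch.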

Continue reading

TensorFlow Lite classification on Android (with support for TF2.0)

Adding the first Machine Learning model into your mobile app

*** Edit, 23.04.2019 ***

TensorFlow 2.0 experimental support
In the repository, you can find a Jupyter Notebook with the code running on TensorFlow 2.0 alpha, with support for a GPU environment (up to 3 times faster training). As this is not yet a stable version, the code may break at any moment. The notebook was created for the Colaboratory environment only; it requires some changes to work in the Docker environment described in the blog post.
The notebook is available here.


— — —

This is my new series about using Machine Learning solutions in mobile apps. Unlike the majority of articles, it won’t say much about building layers, training processes, fine-tuning, playing with Google TPUs, or data science in general.
Instead, we’ll focus on understanding how to plug models into apps; how to use, debug, and optimize them; and how to cooperate effectively with data scientists and AI engineers.

MNIST

You have surely seen countless examples of how to implement an MNIST classifier. Still, for the sake of the series’ completeness, I decided to implement it one more time. It may not be very challenging from an ML perspective, but it’s still a good example of how to work with TensorFlow Lite models in a mobile app.

In this blog post, we’ll create a simple Machine Learning model that recognizes a handwritten digit in an image. The model will be converted to TensorFlow Lite and plugged into an Android application, step by step.
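The conversion step can be sketched as follows (a minimal illustration using the TensorFlow 2.x `TFLiteConverter` API; the tiny model architecture and the `mnist.tflite` file name are assumptions for this sketch, not necessarily what the post uses):

```python
import tensorflow as tf

# A minimal MNIST-style classifier: 28x28 grayscale images, 10 digit classes.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(x_train, y_train, epochs=5)  # training omitted in this sketch

# Convert the Keras model to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Save the .tflite file, which is then bundled as an asset in the Android app.
with open("mnist.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` file is what the Android app loads at runtime through the TensorFlow Lite interpreter.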

Continue reading