Build TensorFlow Lite model with Firebase AutoML Vision Edge

Train your first image classification model with Firebase ML Kit

For more than a year now, Firebase – the backend platform for mobile and web development – has had the ML Kit SDK in its portfolio. Thanks to this feature, it is much easier to implement machine learning solutions in mobile apps, regardless of the ML skills we have. With APIs like Text Recognition or Image Labeling, we can add those functionalities to our app with a couple of lines of code.
ML Kit also provides a simple way of plugging in custom machine learning solutions – we provide a TensorFlow Lite model, and Firebase is responsible for deploying it into our app – multiplatform (Android and iOS), offline or online (the model can be bundled with the app or downloaded on demand at runtime), with simplified code for implementing an interpreter.

Continue reading

Automate testing of TensorFlow Lite model implementation

Testing TensorFlow Lite model with Espresso and instrumentation tests on Android

Making sure that your ML model works correctly in a mobile app (part 2)

This is the second article about testing machine learning models created for mobile. In the previous post – Testing TensorFlow Lite image classification model – we built a notebook that exports a TensorFlow model to TensorFlow Lite and compares the two side by side. But because the conversion process is mostly automatic, there are not many places to break something. We can find differences between quantized and non-quantized models or ensure that TensorFlow Lite works similarly to TensorFlow, but the real issues can come up somewhere else – in the client-side implementation.
In this article, I will suggest some solutions for testing a TensorFlow Lite model with Android instrumentation tests.

Continue reading

TensorFlow to CoreML conversion and model inspection

Convert TensorFlow model to CoreML

Converting TF models to CoreML, an iOS-friendly format

Core ML

While TensorFlow Lite seems to be a natural choice for Android software engineers, on iOS, it doesn’t necessarily have to be the same. In 2017, when iOS 11 was released, Apple announced Core ML, a new framework that speeds up AI-related operations.
If you are new to machine learning on mobile, Core ML will simplify things a lot when you add a model to your app (literally drag-and-drop setup). It also comes with some domain-specific frameworks – Vision (computer vision algorithms for face, rectangle or text detection, image classification, etc.) and Natural Language.
Core ML and Vision give us the ability to run the inference process with a custom machine learning model. And those models may come from machine learning frameworks like TensorFlow.
In this article, we will see how to convert a TensorFlow model to the CoreML format and how to compare the models side by side.
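To give a rough idea of what that conversion step can look like, here is a minimal sketch using the tfcoreml package; the graph path, tensor names, input shape, and scale are placeholders for your own model, not values taken from the article.

```python
import tfcoreml

# Convert a frozen TensorFlow graph (.pb) to a Core ML model.
# All paths and tensor names below are hypothetical placeholders.
mlmodel = tfcoreml.convert(
    tf_model_path='frozen_graph.pb',            # frozen TF graph
    mlmodel_path='ImageClassifier.mlmodel',     # output Core ML model
    input_name_shape_dict={'input:0': [1, 224, 224, 3]},
    output_feature_names=['Softmax:0'],
    image_input_names='input:0',                # treat the input tensor as an image
    image_scale=1.0 / 255.0                     # match the training preprocessing
)
```

The resulting *.mlmodel file can then be dragged into an Xcode project and inspected there, which is also a convenient place to compare its predictions with the original TensorFlow model.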

Continue reading

Testing TensorFlow Lite image classification model

Testing TensorFlow Lite classification model

Make sure that your ML model works correctly in a mobile app (part 1)

Looking for a way to automatically test a TensorFlow Lite model on a mobile device? Check the 2nd part of this article.

Building TensorFlow Lite models and deploying them in mobile applications is getting simpler over time. But even with easier-to-implement libraries and APIs, there are still at least three major steps to accomplish:

  1. Build a TensorFlow model,
  2. Convert it to a TensorFlow Lite model,
  3. Implement it in the mobile app.

There is a set of information that needs to be passed between those steps – model input/output shape, values format, etc. If you know them (e.g. thanks to the visualizing techniques and tools described in this blog post), there is another problem that many software engineers struggle with.

Why does the model implemented in a mobile app work differently than its counterpart in a Python environment?

Software engineer

In this post, we will try to visualize the differences between TensorFlow, TensorFlow Lite and quantized TensorFlow Lite (with post-training quantization) models. This should help us with early model debugging when something goes really wrong.
Here, we will focus only on the TensorFlow side. It's worth remembering that this doesn't cover mobile app implementation correctness (e.g. bitmap preprocessing and data transformation), which will be described in one of the future posts.
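As a rough illustration of that comparison, here is a minimal sketch assuming a trained tf.keras model (model) and a batch of preprocessed float32 test images (samples); it uses the TensorFlow 2.x converter API, so treat it as a sketch of the idea rather than the exact notebook code.

```python
import numpy as np
import tensorflow as tf

# `model` is a trained tf.keras model; `samples` is a float32 batch of
# preprocessed test images matching the model's input shape.

# 1. Convert to a plain TensorFlow Lite model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# 2. Convert again with post-training quantization enabled.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quantized_model = converter.convert()

def tflite_predict(model_content, batch):
    """Run inference sample by sample through the TFLite interpreter."""
    interpreter = tf.lite.Interpreter(model_content=model_content)
    interpreter.allocate_tensors()
    input_index = interpreter.get_input_details()[0]['index']
    output_index = interpreter.get_output_details()[0]['index']
    outputs = []
    for sample in batch:
        interpreter.set_tensor(input_index, np.expand_dims(sample, axis=0))
        interpreter.invoke()
        outputs.append(interpreter.get_tensor(output_index)[0])
    return np.array(outputs)

tf_out = model.predict(samples)
tflite_out = tflite_predict(tflite_model, samples)
quant_out = tflite_predict(quantized_model, samples)

# The interesting part: how far do the predictions drift apart?
print('TF vs TFLite max diff:   ', np.abs(tf_out - tflite_out).max())
print('TF vs quantized max diff:', np.abs(tf_out - quant_out).max())
```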

Continue reading

Inspecting TensorFlow Lite image classification model

Inspecting TensorFlow Lite image classification model

What to know before implementing a TFLite model in a mobile app

In previous posts, whether about building a machine learning model or using transfer learning to retrain an existing one, we could look closely at its architecture directly in the code. But what if we get a *.tflite model from an external source? How do we know how to handle it properly? In this blog post, we'll take a closer look at what we can do to gather enough knowledge for plugging a TensorFlow Lite image classification model into an Android application.
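For a quick preview of the kind of information we are after, here is a minimal sketch using the Python TFLite interpreter (assuming a recent TensorFlow release where it lives under tf.lite); the model file name is a placeholder.

```python
import tensorflow as tf

# Load a model obtained from an external source; the file name is illustrative.
interpreter = tf.lite.Interpreter(model_path='image_classifier.tflite')
interpreter.allocate_tensors()

# Input details reveal the expected image size and value format
# (e.g. [1, 224, 224, 3] float32, or uint8 for quantized models).
for detail in interpreter.get_input_details():
    print('input: ', detail['name'], detail['shape'], detail['dtype'],
          'quantization:', detail['quantization'])

# Output details reveal the number of classes and the output format.
for detail in interpreter.get_output_details():
    print('output:', detail['name'], detail['shape'], detail['dtype'],
          'quantization:', detail['quantization'])
```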

Continue reading

Traffic signs classification with retrained MobileNet model

Traffic signs classification with retrained MobileNet model

TensorFlow Lite classification model for GTSRB dataset

This post is part of a series about building machine learning solutions in mobile apps. In the previous article, we started by building a simple MNIST classification model on top of TensorFlow Lite. That post is also a good place to start if you are looking for hints on how to set up your very first environment (local with Docker or remote with Colaboratory).

Let’s continue with the basics. If you have spent some time exploring the Internet for machine learning <-> mobile solutions, you have surely found the “TensorFlow for Poets” code labs. If not, they are a good place to start your journey with building more complex vision intelligence for apps.

Those code labs focus on building a very first working solution that can be launched directly on your mobile device. Here, we’ll build something very similar, with some additional explanation that can help with understanding TensorFlow Lite a little bit better.

MobileNet

So what are the code labs and this article about? They all show how to build a convolutional neural network that is optimized for mobile devices, with little effort required to define the structure of the machine learning model. Instead of building it from scratch, we’ll use a technique called transfer learning and retrain MobileNet for our needs.

MobileNet itself is a lightweight neural network used for vision applications on mobile devices. For more technical details and a great visual explanation, please take a look at Matthijs Hollemans’s blog post: Google’s MobileNets on the iPhone (it says “iPhone” 😱, but the first part of the post is fully dedicated to the MobileNet architecture). And if you want even more technical details, the paper titled MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications will be your friend.
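As a rough sketch of the transfer learning idea (not the exact code from the post), the snippet below freezes a tf.keras MobileNet feature extractor and trains a new classification head; train_images and train_labels stand in for a preprocessed GTSRB dataset, and the hyperparameters are illustrative.

```python
import tensorflow as tf

NUM_CLASSES = 43  # GTSRB contains 43 traffic sign classes

# Load MobileNet pretrained on ImageNet, without its classification head.
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights='imagenet')
base.trainable = False  # freeze the convolutional feature extractor

# Add a small classification head trained on the traffic sign dataset.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# train_images / train_labels are assumed to be preprocessed GTSRB data.
model.fit(train_images, train_labels, epochs=5, validation_split=0.1)
```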

Continue reading

TensorFlow Lite classification on Android (with support for TF2.0)

TensorFlow Lite classification on Android (TF2.0 support)

Adding the first Machine Learning model into your mobile app

*** Edit, 23.04.2019 ***

TensorFlow 2.0 experimental support
In the repository, you can find a Jupyter Notebook with the code running on TensorFlow 2.0 alpha, with support for a GPU environment (up to 3 times faster learning process). As this is not yet a stable version, the entire code may break at any moment. The notebook was created just for the Colaboratory environment; it requires some changes to make it work in the Docker environment described in the blog post.
The notebook is available here.

— — —

This is my new series about using machine learning solutions in mobile apps. In contrast to the majority of articles, there will not be much about building layers, training processes, fine-tuning, playing with Google TPUs, or data science in general.
Instead, we’ll focus on understanding how to plug models into apps, use, debug and optimize them, and be effective in cooperating with data scientists and AI engineers.

MNIST

You have surely seen countless examples of how to implement an MNIST classifier. Nevertheless, for the sake of the series’ completeness, I decided to implement it one more time. Maybe it’s not very challenging from an ML perspective, but it’s still a good example of how to work with TensorFlow Lite models in a mobile app.

In this blog post, we’ll create a simple machine learning model that recognizes a handwritten digit presented in an image. The model will be converted to TensorFlow Lite and plugged into an Android application, step by step.
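To preview the Python side of that flow, here is a minimal sketch assuming TensorFlow 2.x with tf.keras; the architecture and the output file name are illustrative rather than the exact ones used in the post.

```python
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1].
(train_x, train_y), (test_x, test_y) = tf.keras.datasets.mnist.load_data()
train_x, test_x = train_x / 255.0, test_x / 255.0

# A tiny classifier: 28x28 grayscale image in, 10 digit probabilities out.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(train_x, train_y, epochs=5)
model.evaluate(test_x, test_y)

# Convert to TensorFlow Lite, ready to be bundled with the Android app.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open('mnist.tflite', 'wb') as f:
    f.write(converter.convert())
```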

Continue reading