Android: Firebase Machine Learning Kit 101


Note: This is part 1 of 2 articles on Firebase’s ML Kit. This first part is more of an introduction to the topic, while the second part is about building an Android application using ML Kit’s APIs.

With the tremendous growth of interest in machine learning over the last couple of years, it’s only natural for a product like Firebase ML (Machine Learning) Kit to come along. Nowadays, our mobile devices have enough computational power to run machine learning models, a task which used to be difficult (but still doable). Besides, acquiring data of sufficient quality and quantity is tedious, and developing models optimized for mobile usage (in terms of size, computational cost and battery consumption) is not simple at all.

With this being said, ML Kit was built to bring machine learning to mobile (both Android and iOS) in an easy-to-use way (without even requiring prior knowledge of machine learning), yet one that remains powerful and useful. For the moment it is vision-focused and offers the following features:

  • Text recognition
  • Face detection
  • Barcode scanning
  • Image labeling
  • Landmark recognition

And coming soon are two additional features: face contours (able to detect more than 100 points around the face and process them at 60 fps) and smart reply.
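To give a feel for the API before part 2, here is a minimal sketch of on-device text recognition in Kotlin. It assumes the firebase-ml-vision dependency is already set up and that you have a Bitmap to analyze; the helper name analyzeBitmap is mine, not part of the SDK.

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Minimal sketch: run ML Kit's on-device text recognizer on a Bitmap.
// `analyzeBitmap` is a hypothetical helper name, not part of the SDK.
fun analyzeBitmap(bitmap: Bitmap) {
    // Wrap the Bitmap in ML Kit's image representation.
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    // On-device recognizer: free, works offline, runs in real time.
    val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer

    recognizer.processImage(image)
        .addOnSuccessListener { result ->
            // The full recognized text; blocks, lines and elements are also available.
            println(result.text)
        }
        .addOnFailureListener { e ->
            println("Text recognition failed: $e")
        }
}
```

The other vision features follow the same shape: build a FirebaseVisionImage, fetch a detector from FirebaseVision, and attach success/failure listeners to the returned task.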

The Stack

The ML Kit stack is made up of three layers:

  • ML Kit SDK
  • TensorFlow Lite: A set of machine learning tools built from the ground up to be lightweight, used on both mobile and embedded devices.
  • Android Neural Networks API (or Metal on iOS): An API for running computationally intensive machine learning operations, providing a base layer of functionality for higher-level machine learning frameworks (such as TensorFlow Lite in this instance).
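As a developer, you mostly interact with the top layer. A minimal sketch of wiring it up, assuming a Kotlin DSL build script and a project where Firebase is already configured; the version number is illustrative, not a recommendation:

```kotlin
// build.gradle.kts (app module). The version below is illustrative;
// use the latest firebase-ml-vision release for your project.
dependencies {
    implementation("com.google.firebase:firebase-ml-vision:17.0.0")
}
```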

On-device vs Cloud-based APIs

Firebase ML Kit offers two options when working with its APIs. Code-wise, both follow the same pattern, so switching from one to the other is fairly easy (see the sketch after this list).

  • On-device APIs: Free, run on the device in real time and work offline, but offer limited accuracy.
  • Cloud-based APIs: Offer more precision and accuracy in exchange for using some data. Oh! And did I mention they aren’t free (obviously)?
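To illustrate how similar the two flavors are, here is a hypothetical sketch based on the text recognition example from earlier; only the recognizer lookup changes, while the rest of the call stays identical.

```kotlin
import com.google.firebase.ml.vision.FirebaseVision

// On-device recognizer: free, offline-capable.
val onDeviceRecognizer = FirebaseVision.getInstance().onDeviceTextRecognizer

// Cloud-based recognizer: more accurate, requires network access.
val cloudRecognizer = FirebaseVision.getInstance().cloudTextRecognizer

// Both return the same recognizer type, so the processImage(...) call
// and its success/failure listeners stay exactly the same.
```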

Google models vs Custom models

With Firebase ML Kit, you can either rely on the predefined models it already provides or use your own. The first option means that as a developer, you won’t have to deal with any machine learning at all. Using a custom model, on the other hand, means you can upload your model to Firebase and serve it to your users, which keeps your app lighter (in terms of size) and, more importantly, decouples your app releases from your model releases. Besides, coupling ML Kit with other Firebase products such as A/B Testing and Analytics lets you experiment with your models, run tests, gather insights and further develop your product.
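As a rough idea of what serving a hosted model looks like, here is a hypothetical sketch using the custom model API from the firebase-ml-model-interpreter library; "my_model" is a placeholder for a model you have uploaded in the Firebase console, and the exact builder methods may vary across releases.

```kotlin
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource

// Register a model hosted on Firebase; "my_model" is a placeholder name.
val cloudSource = FirebaseCloudModelSource.Builder("my_model")
    .enableModelUpdates(true) // pick up new model versions you publish
    .build()
FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

// Create an interpreter bound to the hosted model.
val options = FirebaseModelOptions.Builder()
    .setCloudModelName("my_model")
    .build()
val interpreter = FirebaseModelInterpreter.getInstance(options)
```

Because the model is downloaded at runtime, publishing a retrained version in the console is enough to roll it out, with no app update required.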

To sum up, Firebase ML Kit offers a powerful, useful and easy-to-use set of machine learning features with image recognition capabilities. In part 2 of this article, I try it out on a sample Android application. Go check it out!

For more about Firebase ML Kit, watch Firebase ML Kit’s Google I/O ’18 session, or read about it on its official page.

For more on Java, Kotlin and Android, follow me to get notified when I write new posts, or let’s connect on GitHub and Twitter!

Written by Husayn Hakeem
Android @ Airbnb. Formerly Google.