What is TensorFlow Lite and what can it do for us?



TensorFlow Lite is basically a toolkit for running ML inference on-device.
The workflow has three steps:

  1. You take a trained ML model.
  2. Use the TensorFlow Lite converter to convert it to a compact format that runs efficiently on mobile or embedded devices.
  3. Then the TensorFlow Lite interpreter loads that converted model and runs inference for you on the device, as the sketch after this list shows.
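Here is a minimal sketch of those three steps in Python. The SavedModel directory "saved_model/" and the file names are hypothetical stand-ins for your own model:

```python
import numpy as np
import tensorflow as tf

# 1. Take the trained model (here, loaded from a SavedModel directory).
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")

# 2. Convert it to the compact FlatBuffer format TensorFlow Lite uses.
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# 3. Run inference with the TensorFlow Lite interpreter.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input shaped like the model's first input tensor.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```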


Note: You can also use the existing collection of pre-trained models.
If you don't have experience training models in Python (or whatever language you prefer), you can just take one of the off-the-shelf models and use it in your solution.


@ Which platforms can we run TensorFlow Lite on?

TensorFlow Lite is designed to run ML inference across mobile and embedded platforms:
Android and iOS (Java, Kotlin, Swift, Objective-C), C++ for tiny microcontrollers, and Raspberry Pi (embedded Linux).

These models have been heavily optimized to run on these deployment targets.
A lot of work, such as model pruning and quantization, has gone into keeping models accurate and fast while shrinking their binary size; see the sketch below.
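For example, post-training quantization is a one-line flag on the converter. A hedged sketch, again assuming the hypothetical "saved_model/" directory from above:

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")

# Ask the converter to apply its default optimizations, which quantizes
# the weights and typically shrinks the model to roughly a quarter of
# its float32 size with little accuracy loss.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_quant_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_quant_model)
```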

@ Can we use TensorFlow Lite on GPU and TPU hardware?

- TensorFlow Lite makes it easy to run inference on GPUs in mobile devices.
- It also supports the Edge TPU, a hardware accelerator designed by Google for doing inference at the edge extremely fast (coral.google.com); see the sketch after this list.
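A minimal sketch of handing inference to a Coral Edge TPU from Python, assuming a model already compiled for the Edge TPU (the file name "model_edgetpu.tflite" is a hypothetical placeholder) and the libedgetpu runtime installed as described on coral.google.com:

```python
import tensorflow as tf

interpreter = tf.lite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[
        # The delegate routes supported ops to the Edge TPU accelerator.
        tf.lite.experimental.load_delegate("libedgetpu.so.1")
    ],
)
interpreter.allocate_tensors()
# From here, set_tensor / invoke / get_tensor work exactly as on CPU.
```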

Example: On Pixel phones, emoji and text prediction has gotten much better recently.

When we think about ML, we usually think about large models like CNNs for image classification.
When we want to run such a model on resource-constrained hardware like a mobile device, we have to take care that it runs efficiently, so the model needs to be optimized under those constraints. All of this is possible in TensorFlow Lite, as the sketch below illustrates.
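As a concrete, hedged end-to-end sketch: converting a stock image-classification CNN (Keras MobileNetV2 here, chosen purely for illustration) with quantization driven by a representative dataset. The random samples below are a stand-in; in practice you would feed a few hundred real preprocessed images:

```python
import numpy as np
import tensorflow as tf

# A well-known CNN for image classification, loaded with ImageNet weights.
model = tf.keras.applications.MobileNetV2(weights="imagenet")

def representative_dataset():
    # Calibration samples let the converter choose quantization ranges;
    # random data here is only a placeholder for real images.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
tflite_model = converter.convert()

with open("mobilenet_quant.tflite", "wb") as f:
    f.write(tflite_model)
```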









