It’s a new API for running neural networks on Android that will be added to the Android framework. Its purpose is to provide an abstraction layer over hardware accelerators such as the GPU, DSP, and ISP. Modern smartphones carry powerful computing resources beyond the CPU, such as a GPU or DSP. The DSP in particular is designed for massive amounts of matrix and vector calculations, so it is much faster to run neural network inference on a DSP or GPU than on the CPU. Right now, though, if you want to do that, you have to work directly with the libraries provided by each hardware vendor and build the binaries yourself, which is a tedious task. Instead, Android will provide a standard API so that developers don’t have to be aware of each individual vendor’s hardware accelerators. On top of the Neural Network API, Android will provide TensorFlow Lite, a new TensorFlow runtime optimized for mobile and embedded applications.
TensorFlow Lite is designed to be fast and compact for mobile and embedded applications, and it is designed to work together with the Android Neural Network API. All you have to do is build your model with TensorFlow Lite, and that’s it: you get all the benefits the Android Neural Network API offers, such as hardware acceleration.
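To make the workflow above concrete, here is a minimal sketch in Python of converting a model to the TensorFlow Lite format and running it with the TFLite interpreter. The tiny two-layer network is an arbitrary stand-in for a real model; on a device, you would ship the resulting `.tflite` file and run it with the on-device interpreter, which is where the Neural Network API can take over hardware acceleration.

```python
import numpy as np
import tensorflow as tf

# A tiny stand-in model (hypothetical architecture, for illustration only).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(2),
])

# Convert the Keras model into the compact TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Run inference with the TFLite interpreter. On Android, this step runs
# on-device, and the Neural Network API can dispatch it to a GPU or DSP.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.random.rand(1, 8).astype(np.float32)
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(out["index"])
print(y.shape)  # (1, 2)
```

The key point is that the converted flatbuffer, not the Python code, is what the app ships; the interpreter consumes it unchanged on the device.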
TensorFlow Lite and the Neural Network API will be made available later this year in an update to Android O.