Architecture Components: Paging Library

Many applications need to load a lot of information from the database. Database queries can take a long time to run and can use a lot of memory. Android's new Paging Library can help with this.

Main Components of Paging Library


The main components of the Paging Library are PagedListAdapter (which extends RecyclerView.Adapter), PagedList, and DataSource.

Components of the Paging Library

DataSource: The DataSource is the interface a page source implements to provide data gradually. You implement one of its two subclasses, KeyedDataSource or TiledDataSource; the keyed variant is used when you need to load item N based on item N-1.

  • Use KeyedDataSource if you need to use data from item N to fetch item N+1. For example, if you’re fetching threaded comments for a discussion app, you might need to pass the ID of one comment to get the contents of the next comment.
  • Use TiledDataSource if you need to fetch pages of data from any location you choose in your data store. This class supports requesting a set of data items beginning from whatever location you select, like “Return the 20 data items beginning with location 1200”.
  • loadCount(): You also need to implement loadCount(), which tells the library whether you have a finite or an infinite number of items to display in your list.

If you use the Room persistence library to manage your data, it can create the DataSource class for you automatically. For example, here’s a query that returns a TiledDataSource:
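A minimal sketch of such a DAO query, assuming a User entity and the early (alpha) Paging API, where Room generates the TiledDataSource implementation for you:

```java
// Inside a @Dao-annotated interface; Room generates the TiledDataSource for this query.
@Query("SELECT * FROM user WHERE age > :age ORDER BY name DESC, id ASC")
TiledDataSource<User> usersOlderThan(int age);
```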

PagedList: The PagedList is a component that loads data automatically and can provide update signals, for example to the PagedListAdapter. The data is loaded from a DataSource on a background thread and then consumed on the main thread. It supports both infinite-scrolling and countable lists.

You can configure several things: the initial load size, the page size, and the prefetch distance.

PagedListAdapter: This class is an implementation of RecyclerView.Adapter that presents data from a PagedList. For example, when a new page is loaded, the PagedListAdapter signals the RecyclerView that the data has arrived; this lets the RecyclerView replace any placeholders with the actual items, performing the appropriate animation.

The PagedListAdapter also uses a background thread to compute changes from one PagedList to the next (for example, when a database change produces a new PagedList with updated data), and calls the notifyItem…() methods as needed to update the list's contents. RecyclerView then performs the necessary changes. For example, if an item changes position between PagedList versions, the RecyclerView animates that item moving to its new location in the list.

Data Flow


Paging data flow

Let's say we have some data that we put into the DataSource on a background thread. The DataSource invalidates the PagedList and updates its value. Then, on the main thread, the PagedList notifies its observers of the new value, so the PagedListAdapter now knows about it. On a background thread, the PagedListAdapter computes what changed, the diff. Then, back on the UI thread, the View is updated in onBindViewHolder. All of this happens automatically: you just insert an item into the database and see it animated in, with no UI code required.

Paging Library Example

Architecture Components Paging Demo

1. Adding Components to your Project


Architecture Components are available from Google’s Maven repository. To use them, follow these steps:

Open the build.gradle file for your project and add the line as shown below:

Add Architecture Components

In this tutorial, we are using Room, LiveData, and ViewModel.

Open the build.gradle file for your app or module and add the artifacts that you need as dependencies:

2. Create DataSource


Live Paged List Provider

Create Entity

An entity represents a class that holds a database row. For each entity, a database table is created to hold the items. You must reference the entity class in the Database class.
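A minimal User entity sketch (the fields are assumptions for this example):

```java
@Entity
public class User {
    @PrimaryKey(autoGenerate = true)
    public int id;

    public String name;
    public String lastName;
}
```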

Data Access Objects (DAO)

To simplify the connection between the DataSource and the RecyclerView, we can use a LivePagedListProvider. This exposes a LiveData of a PagedList of our users. All you need to do is provide a DataSource, and if that DataSource comes from Room, it is generated for you in the DAO. You don't need to write any invalidation-handling code. You can simply bind the LiveData of a PagedList to a PagedListAdapter and get updates, invalidation, and lifecycle cleanup with a single line of binding code.

So in our user DAO, we return a LivePagedListProvider of User to get the users ordered by last name.
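A DAO sketch using the alpha Paging API, where Room generates the LivePagedListProvider from the query (the method and entity names are assumptions):

```java
@Dao
public interface UserDao {
    // Room generates the paged source; the key type for Room-generated providers is Integer.
    @Query("SELECT * FROM user ORDER BY lastName ASC")
    LivePagedListProvider<Integer, User> getUsers();

    @Insert
    void insert(User user);
}
```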

Create Database

The @Database annotation defines the list of entities, and the class's content defines the list of data access objects (DAOs) in the database. The class is also the main access point for the underlying connection.

The annotated class should be an abstract class that extends RoomDatabase.
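A minimal database sketch tying the entity and DAO together (the class name is an assumption):

```java
@Database(entities = {User.class}, version = 1)
public abstract class UserDb extends RoomDatabase {
    public abstract UserDao userDao();
}
```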

3. Create ViewModel


In the ViewModel, we extend the Architecture Components ViewModel and keep a reference to the LiveData of our PagedList. We get that reference from the DAO by calling getUsers() and then calling create() with the configuration we want, for example setting the page size to 50 and the prefetch distance to 50.
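A ViewModel sketch against the alpha API, where create() takes an initial load key and a PagedList.Config; UserDb.get() is an assumed singleton accessor for the database:

```java
public class UserViewModel extends AndroidViewModel {

    public final LiveData<PagedList<User>> usersList;

    public UserViewModel(Application application) {
        super(application);
        UserDao userDao = UserDb.get(application).userDao(); // assumed singleton accessor
        usersList = userDao.getUsers().create(
                /* initial load key */ 0,
                new PagedList.Config.Builder()
                        .setPageSize(50)
                        .setPrefetchDistance(50)
                        .build());
    }
}
```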

In onCreate(), we get a reference to our ViewModel, get a reference to the RecyclerView, and create our adapter.
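An Activity sketch of that wiring; the layout and view IDs are assumptions, and setList() is the alpha-era adapter method (later versions of the library call it submitList()):

```java
public class UserActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_user);        // assumed layout

        UserViewModel viewModel = ViewModelProviders.of(this).get(UserViewModel.class);
        RecyclerView recyclerView = (RecyclerView) findViewById(R.id.user_list); // assumed ID
        final UserAdapter adapter = new UserAdapter();
        recyclerView.setAdapter(adapter);

        // Every new PagedList emitted by the LiveData is handed to the adapter.
        viewModel.usersList.observe(this, new Observer<PagedList<User>>() {
            @Override
            public void onChanged(@Nullable PagedList<User> users) {
                adapter.setList(users);
            }
        });
    }
}
```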

4. Create Adapter


To tell the PagedListAdapter how to compute the difference between two elements, you'll need to implement a new class, a DiffCallback. Here, you will define two things.

You define how to determine whether the contents are the same, and how to determine whether the items themselves are the same.

Let's look at the adapter. Our adapter extends PagedListAdapter and connects the User, the information being displayed, with the UserViewHolder.

We define the callback, DIFF_CALLBACK, for our User objects, and then in onBindViewHolder all we need to do is bind the item to the ViewHolder. That's all.
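An adapter sketch against the alpha API (DiffCallback was later folded into DiffUtil.ItemCallback); UserViewHolder and its create()/bindTo()/clear() methods are assumptions, and User is assumed to implement equals():

```java
public class UserAdapter extends PagedListAdapter<User, UserViewHolder> {

    public UserAdapter() {
        super(DIFF_CALLBACK);
    }

    @Override
    public UserViewHolder onCreateViewHolder(ViewGroup parent, int viewType) {
        return UserViewHolder.create(parent);
    }

    @Override
    public void onBindViewHolder(UserViewHolder holder, int position) {
        User user = getItem(position);
        if (user != null) {
            holder.bindTo(user);   // bind the item to the ViewHolder
        } else {
            holder.clear();        // null means this position is still a placeholder
        }
    }

    private static final DiffCallback<User> DIFF_CALLBACK = new DiffCallback<User>() {
        @Override
        public boolean areItemsTheSame(@NonNull User oldUser, @NonNull User newUser) {
            return oldUser.id == newUser.id;       // same database row?
        }

        @Override
        public boolean areContentsTheSame(@NonNull User oldUser, @NonNull User newUser) {
            return oldUser.equals(newUser);        // same displayed contents?
        }
    };
}
```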

Download this project from GitHub

 

Conclusion

Android has a lot of new concepts and components with Architecture Components, but you can use them separately. If you want, you can use only Lifecycle, LiveData, and PagedList, or only ViewModel, or only Room. You can also use them together. Start using the Architecture Components to create a more testable architecture for your application.

 

Autosizing TextViews Using Support Library 26.0

Material Design recommends using dynamic type instead of smaller type sizes or truncating larger text. Android makes this much easier to implement with the introduction of TextView autosizing. With Android O and Support Library 26.0, TextView gains a new autoSizeTextType property, which allows you to optimize the text size when working with dynamic content.

Autosizing TextViews

Adding support library dependency


The Support Library 26.0 provides full support to the auto sizing TextView feature on devices running Android versions prior to Android 8.0 (API level 26). The android.support.v4.widget package contains the TextViewCompat class to access features in a backward-compatible fashion.

Support Library 26 has been moved to Google's Maven repository, so first include that repository in your project-level build.gradle.

Add the support library in your app-level build.gradle.

Enable Autosizing


To enable autosizing in XML, set autoSizeTextType to uniform. This scales the text uniformly on the horizontal and vertical axes, ignoring the text size attribute. When using the support library, make sure you use the app namespace. Note that you shouldn't use wrap_content for the layout width or height of a TextView set to autosize, since it may produce unexpected results. Instead, use match_parent or a fixed size.

Turn off autosizing by selecting none instead of uniform. You can also set up autosizing programmatically, like this.

Provide an instance of the TextView widget and one of the text types, such as TextViewCompat.AUTO_SIZE_TEXT_TYPE_NONE or TextViewCompat.AUTO_SIZE_TEXT_TYPE_UNIFORM.
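A minimal sketch of the programmatic call via the support library (R.id.sample_text is an assumed view ID):

```java
TextView textView = (TextView) findViewById(R.id.sample_text); // assumed view ID

// Enable uniform autosizing with the default min/max/granularity values.
TextViewCompat.setAutoSizeTextTypeWithDefaults(
        textView, TextViewCompat.AUTO_SIZE_TEXT_TYPE_UNIFORM);

// Or turn it off again.
TextViewCompat.setAutoSizeTextTypeWithDefaults(
        textView, TextViewCompat.AUTO_SIZE_TEXT_TYPE_NONE);
```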

Customize TextView

If you want to customize your TextView further, there are extra attributes for the minimum and maximum text size and the step granularity. The TextView scales uniformly in the range between the minimum and maximum size, in increments of the step granularity. If you don't set these properties, the default values are used.

To define a range of text sizes and a dimension in XML, use the app namespace and set the following attributes:

 

To define a range of text sizes and a dimension programmatically, call the setAutoSizeTextTypeUniformWithConfiguration(int autoSizeMinTextSize, int autoSizeMaxTextSize, int autoSizeStepGranularity, int unit) method. Provide the maximum value, the minimum value, the granularity value, and any TypedValue dimension unit.
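For example, through the support library (reusing the textView from above; the size values are just illustrative):

```java
TextViewCompat.setAutoSizeTextTypeUniformWithConfiguration(
        textView,
        /* autoSizeMinTextSize     */ 12,
        /* autoSizeMaxTextSize     */ 100,
        /* autoSizeStepGranularity */ 2,
        TypedValue.COMPLEX_UNIT_SP);
```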

Preset Sizes

To have more control over the final size, for example if your app needs to comply with specific text-size design guidelines, you can provide a list of sizes, and the TextView will use the largest one that fits.

Create an array with the sizes in your resources and then set the autoSizePresetSizes attribute in the XML.

To use preset sizes to set up the autosizing of a TextView programmatically through the support library, call the TextViewCompat.setAutoSizeTextTypeUniformWithPresetSizes(TextView textView, int[] presetSizes, int unit) method. Provide an instance of the TextView class, an array of sizes, and any TypedValue dimension unit for the sizes.
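For example (the preset size values are just illustrative):

```java
int[] presetSizes = new int[]{12, 16, 20, 24, 32};   // sizes in sp

TextViewCompat.setAutoSizeTextTypeUniformWithPresetSizes(
        textView, presetSizes, TypedValue.COMPLEX_UNIT_SP);
```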

 

 

How to Create an Instant App from an Existing App

Android Instant Apps allows Android users to run your apps instantly, without installation. Users can get to your flagship Android experience from any URL, including search, social media, messaging, and other deep links, without needing to install your app first. Android Instant Apps supports the latest Android devices, from Android 6.0 through Android O, and Google will be rolling out support to more devices soon, including Android 5.0 (API level 21) devices.

How does Instant app work?


When Google Play receives a request for a URL that matches an instant app, it sends the necessary code files to the Android device that sent the request. The device then runs the app.

How does Instant app work

Structure of the Instant App


  • Base feature module: The fundamental module of your instant app is the base feature module. All other feature modules must depend on the base feature module. The base feature module contains shared resources, such as activities, fragments, and layout files. When built into an instant app, this module builds a feature APK. When built into an installed app, the base feature module produces an AAR file.
  • Features: At a very basic level, apps have at least one feature or thing that they do: find a location on a map, send an email, or read the daily news as examples. Many apps provide multiple features.
  • Feature modules: To provide this on-demand downloading of features, you need to break up your app into smaller modules and refactor them into feature modules.
  • Feature APKs: Each feature APK is built from a feature module in your project and can be downloaded on demand by the user and launched as an instant app.

 

Each feature within the instant app should have at least one Activity that acts as the entry-point for that feature. An entry-point activity hosts the UI for the feature and defines the overall user flow. When users launch the feature on their device, the entry-point activity is what they see first. A feature can have more than one entry-point activity, but it only needs one.

Structure of Instant App

 

As you see in the figure, both “Feature 1” and “Feature 2” depend on the base feature module. In turn, both the instant and installed app modules depend on the feature 1 and feature 2 modules. All three feature modules shown in the figure (base feature, feature 1, and feature 2) have the com.android.feature plugin applied to their build configuration files.

Upgrade Your Existing App

Android Instant Apps functionality is an upgrade to your existing Android app, not a new, separate app. It’s the same Android APIs, the same project, and the same source code. Android Studio provides the tools you need to modularize your app so that users load only the portion of the instant app that they need when they need it.

Step 1: Develop a use case for your instant App


Focus on a core user experience that completes a specific action and optimizes a key business metric besides app installs. Then review the user experience guidelines for Android Instant Apps.

Step 2: Set up your development Environment


To develop an instant app, you need the following:

Install Instant App SDK

To build an Android Instant App, we need to install the SDK. Go to Tools > Android > SDK Manager, click the “SDK Tools” tab, and install the “Instant Apps Development SDK” by checking the box and clicking “Apply”.

Install Instant App SDK

Set up your device or emulator

You can develop instant apps on the following devices and emulators:

  • Devices: Nexus 5X, Nexus 6P, Pixel, Pixel XL, Galaxy S7 running Android 6.0 or higher.
  • Emulator: Nexus 5X image running Android 6.0 (API level 23), x86, with Google APIs (you cannot use x86_64 architectures).

Step 3: Moving existing code into a feature module


In this step, we will convert the existing application module into a shareable feature module. We will then create a minimal application module that depends on the newly formed feature. Note that this feature module will be included in the Instant App build targets later.

Convert the app module into a feature module called app-base

We start by renaming the module from 'app' to 'app-base':

create base module

Change Module type

Next, in the app-base/build.gradle file, we change the module type to a feature module by changing the plugin from com.android.application to com.android.feature, and we also remove applicationId, because this is no longer an application module:

Specify the base feature in the project's app-base/build.gradle:

Synchronize the Gradle files and rebuild the project with Build -> Rebuild Project.

Step 4: Create the appapk module to build the APK file


Now that we have transformed our source code into a reusable library module, we can create a minimal application module that will create the APK. Go to File -> New Module.

Create New Module

Enter the application name “app apk” and accept the suggested module name (appapk).

If your project uses Data Binding, you need to ensure that appapk/build.gradle includes the following in the android { ... } section.

Replace the compile dependencies in appapk/build.gradle:

Switch to the “Project” view and remove unused files:

Instant app remove unused folder

Switch back to the “Android” view and remove the application element from appapk/src/main/AndroidManifest.xml. It should contain only the single manifest element.

Finally, sync the Gradle files, then rebuild and run the project. The application should behave exactly the same despite all of our changes.

Create Feature module

We have just moved the application’s core functionality into a shareable feature module and we are now ready to start adding in the Instant App modules.

Step 5: Creating the instant app APK


Instant Apps uses feature APKs to break an app up into smaller, feature-focused modules. One way to look at an instant app is as a collection of these feature APKs. When the user launches a URL, Instant Apps delivers only the feature APKs necessary to provide the functionality for that URL.

The app-base module builds a feature APK that encompasses the full functionality of our app. We will create an Instant App module that bundles this single feature APK. At the end, we are going to have a single-feature instant app!

single feature instant app

The Instant App module is merely a wrapper for all the feature modules in your project. It should not contain any code or resources.

Create an Instant App module

Select File -> New -> New Module
Create Instant App

Next, we need to update the instant app Gradle file to depend on the base feature module. At this point we also need to add buildToolsVersion to explicitly use 26.0.1, since an issue in the current Android preview makes the default 26.0.0. Since we've installed 26.0.1 with the project, we don't want to fetch 26.0.0 unnecessarily.

instantapp/build.gradle

The instant app module does not hold any code or resources. It contains only a build.gradle file.

Now do a clean rebuild: Build -> Rebuild project.

Step 6: Defining App Links


App Links are required in instant apps because URLs are the only way a user can launch an instant app (instant apps are not installed). Each entry-point activity in an instant app needs to be addressable: it needs to correspond to a unique URL address. If the URL addresses for the features in an instant app share a domain, each feature needs to correspond to a different path within that domain.

In order to enable the Instant App runtime to call your application, we must establish the relationship between your website and the app. To associate links, we will use a new feature built into Android Studio called the “App Links Assistant”. Invoke this tool through the Android Studio “Tools” menu:

App Links Assistant

From the sidebar, click the “Open URL Mapping Editor” button:

add url intent filters

From the editor, click the “+” button to add a new app link entry:

Create a new URL mapping with the following details:

Host: http://topeka.samples.androidinstantapps.com

Path (pathPattern): /signin

Activity: activity.SigninActivity (app-base)

map url to activity

Repeat this dialog for the https variation, as well as for the other links:

In the end, you should have 4 mappings like this:

Url to Activity mapping

Sync gradle files if required and rebuild the project.

Again, since we have not used the Android Studio wizard, the run configuration for the instant app is not valid; we need to define the URL before we can launch the instant app from the IDE.

Click the Run configuration dropdown and choose “Edit Configurations…”

Select instantapp under Android App.

Edit Configuration

Replace the text ‘<< ERROR – NO URL SET>>’ with https://topeka.samples.androidinstantapps.com/signin

To run your Instant App, select instantapp from the Run configuration dropdown and click Run:

Run Instant APP

 

Now, you have created and deployed an Android Instant App. You took an existing Android application and restructured it to build both a full installed APK and an Instant APK that will be loaded when the user taps on the associated URLs.

Conclusion


You may need to separate the code into multiple features for various reasons; the most obvious one is feature separation, to let users download only the relevant portions of the app. Later, you can create another feature module and move all the UI code there (activities and related fragments), which gives you two features (app-base and app-ui).

 

 

Speech Recognition Using TensorFlow

This tutorial will show you how to run a simple speech recognition model built by the TensorFlow audio training tutorial. It listens for a small set of words and displays them in the UI when they are recognized.

It's important to know that real speech and audio recognition systems are much more complex, but, like MNIST for images, this should give you a basic understanding of the techniques involved. Once you've completed this tutorial, you'll have an application that tries to classify a one-second audio clip as either silence, an unknown word, “yes”, “no”, “up”, “down”, “left”, “right”, “on”, “off”, “stop”, or “go”.

TensorFlow speech recognition model

1. Preparation


You can train your model on a desktop, a laptop, or a server, and then use that pre-trained model on your mobile device. No training happens on the device; the training happens on a bigger machine, either a server or your laptop. You can download a pretrained model from tensorflow.org.

2. Adding Dependencies


The TensorFlow Inference Interface is available as a JCenter package and can be included quite simply in your Android project with a couple of lines in the project's build.gradle file:

Add the following dependency in app’s build.gradle

This will tell Gradle to use the latest version of the TensorFlow AAR that has been released to https://bintray.com/google/tensorflow/tensorflow-android. You may replace the + with an explicit version label if you wish to use a specific release of TensorFlow in your app.

3. Add Pre-trained Model to Project


You need the pre-trained model and the label file. You can download the model from here. Unzip the file, and you will get conv_actions_labels.txt (the labels) and conv_actions_frozen.pb (the pre-trained model).

Put conv_actions_labels.txt and conv_actions_frozen.pb into the android/assets directory.

4. Microphone Permission


To use the microphone, you should request the RECORD_AUDIO permission in your manifest file as below:

Since Android 6.0 Marshmallow, the application is not granted permissions at installation time. Instead, it has to ask the user for each permission at runtime.
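A minimal runtime-request sketch inside an Activity (the request code is an arbitrary assumption):

```java
private static final int REQUEST_RECORD_AUDIO = 13; // arbitrary request code

private void requestMicrophonePermission() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.RECORD_AUDIO}, REQUEST_RECORD_AUDIO);
    }
}
```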

5. Recording Audio


The AudioRecord class manages the audio resources for Java applications to record audio from the audio input hardware of the platform. This is achieved by “pulling” (reading) the data from the AudioRecord object. The application is responsible for polling the AudioRecord object in time using read(short[], int, int).
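A recording-loop sketch; the 16 kHz mono 16-bit PCM format matches the one-second clips the model expects, and shouldRecord is an assumed flag managed elsewhere:

```java
private static final int SAMPLE_RATE = 16000;

private void recordAudio() {
    int bufferSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, SAMPLE_RATE,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
    short[] audioBuffer = new short[bufferSize / 2];

    record.startRecording();
    while (shouldRecord) {
        int read = record.read(audioBuffer, 0, audioBuffer.length);
        // Copy 'read' samples into the rolling buffer that feeds the recognizer.
    }
    record.stop();
    record.release();
}
```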

6. Run TensorFlow Model


The TensorFlowInferenceInterface class provides a smaller API surface, suitable for inference and for summarizing the performance of model execution.
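A sketch of feeding one second of audio into the frozen graph; the node names (“decoded_sample_data:0”, “decoded_sample_data:1”, “labels_softmax”) match the stock speech commands model and may differ if you train your own, and RECORDING_LENGTH, SAMPLE_RATE, and labels are assumed to be defined as in the demo:

```java
TensorFlowInferenceInterface inferenceInterface =
        new TensorFlowInferenceInterface(getAssets(), "file:///android_asset/conv_actions_frozen.pb");

float[] floatInputBuffer = new float[RECORDING_LENGTH]; // one second of samples scaled to [-1, 1]
int[] sampleRateList = new int[]{SAMPLE_RATE};
float[] outputScores = new float[labels.size()];

inferenceInterface.feed("decoded_sample_data:0", floatInputBuffer, RECORDING_LENGTH, 1);
inferenceInterface.feed("decoded_sample_data:1", sampleRateList);
inferenceInterface.run(new String[]{"labels_softmax"});
inferenceInterface.fetch("labels_softmax", outputScores);
```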

7. Recognize Commands


The RecognizeCommands class is fed the output of running the TensorFlow model over time; it averages the signals and returns information about a label when it has enough evidence to think that a recognized word has been found. The implementation is fairly small, just keeping track of the last few predictions and averaging them.
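An illustrative sketch of that idea only, not the demo's actual RecognizeCommands class; the window size and threshold are assumptions:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

class SimpleCommandRecognizer {
    private static final int MAX_RESULTS_TO_AVERAGE = 3;   // assumed window size
    private static final float DETECTION_THRESHOLD = 0.7f; // assumed confidence threshold

    private final List<String> labels;
    private final Deque<float[]> recentScores = new ArrayDeque<>();

    SimpleCommandRecognizer(List<String> labels) {
        this.labels = labels;
    }

    /** Returns a label once its averaged score passes the threshold, or null otherwise. */
    String processLatestResults(float[] currentScores) {
        recentScores.addLast(currentScores.clone());
        if (recentScores.size() > MAX_RESULTS_TO_AVERAGE) {
            recentScores.removeFirst();
        }

        // Average the last few score arrays.
        float[] averaged = new float[currentScores.length];
        for (float[] scores : recentScores) {
            for (int i = 0; i < scores.length; i++) {
                averaged[i] += scores[i] / recentScores.size();
            }
        }

        // Pick the highest-scoring label and report it only above the threshold.
        int best = 0;
        for (int i = 1; i < averaged.length; i++) {
            if (averaged[i] > averaged[best]) {
                best = i;
            }
        }
        return averaged[best] > DETECTION_THRESHOLD ? labels.get(best) : null;
    }
}
```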

The demo app updates its UI of results automatically based on the labels text file you copy into assets alongside your frozen graph, which means you can easily try out different models without making any code changes. You will, however, need to update LABEL_FILENAME and MODEL_FILENAME to point to the new files if you change the paths.

8. Conclusion


You can easily replace the model with one you've trained yourself. If you do this, you'll need to make sure that the constants in the MainActivity Java source file, like SAMPLE_RATE and SAMPLE_DURATION, match any changes you've made to the defaults while training. You'll also see that there's a Java version of the RecognizeCommands module that's very similar to the C++ version in this tutorial. If you've tweaked parameters for that, you can also update them in MainActivity to get the same results as in your server testing.

 

Download this project from GitHub

 

Related Post

Android TensorFlow Machine Learning

Google Cloud Speech API in Android APP

 

 

 

EmojiCompat Support Library

Have you seen ☐, the blank square character called “tofu”, when an app can't display an emoji? New emojis are constantly being added to the Unicode standard, but since they are bundled as a font, your phone's emojis are set in stone with each OS release. Well, they were.

With the EmojiCompat library (part of Support Library 26), your app can get backward-compatible emoji support on devices with API level 19+ and get rid of tofu.

How does EmojiCompat work?

EmojiCompat Process

For a given CharSequence, EmojiCompat can identify the emojis, replace them with EmojiSpans, and then render the glyphs. On versions prior to API level 19, you'll still get the tofu characters.

EmojiCompat builds on the new downloadable fonts mechanism to make sure you always have the latest emoji available.

Downloadable fonts configuration

The downloadable fonts configuration uses the Downloadable Fonts support library feature to download an emoji font. It also updates the necessary emoji metadata that the EmojiCompat support library needs to keep up with the latest versions of the Unicode specification.

Adding support library dependency

Support Library 26 has been moved to Google's Maven repository, so first include that repository in your project-level build.gradle.

Add the support library in your app-level build.gradle.

Adding certificates

When a font provider is not preinstalled or if you are using the support library, you must declare the certificates the font provider is signed with. The system uses the certificates to verify the font provider’s identity.

Create a string array with the certificate details.

Initializing the downloadable font configuration

Before you use EmojiCompat, the library needs a one-time asynchronous setup (for example, in your Application class).

For a downloadable font configuration, create your FontRequest and the FontRequestEmojiCompatConfig object.

Depending on how you use it, initialization takes at least 150 ms, or even up to a few seconds, so you might want to be notified about its state. For this, use the registerInitCallback() method.
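A setup sketch for the Application class, assuming the Google Play services font provider and the certificate array from the “Adding certificates” step (the R.array.com_google_android_gms_fonts_certs resource name and TAG are assumptions):

```java
FontRequest fontRequest = new FontRequest(
        "com.google.android.gms.fonts",              // provider authority
        "com.google.android.gms",                    // provider package
        "Noto Color Emoji Compat",                   // font query
        R.array.com_google_android_gms_fonts_certs); // certificates declared earlier

EmojiCompat.Config config = new FontRequestEmojiCompatConfig(getApplicationContext(), fontRequest)
        .setReplaceAll(true)
        .registerInitCallback(new EmojiCompat.InitCallback() {
            @Override
            public void onInitialized() {
                Log.i(TAG, "EmojiCompat initialized");
            }

            @Override
            public void onFailed(@Nullable Throwable throwable) {
                Log.e(TAG, "EmojiCompat initialization failed", throwable);
            }
        });

EmojiCompat.init(config);
```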

Use EmojiCompat widgets in layout XMLs. If you are using AppCompat, refer to the Using EmojiCompat widgets with AppCompat section.

You can preprocess a CharSequence using the process() method. You can then reuse the result, instead of the initial string, in any widget that can render Spanned instances. So, for example, if you're doing your own custom drawing, you can use this to display emoji text.
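A minimal sketch (call this only after initialization has succeeded):

```java
// Replaces emoji in the text with EmojiSpans; the result can be drawn or set
// on any widget that renders Spanned text.
CharSequence processed = EmojiCompat.get().process("Neutral face \uD83D\uDE10");
```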

 

Download this project from GitHub

Fast Scrolling in RecyclerView Using Support Library 26

In ListView, you could enable a fast scroller, which allowed you to drag a scrollbar to scroll easily to wherever you wished, using the fastScrollEnabled attribute. With Support Library 26, you can now also easily enable fast scrolling for RecyclerView.

Fast Scrolling RecyclerView

 Add Dependency

Support Library 26 has been moved to Google's Maven repository, so first include that repository in your project-level build.gradle.

Add Support Library 26 in your app-level build.gradle.

Enable Fast Scrolling

If the fastScrollEnabled boolean flag is enabled for a RecyclerView, then fastScrollHorizontalThumbDrawable, fastScrollHorizontalTrackDrawable, fastScrollVerticalThumbDrawable, and fastScrollVerticalTrackDrawable must all be set.

  • fastScrollEnabled : boolean value that enables fast scrolling. Setting this to true requires that you also provide the following four drawables.
  • fastScrollHorizontalThumbDrawable : a StateListDrawable used to draw the thumb that can be dragged along the horizontal axis.
  • fastScrollHorizontalTrackDrawable : a StateListDrawable used to draw the line that represents the scrollbar on the horizontal axis.
  • fastScrollVerticalThumbDrawable : a StateListDrawable used to draw the thumb that can be dragged along the vertical axis.
  • fastScrollVerticalTrackDrawable : a StateListDrawable used to draw the line that represents the scrollbar on the vertical axis.

 

Now, create the StateListDrawables from native shapes.

line.xml

line_drawable.xml

thumb.xml

thumb_drawable.xml

 

Now create the RecyclerView and enable fastScrollEnabled.

 

Download this project from GitHub