Room Database Migration

When you upgrade your Android application, you often need to change its data model. When the model is stored in a SQLite database, its schema must be updated as well.
In a real application, migration is essential, because you want to retain users' existing data when they upgrade the app.

The Room persistence library allows you to write Migration classes to preserve user data in this manner. Each Migration class specifies a startVersion and an endVersion. At runtime, Room runs each Migration class's migrate() method in the correct order to migrate the database to a later version.

A migration can handle more than one version (e.g., if you have a faster path to take when going from version 3 to 5 without going through version 4). If Room opens a database at version 3 and the latest version is >= 5, Room will use the migration object that can migrate from 3 to 5 instead of 3 to 4 and then 4 to 5.

If there are not enough migrations provided to move from the current version to the latest version, Room will clear the database and recreate it. So even if there are no schema changes between two versions, you should still provide a Migration object to the builder.

Create New Entity Or Add New Columns 

The following code snippet shows how to define an entity:
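The Java snippet itself is not reproduced here. As an illustration of the kind of schema change a Migration's migrate() method executes, here is a sketch using Python's sqlite3 module; the users table and age column are hypothetical:

```python
import sqlite3

# Version 1 schema: a simple "users" table (hypothetical example).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
db.execute("INSERT INTO users (name) VALUES ('alice')")

# The statement a Migration(1, 2) would run inside migrate():
# add a new column without touching existing rows.
db.execute("ALTER TABLE users ADD COLUMN age INTEGER NOT NULL DEFAULT 0")

# Existing data survives the schema change.
row = db.execute("SELECT name, age FROM users").fetchone()
```

In Room itself, the ALTER TABLE statement would be issued via database.execSQL() inside a Migration(1, 2) object passed to Room.databaseBuilder(...).addMigrations(...).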

The migrate() method is already called inside a transaction, and that transaction might actually be a composite transaction of all the necessary Migrations.

After the migration process finishes, Room validates the schema to ensure that the migration occurred correctly. If Room finds a problem, it throws an exception that contains the mismatched information.


Related Post

Room Persistence Library

How to use DateTime datatype in SQLite Using Room

Room: Database Relationships


TensorFlow Lite

What is TensorFlow?

If you want to implement machine-learning or AI-powered applications on mobile phones, the easiest and fastest way may be to use TensorFlow, the open-source library for machine learning. TensorFlow is Google's standard framework for building new ML and AI products; it was created by the Google Brain team and open-sourced by Google in 2015. TensorFlow is scalable and portable: you can get started by downloading the TensorFlow code on your laptop and trying out some sample code, and then move your models to production-level use cases using GPUs. After training, you can bring the resulting model, which consists of tens of megabytes of data, to mobile and embedded systems.

Neural Network for Mobile

If you want to bring TensorFlow into your mobile applications, there are some challenges you have to face. Neural networks are big compared with other classic machine-learning models, because deep learning uses multiple layers, so the total number of parameters and the amount of calculation can be large. For example, Inception V3, one of the popular image-classification models, requires 91 MB, and using TensorFlow without any changes adds about 12 MB of binary code by default. If you want to bring your mobile application to production, you don't want users downloading 100 MB; you may want to compress everything down to roughly 10-20 MB. So Google had to think about optimizations for mobile applications: freezing the graph, quantization, memory mapping, and selective registration.

Freeze Graph

Freezing the graph means removing all the variables from the TensorFlow graph and converting them into constants. TensorFlow holds the weights and biases, the parameters inside the neural network, as variables, because you want to train the network on its training data; but once you have finished training, you no longer need those parameters as variables and can put everything into constants. By converting from variables to constants, you get much faster loading time.

Quantization in TensorFlow

Quantization is another optimization you can apply for mobile. Quantization means compressing the precision of each parameter, the weights and biases, into fewer bits. For example, by default TensorFlow uses 32-bit floating-point numbers to represent weights and biases, but with quantization you can compress them into 8-bit integers. Using 8-bit integers shrinks the size of the parameters much, much smaller, and especially for embedded or mobile systems it is important to use integer numbers rather than floating-point numbers for calculations such as the multiplications and additions between matrices and vectors, because floating-point hardware requires a much larger footprint in implementation. TensorFlow already provides primitive data types to support quantization of parameters and operations: quantizing, de-quantizing, and operations that support quantized variables.
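As a rough illustration of the arithmetic involved (this is a generic linear-quantization sketch, not TensorFlow's exact scheme), 32-bit floats can be mapped onto an 8-bit grid and back:

```python
# Minimal linear quantization sketch: map float weights onto
# an 8-bit integer grid and approximately recover them.
weights = [-1.5, -0.2, 0.0, 0.7, 1.5]

lo, hi = min(weights), max(weights)
scale = (hi - lo) / 255.0           # width of one step on the 8-bit grid

def quantize(x):
    """Float -> integer in 0..255."""
    return round((x - lo) / scale)

def dequantize(q):
    """Integer in 0..255 -> approximate float."""
    return lo + q * scale

quantized = [quantize(x) for x in weights]
restored = [dequantize(q) for q in quantized]

# Rounding loses at most half a step per value.
max_error = max(abs(a - b) for a, b in zip(weights, restored))
```

Each value now fits in one byte instead of four, at the cost of a bounded rounding error.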

What is TensorFlow Lite?

We know that machine learning adds great power to your mobile application, and with great power comes great responsibility. TensorFlow Lite is a lightweight ML library for mobile and embedded devices. TensorFlow works well on large devices, while TensorFlow Lite works really well on small devices: it is easier, faster, and smaller to work with on mobile devices.

What is the difference between TensorFlow Mobile and TensorFlow Lite?

You should view TensorFlow Lite as an evolution of TensorFlow Mobile; it is like the next generation, created to be really small in size and optimized for smaller devices.

TensorFlow Lite came with three goals. It wants a very small memory and binary size, keeping the binary small even without selective registration. It wants to make sure the overhead latency is also really small: you really can't wait 30 seconds for an inference to happen by the time the model is downloaded and processed. And quantization is a first-class citizen: it supports quantization, and many of the supported models are quantized models.

TensorFlow Lite Architecture
TensorFlow Lite architecture

This is the high-level architecture. As you can see, it is a simplified architecture that works for both Android and iOS. It is lightweight, performs better, and leverages hardware acceleration if available.

To better understand how to write a model, let's consider how to build one using TensorFlow Lite. There are two sides, the workstation side and the mobile side; let's walk through the complete lifecycle.
TensorFlow Lite lifecycle

The first step is to decide what model you want to use. If you want to use an already pre-trained model, you can skip this step because the model generation is already done. One option is to use a pre-trained model; the other option would be to retrain just the last layers, as in the previous post. You can also write your own custom model, train it, and generate a graph. This is nothing specific to TensorFlow Lite; it is standard TensorFlow, where you build a model and generate GraphDefs and checkpoints.

The next step, specific to TensorFlow Lite, is to convert the generated model into a format that TensorFlow Lite understands. A prerequisite to converting is to freeze the graph. The checkpoints have the weights, and the GraphDef has the variables and tensors; freezing the graph is the step where you combine these two and feed the result to the converter, which is provided as part of the TensorFlow Lite software. You can use it to convert your model into the format needed. Once the conversion step is completed, you will have what is called a .lite binary file.

You now have a means to move the model to the mobile side. You feed this TensorFlow Lite model into the interpreter, which executes the model using a set of operators. It supports selective operator loading: without the operators it is only about 70 KB, and with all the operators it is about 300 KB. You can see how small that is; it is a significant reduction from TensorFlow, which is over 1 MB at this point. You can also implement custom kernels using the API. If the interpreter is running on a CPU, the model is executed directly on the CPU; otherwise, if hardware acceleration is available, it can be executed on the accelerated hardware.

Components of TensorFlow Lite

The main components of TensorFlow Lite are the model file format, the interpreter for processing the graph, a set of kernels the interpreter can invoke, and lastly an interface to the hardware acceleration layer.

1. Model File Format

TensorFlow Lite has a special model file format that is lightweight and has very few dependencies. Most graph calculations are done using 32-bit floats, but neural networks are trained to be robust to noise, and this allows us to explore lower-precision numerics. The advantages of lower precision are lower memory use and faster computation, which is vital for mobile and embedded devices, though using lower precision can result in some loss of accuracy. Depending on the application you want to develop, you can compensate by accounting for quantization loss during training, so you can get better accuracy. Quantization is supported as a first-class citizen in TensorFlow Lite. TensorFlow Lite also uses a FlatBuffer-based system for speed of execution.


FlatBuffers is an open-source Google project comparable to protocol buffers, but much faster to use and much more memory-efficient. In the past, when we developed applications, we always thought about optimizing for CPU instructions, but CPUs have now moved far ahead, and writing something efficient for memory is more important today. FlatBuffers is a cross-platform serialization library, similar to protobufs, but designed to be more efficient: you can access the data without unpacking, and there is no need for a secondary representation before you access it. It is aimed at speed and efficiency, and it is strongly typed, so you can find errors at compile time.


2. Interpreter

The interpreter is engineered to work with low overhead on very small devices. TensorFlow Lite has very few dependencies and is easy to build on simple devices. It keeps the binary size to about 70 KB, or 300 KB with operators.

Because it uses FlatBuffers, it can load really fast, but the speed comes at the cost of flexibility: TensorFlow Lite supports only a subset of the operators that TensorFlow has. So if you are building a mobile application and the operators you need are supported by TensorFlow Lite, the recommendation is to use TensorFlow Lite; if your application is not supported by TensorFlow Lite yet, you should use TensorFlow Mobile. Going forward, though, TensorFlow Lite is going to be the main standard.


3. Ops/Kernels

It has support for the operators used in common inference models, but the set of operators is smaller, so not every model will be supported. In particular, TensorFlow Lite provides a set of core built-in ops that have been optimized for ARM CPUs using NEON, and they work in both float and quantized form. These have been used by Google apps, so they have been battle-tested, and Google has hand-optimized many common patterns and fused many operations to reduce memory bandwidth. If there are ops that are unsupported, it also provides a C API, so you can write your own custom operators.

4. Interface to Hardware Acceleration

It targets custom hardware via the Neural Networks API. TensorFlow Lite comes pre-loaded with hooks for the Neural Networks API: if you have an Android release that supports the NN API, TensorFlow Lite will delegate these operators to the NN API, and if you have an Android release that does not support the NN API, everything is executed directly on the CPU.

Android Neural Network API

The Android Neural Networks API is supported on Android starting with the 8.1 release of Oreo. It supports the various hardware acceleration you can get from vendors, for GPUs, DSPs, and CPUs, and it uses TensorFlow as a core technology. So, for now, you can keep using TensorFlow to write your mobile app, and your app will get the benefits of hardware acceleration through the NN API. It basically abstracts the hardware layer for ML inference; for example, if a device has an ML DSP, it can transparently map to it, and it uses NN primitives that are very similar to TensorFlow Lite's.

android neural network architecture

The architecture for the Neural Networks API looks like this. Essentially, there is an Android app on top; typically there is no need for the Android app to access the Neural Networks API directly. It accesses it through the machine-learning interface, which is the TensorFlow Lite interpreter and the NN runtime. The neural network runtime talks to the hardware abstraction layer, which in turn talks to the device and runs the various accelerators.


Related Post

Image Classify Using TensorFlow Lite

Introduction TensorFlow Machine Learning Library

Install TensorFlow

Train Image classifier with TensorFlow

Train your Object Detection model locally with TensorFlow

Android TensorFlow Machine Learning


How to use DateTime datatype in SQLite Using Room

One of the most interesting and confusing data types that SQLite does not support is Date and Time. I see more questions in online public discussion forums about this type than any other. In this article, I shed light on some very confusing issues around select queries that use dates.

Date and Time Datatype in SQLite

SQLite does not have a storage class set aside for storing dates and/or times. Instead, the built-in Date And Time Functions of SQLite are capable of storing dates and times as TEXT, REAL, or INTEGER values:

  • TEXT as ISO8601 strings (“YYYY-MM-DD HH:MM:SS.SSS”).
  • REAL as Julian day numbers, the number of days since noon in Greenwich on November 24, 4714 B.C. according to the proleptic Gregorian calendar.
  • INTEGER as Unix Time, the number of seconds since 1970-01-01 00:00:00 UTC.

Applications can choose to store dates and times in any of these formats and freely convert between formats using the built-in date and time functions.

Using type converters

Sometimes, your app needs to use a custom data type, like DateTime, whose value you would like to store in a single database column. To add this kind of support for custom types, you provide a TypeConverter, which converts a custom class to and from a known type that Room can persist.

For example, if we want to persist instances of Date, we can write the following TypeConverter to store the equivalent Text in the database:
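The original converter is written in Java; the same two-function idea, sketched in Python for illustration (the format string matches the ISO-8601 TEXT form mentioned above):

```python
from datetime import datetime

# Two converter functions, mirroring a Room @TypeConverter pair:
# one direction per function (datetime <-> TEXT column value).
FORMAT = "%Y-%m-%d %H:%M:%S"

def date_to_text(d):
    """Convert a datetime into the TEXT form stored in the column."""
    return d.strftime(FORMAT)

def text_to_date(s):
    """Convert the stored TEXT back into a datetime."""
    return datetime.strptime(s, FORMAT)

stored = date_to_text(datetime(2018, 1, 15, 9, 30, 0))
restored = text_to_date(stored)
```

In Room, the equivalent would be two static methods annotated with @TypeConverter, one converting Date to String and one converting String back to Date.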

The preceding example defines two functions, one that converts a Date object to a String and another that performs the inverse conversion, from String to Date. Since Room already knows how to persist String objects, it can use this converter to persist values of type Date.

Next, you add the @TypeConverters annotation to your database class so that Room can use the converter you've defined for the entities and DAOs in that database.

Note: You can also limit the @TypeConverters to different scopes, including individual entities, DAOs, and DAO methods.

1. SQLite Query to select data between two dates

I have a start_date and an end_date, and I want to get the rows whose dates fall in between these two. Put the two dates between single quotes, like this:
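A sketch of such a query using Python's sqlite3 module (table and column names are hypothetical):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE task (name TEXT, start_date TEXT)")
db.executemany("INSERT INTO task VALUES (?, ?)", [
    ("a", "2018-01-05"),
    ("b", "2018-02-10"),
    ("c", "2018-03-20"),
])

# Dates stored as ISO-8601 TEXT compare correctly as strings,
# so BETWEEN works with the two dates in single quotes.
rows = db.execute(
    "SELECT name FROM task "
    "WHERE start_date BETWEEN '2018-01-01' AND '2018-02-28' "
    "ORDER BY start_date"
).fetchall()
```

This works because ISO-8601 strings sort chronologically; with any other text format, the string comparison would not match the date order.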

2. SQLite Query to compare date

3. SQLite Query to group by Year

4. SQLite Select data for a specific year

5. SQLite Query to get last month's data

6. SQLite Query to Order by Date

7. SQLite Query to calculate age from birth date
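The individual snippets are omitted above; the core SQL behind several of these queries can be sketched with Python's sqlite3 module (the table, columns, and fixed reference date are hypothetical, chosen to keep the example deterministic):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE person (name TEXT, birth_date TEXT)")
db.executemany("INSERT INTO person VALUES (?, ?)", [
    ("a", "1990-06-15"),
    ("b", "1990-11-02"),
    ("c", "2000-01-20"),
])

# Group by year: strftime('%Y', ...) extracts the year from the TEXT date.
per_year = db.execute(
    "SELECT strftime('%Y', birth_date) AS y, COUNT(*) "
    "FROM person GROUP BY y ORDER BY y"
).fetchall()

# Order by date: ISO-8601 strings sort chronologically.
ordered = [r[0] for r in db.execute(
    "SELECT name FROM person ORDER BY birth_date DESC").fetchall()]

# Age from birth date, relative to a fixed 'today' for reproducibility;
# in a real query you would pass julianday('now') instead.
ages = db.execute(
    "SELECT name, CAST((julianday('2020-06-16') - julianday(birth_date)) "
    "/ 365.25 AS INTEGER) FROM person ORDER BY name"
).fetchall()
```

The same strftime and julianday expressions can be embedded directly in a Room @Query string.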


Download this project from GitHub.


Related Post

Room Persistence Library

Room: Database Relationships

Room Database Migration


ConstraintLayout 1.1.0: Circular Positioning

Android just published ConstraintLayout 1.1.0 beta 3 on the Google maven repository. One of the more interesting additions in this release is circular positioning, which allows you to constrain a widget's center relative to another widget's center, at an angle and a distance. This allows you to position a widget on a circle.

ConstraintLayout Circular constraints

Add ConstraintLayout to your project

To use ConstraintLayout in your project, proceed as follows:

1. Ensure you have the repository declared in your project-level build.gradle file:

2. Add the library as a dependency in your module-level build.gradle file:
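A sketch of both Gradle steps, assuming the beta version named above:

```groovy
// Project-level build.gradle: Google's maven repository.
allprojects {
    repositories {
        maven { url 'https://maven.google.com' }
    }
}
```

```groovy
// Module-level build.gradle: the ConstraintLayout beta dependency.
dependencies {
    implementation 'com.android.support.constraint:constraint-layout:1.1.0-beta3'
}
```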

Example Circular positioning

The following attributes can be used:

  • layout_constraintCircle : references another widget's id.
  • layout_constraintCircleRadius : the distance to the other widget's center.
  • layout_constraintCircleAngle : the angle the widget should be at (in degrees, from 0 to 360).
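A minimal sketch of the attributes in use (ids and values are hypothetical): the moon view is constrained 120dp from the center of the earth view, at a 45-degree angle.

```xml
<ImageView
    android:id="@+id/moon"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    app:layout_constraintCircle="@+id/earth"
    app:layout_constraintCircleRadius="120dp"
    app:layout_constraintCircleAngle="45" />
```

Animating layout_constraintCircleAngle over time is an easy way to move the widget along the circle.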


Related Post

New features in Constraint Layout 1.1.0


Autosizing TextViews Using Support Library 26.0

Material Design recommends using dynamic type text instead of smaller type sizes or truncating large text. Android makes this much easier to implement with the introduction of TextView auto-sizing. With Android O and Support Library 26.0, TextView gains a new property, autoSizeTextType, which allows you to optimize the text size when working with dynamic content.

Autosizing TextViews

Adding support library dependency

The Support Library 26.0 provides full support for the auto-sizing TextView feature on devices running Android versions prior to Android 8.0 (API level 26). The package contains the TextViewCompat class to access features in a backward-compatible fashion.

Support Library 26 has now been moved to Google’s maven repository, first include that in your project level build.gradle.

Add the support library in your app level build.gradle.

Enable Autosizing

To enable auto-sizing in XML, set autoSizeTextType to uniform. This scales the text uniformly on the horizontal and vertical axes, ignoring the text size attribute. When using the support library, make sure you use the app namespace. Note that you shouldn't use wrap_content for the layout width or height of a TextView set to auto-size, since it may produce unexpected results; instead, use match_parent or a fixed size.
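A minimal sketch (the fixed height is illustrative; wrap_content would defeat the auto-size calculation):

```xml
<TextView
    android:layout_width="match_parent"
    android:layout_height="200dp"
    app:autoSizeTextType="uniform" />
```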

Turn off auto-sizing by selecting none instead of uniform. You can also set up auto-sizing programmatically:

Provide an instance of the TextView widget and one of the text types, such as TextViewCompat.AUTO_SIZE_TEXT_TYPE_NONE or TextViewCompat.AUTO_SIZE_TEXT_TYPE_UNIFORM.

Customize TextView

If you want to customize your TextView further, it has some extra attributes: auto-size minimum and maximum text size and step granularity. The TextView will scale uniformly in the range between the minimum and maximum size, in increments of the step granularity. If you don't set these properties, default values are used.

To define a range of text sizes and a dimension in XML, use the app namespace and set the following attributes:
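A sketch of the three range attributes (the values are illustrative):

```xml
<TextView
    android:layout_width="match_parent"
    android:layout_height="200dp"
    app:autoSizeTextType="uniform"
    app:autoSizeMinTextSize="12sp"
    app:autoSizeMaxTextSize="100sp"
    app:autoSizeStepGranularity="2sp" />
```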


To define a range of text sizes and a dimension programmatically, call the setAutoSizeTextTypeUniformWithConfiguration(int autoSizeMinTextSize, int autoSizeMaxTextSize, int autoSizeStepGranularity, int unit) method. Provide the maximum value, the minimum value, the granularity value, and any TypedValue dimension unit.

Preset Sizes

To have more control over the final size, for example if your app needs to comply with specific text-size design guidelines, you can provide a list of sizes, and it will use the largest one that fits.

Create an array with the sizes in your resources, and then set the autoSizePresetSizes attribute in the XML.
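A sketch of both pieces (the array name and sizes are hypothetical):

```xml
<!-- res/values/arrays.xml -->
<resources>
    <array name="autosize_text_sizes">
        <item>10sp</item>
        <item>12sp</item>
        <item>20sp</item>
        <item>40sp</item>
        <item>100sp</item>
    </array>
</resources>
```

```xml
<!-- In the layout: the largest listed size that fits is used. -->
<TextView
    android:layout_width="match_parent"
    android:layout_height="200dp"
    app:autoSizeTextType="uniform"
    app:autoSizePresetSizes="@array/autosize_text_sizes" />
```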

To use preset sizes to set up auto-sizing of a TextView programmatically through the support library, call the TextViewCompat.setAutoSizeTextTypeUniformWithPresetSizes(TextView textView, int[] presetSizes, int unit) method. Provide an instance of the TextView class, an array of sizes, and any TypedValue dimension unit for the size.



How to Create Instant app from Existing App

Android Instant Apps allows Android users to run your apps instantly, without installation. Users can get to your flagship Android experience from any URL, including search, social media, messaging, and other deep links, without needing to install your app first. Android Instant Apps supports the latest Android devices from Android 6.0 through Android O, and Google will be rolling it out to more devices soon, including expanding support to Android 5.0 (API level 21) devices shortly.

How does Instant app work?

When Google Play receives a request for a URL that matches an instant app, it sends the necessary code files to the Android device that sent the request. The device then runs the app.

How does Instant app work

Structure of the Instant App

  • Base feature module: The fundamental module of your instant app is the base feature module. All other feature modules must depend on the base feature module. The base feature module contains shared resources, such as activities, fragments, and layout files. When built into an instant app, this module builds a feature APK. When built into an installed app, the base feature module produces an AAR file.
  • Features: At a very basic level, apps have at least one feature or thing that they do: find a location on a map, send an email, or read the daily news as examples. Many apps provide multiple features.
  • Feature modules: To provide this on-demand downloading of features, you need to break up your app into smaller modules and refactor them into feature modules.
  • Feature APKs: Each feature APK is built from a feature module in your project and can be downloaded on demand by the user and launched as an instant app.


Each feature within the instant app should have at least one Activity that acts as the entry-point for that feature. An entry-point activity hosts the UI for the feature and defines the overall user flow. When users launch the feature on their device, the entry-point activity is what they see first. A feature can have more than one entry-point activity, but it only needs one.

Structure of Instant App


As you can see in the figure, both "Feature 1" and "Feature 2" depend on the base feature module. In turn, both the instant and installed app modules depend on the feature 1 and feature 2 modules. All three feature modules shown in the figure, base feature, feature 1, and feature 2, have the feature plugin applied to their build configuration files.

Upgrade Your Existing App

Android Instant Apps functionality is an upgrade to your existing Android app, not a new, separate app. It’s the same Android APIs, the same project, and the same source code. Android Studio provides the tools you need to modularize your app so that users load only the portion of the instant app that they need when they need it.

Step 1: Develop a use case for your instant App

Focus on a core user experience that completes a specific action and optimizes a key business metric besides app installs. Then review the user experience guidelines for Android Instant Apps.

Step 2: Set up your development Environment

To develop an instant app, you need the following:

Install Instant App SDK

To build an Android Instant App, we need to install the SDK. Go to Tools > Android > SDK Manager, click on the "SDK Tools" tab, and install the "Instant Apps Development SDK" by checking the box and hitting "Apply".

Install Instant App SDK

Set up your device or emulator

You can develop instant apps on the following devices and emulators:

  • Devices: Nexus 5X, Nexus 6P, Pixel, Pixel XL, Galaxy S7 running Android 6.0 or higher.
  • Emulator: Nexus 5X image running Android 6.0 (API level 23), x86, with Google APIs(You cannot use x86_64 architectures).

Step 3: Moving existing code into a feature module

In this step, we will convert the existing application module into a shareable feature module. We will then create a minimal application module that has a dependency on the newly formed feature. Note that this feature module will be included into the Instant App build targets later. 

Convert the app module into a feature module called app-base

We start by renaming the module from 'app' to 'app-base':

create base module

Change Module type

Next, in the app-base/build.gradle file, we change the module type to a feature module by replacing the com.android.application plugin with com.android.feature, and we also remove the applicationId, because this is no longer an application module:

Specify the base feature in app-base/build.gradle:
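A sketch of the relevant parts of app-base/build.gradle after these changes (the SDK and build-tools versions are illustrative):

```groovy
// Feature plugin instead of the application plugin; no applicationId here.
apply plugin: 'com.android.feature'

android {
    baseFeature true          // this module is the instant app's base feature
    compileSdkVersion 26
    buildToolsVersion "26.0.1"
}
```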

Synchronize gradle files and re-build the project with Build->Rebuild Project.

Step 4: Create an appapk module to build the APK file

Now that we have transformed our source code into a reusable library module, we can create a minimal application module that will create the APK. From File->New Module

Create New Module

Enter the application name "app apk" and leave the suggested module name (appapk).

If your project uses Data Binding, you need to ensure that appapk/build.gradle includes the following in the android { ... } section.

Replace the compile dependencies in appapk/build.gradle:
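A minimal sketch of the replacement, assuming the module names used in this walkthrough:

```groovy
// appapk/build.gradle: the installable APK module simply depends on
// the feature module that now holds the app's code.
apply plugin: 'com.android.application'

dependencies {
    implementation project(':app-base')
}
```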

Switch to the "Project" view and remove unused files:

Instant app remove unused folder

Switch back to the "Android" view and remove the application element from appapk/src/main/AndroidManifest.xml. It should only contain the single manifest element.

Finally sync Gradle files, re-build and run the project. The application should behave exactly the same despite all of our changes.

Create Feature module

We have just moved the application’s core functionality into a shareable feature module and we are now ready to start adding in the Instant App modules.

Step 5: Creating the instant app APK

Instant Apps uses feature APKs to break up an app into smaller, feature-focused modules. One way to look at an instant app is as a collection of these feature APKs. When the user launches a URL, Instant Apps will only deliver the feature APKs necessary to provide the functionality for that URL.

The app-base module builds a feature APK that encompasses the full functionality of our app. We will create an Instant App module that bundles our single feature APK. At the end, we are going to have our single-feature instant app!

single feature instant app

The Instant App module is merely a wrapper for all the feature modules in your project. It should not contain any code or resources.

Create an Instant App module

Select File -> New -> New Module
Create Instant App

Next, we need to update the instant app Gradle file to depend on the base feature module. At this point we also need to add buildToolsVersion to explicitly use 26.0.1, since there is an issue in the current Android preview which makes the default 26.0.0. Since we've installed 26.0.1 with the project, we don't want to fetch 26.0.0 unnecessarily.


The instant app module does not hold any code or resources; it contains only a build.gradle file.
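A minimal sketch of that build.gradle, assuming the module names used here:

```groovy
// instantapp/build.gradle: a wrapper around the feature modules.
apply plugin: 'com.android.instantapp'

dependencies {
    // the single feature APK this instant app bundles
    implementation project(':app-base')
}
```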

Now do a clean rebuild: Build -> Rebuild project.

Step 6: Defining App Links

App links are required in instant apps because URLs are the only way a user can launch an instant app (instant apps are not installed). Each entry-point activity in an instant app needs to be addressable: it needs to correspond to a unique URL address. If the URL addresses for the features in an instant app share a domain, each feature needs to correspond to a different path within that domain.

In order to enable the Instant App runtime to call your application, we must establish the relationship between your website and the app. To associate links, we will use a new feature built into Android Studio called the "App Links Assistant". Invoke this tool through the Android Studio "Tools" menu:

App Links Assistant

From the sidebar, tap on the “Open URL Mapping Editor” button:

add url intent filters

From the editor, tap the “+” button to add a new app link entry:

Create a new URL mapping with the following details:


Path: pathPattern/signin

Activity: activity.SigninActivity (app-base)

map url to activity

Repeat this dialog for the https variation, as well as for the other links:

In the end, you should have 4 mappings like this:

Url to Activity mapping

Sync gradle files if required and rebuild the project.

Again, since we have not used the Android Studio wizard, the run configuration for the instant app is not valid; we need to define the URL before we can launch the instant app from the IDE.

Click the Run configuration dropdown and choose “Edit Configurations…”

Select instantapp under Android App.

Edit Configuration

Replace the text ‘<< ERROR – NO URL SET>>’ with

To run your Instant App, select instantapp from the Run configuration dropdown and click Run:

Run Instant APP


Now, you have created and deployed an Android Instant App. You took an existing Android application and restructured it to build both a full installed APK and an Instant APK that will be loaded when the user taps on the associated URLs.


You may need to separate the code into multiple features for various reasons; the most obvious one is feature separation, to let users download only the relevant portions of the app. A next step would be to create another feature module and move all UI code there (activities and related fragments), giving you two features (app-base and app-ui).



EmojiCompat Support Library

Have you seen ☐, the blank square character called "tofu", when an app can't display an emoji? New emojis are constantly being added to the Unicode standards, but since they are bundled as a font, your phone's emojis are set in stone with each OS release. Well, they were.

With the EmojiCompat library (part of the Support Library 26), your app can get backward-compatible emoji support on devices with API level 19+ and get rid of tofu.

How does EmojiCompat work?

EmojiCompat Process

For a given char sequence, EmojiCompat can identify the emojis, replace them with EmojiSpans, and then render the glyphs. On versions prior to API level 19, you'll still get the tofu characters.

EmojiCompat builds on the new font mechanism to make sure you always have the latest emoji available.

Downloadable fonts configuration

The downloadable fonts configuration uses the Downloadable Fonts support library feature to download an emoji font. It also updates the necessary emoji metadata that the EmojiCompat support library needs to keep up with the latest versions of the Unicode specification.

Adding support library dependency

Support Library 26 has now been moved to Google’s maven repository, first include that in your project level build.gradle.

Add the support library in your app level build.gradle.

Adding certificates

When a font provider is not preinstalled or if you are using the support library, you must declare the certificates the font provider is signed with. The system uses the certificates to verify the font provider’s identity.

Create a string array with the certificate details.
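The actual base64 certificate values are provider-specific and omitted here; structurally, the arrays look like this (the names follow the Google Fonts provider convention):

```xml
<!-- res/values/font_certs.xml (sketch) -->
<resources>
    <string-array name="com_google_android_gms_fonts_certs">
        <item>@array/com_google_android_gms_fonts_certs_dev</item>
        <item>@array/com_google_android_gms_fonts_certs_prod</item>
    </string-array>
    <string-array name="com_google_android_gms_fonts_certs_dev">
        <item><!-- base64-encoded dev certificate, omitted --></item>
    </string-array>
    <string-array name="com_google_android_gms_fonts_certs_prod">
        <item><!-- base64-encoded prod certificate, omitted --></item>
    </string-array>
</resources>
```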

Initializing the downloadable font configuration

Before using EmojiCompat, the library needs a one-time asynchronous setup (in your Application class).

For a downloadable font configuration, create your FontRequest and the FontRequestEmojiCompatConfig object.

Depending on the way you're using it, initialization takes at least 150 ms, possibly even up to a few seconds, so you might want to be notified about its state. For this, use the registerInitCallback method.

Use EmojiCompat widgets in layout XMLs. If you are using AppCompat, refer to the Using EmojiCompat widgets with AppCompat section.

You can preprocess a CharSequence using the process method, and then reuse the result instead of the initial string in any widget that can render spanned instances. So, for example, if you're doing your own custom drawing, you can use this to display emoji text.


Download this project from GitHub

Fast Scrolling in RecyclerView Using Support Library 26

In ListView, you could have a fast scroller, which allowed you to drag a scrollbar to easily scroll wherever you wished, using the fastScrollEnabled attribute. With Support Library 26, you can also easily enable fast scrolling for RecyclerView.

Fast Scrolling RecyclerView

 Add Dependency

Support Library 26 has been moved to Google’s Maven repository, so first include that repository in your project-level build.gradle.

Then add Support Library 26 to your app-level build.gradle.
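A sketch of the dependency entry (26.1.0 is an example version; use the latest 26.x release):

```groovy
// App-level build.gradle: RecyclerView from Support Library 26.
dependencies {
    implementation 'com.android.support:recyclerview-v7:26.1.0'
}
```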

Enable Fast Scrolling

If the fastScrollEnabled boolean flag is enabled for a RecyclerView, then fastScrollHorizontalThumbDrawable, fastScrollHorizontalTrackDrawable, fastScrollVerticalThumbDrawable, and fastScrollVerticalTrackDrawable must all be set.

  • fastScrollEnabled : boolean value to enable fast scrolling. Setting this to true requires that you also provide the following four properties.
  • fastScrollHorizontalThumbDrawable : a StateListDrawable used to draw the thumb that can be dragged along the horizontal axis.
  • fastScrollHorizontalTrackDrawable : a StateListDrawable used to draw the line that represents the scrollbar on the horizontal axis.
  • fastScrollVerticalThumbDrawable : a StateListDrawable used to draw the thumb that can be dragged along the vertical axis.
  • fastScrollVerticalTrackDrawable : a StateListDrawable used to draw the line that represents the scrollbar on the vertical axis.


Now, create the StateListDrawables using native shapes.
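A sketch of what such drawables might look like (file names, colors, and corner radius are arbitrary examples):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/drawable/thumb_drawable.xml (sketch): thumb that changes color while pressed -->
<selector xmlns:android="http://schemas.android.com/apk/res/android">
    <item android:state_pressed="true">
        <shape android:shape="rectangle">
            <solid android:color="#f26aa6" />
            <corners android:radius="4dp" />
        </shape>
    </item>
    <item>
        <shape android:shape="rectangle">
            <solid android:color="#cccccc" />
            <corners android:radius="4dp" />
        </shape>
    </item>
</selector>
```

A plain track line (e.g. res/drawable/line_drawable.xml) can be a similar selector with a single light-colored rectangle item.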






Now, create the RecyclerView and enable fastScrollEnabled.
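A layout sketch wiring the attributes to the drawables (thumb_drawable and line_drawable are assumed example names; the app namespace must be declared on the layout’s root element):

```xml
<android.support.v7.widget.RecyclerView
    android:id="@+id/recycler_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:fastScrollEnabled="true"
    app:fastScrollHorizontalThumbDrawable="@drawable/thumb_drawable"
    app:fastScrollHorizontalTrackDrawable="@drawable/line_drawable"
    app:fastScrollVerticalThumbDrawable="@drawable/thumb_drawable"
    app:fastScrollVerticalTrackDrawable="@drawable/line_drawable" />
```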


Download this project from GitHub




Flutter Shared preferences

If you have a relatively small collection of key-values that you’d like to save, you should use the SharedPreferences Plugin. A SharedPreferences object points to a file containing key-value pairs and provides simple methods to read and write them.

The Flutter shared_preferences plugin wraps NSUserDefaults (on iOS) and SharedPreferences (on Android), providing a persistent store for simple data. Data is persisted to disk automatically and asynchronously.

To use this plugin, add shared_preferences as a dependency in your pubspec.yaml file.

1. Add this to your package’s pubspec.yaml file:
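A sketch of the dependency entry (the version shown is an example; check the package site for the latest):

```yaml
dependencies:
  flutter:
    sdk: flutter
  shared_preferences: ^0.4.2  # example version
```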

2. Install the package, either from your editor or by running ‘flutter packages get’ from a terminal.


Shared preferences plugin

3. Import it

Write to Shared Preferences

To write to shared preferences, obtain a SharedPreferences instance by calling SharedPreferences.getInstance().

Pass the keys and values you want to write with methods such as setInt() and setString(). For example:
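A minimal write sketch (the counter and userName keys and their values are illustrative):

```dart
import 'package:shared_preferences/shared_preferences.dart';

// Sketch: write simple key-value pairs; the data is persisted asynchronously.
Future<void> saveCounter(int counter) async {
  final SharedPreferences prefs = await SharedPreferences.getInstance();
  await prefs.setInt('counter', counter);
  await prefs.setString('userName', 'Alice'); // example value
}
```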

Read from Shared Preferences

To retrieve values from a shared preferences file, call methods such as getInt() and getString(), providing the key for the value you want, and optionally a default value to return if the key isn’t present. For example:
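A matching read sketch, falling back to a default when the key is absent:

```dart
// Sketch: getInt returns null when the key is missing, so fall back to 0.
Future<int> readCounter() async {
  final SharedPreferences prefs = await SharedPreferences.getInstance();
  return prefs.getInt('counter') ?? 0;
}
```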

Related Post

Flutter ListView with Image and Checkbox


Flutter ListView with Image and Checkbox

In this tutorial, you will learn how to implement a basic single-line ListView with an Image and a Checkbox. First, create a new Flutter project in your IntelliJ IDEA.

1. Adding Assets and Images in Flutter

Flutter apps can include both code and assets. An asset is a file that is bundled and deployed with your app and is accessible at runtime.

Specifying assets

Flutter uses the pubspec.yaml file, located at the root of your project, to identify assets required by an app.

The assets subsection of the flutter section specifies files that should be included with the app. Each asset is identified by an explicit path (relative to the pubspec.yaml file) where the asset file is located. The order in which the assets are declared does not matter.
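A sketch of such a declaration (the images/ directory is an assumed location; resolution variants such as 2.0x/ are picked up automatically and don’t need to be listed individually):

```yaml
flutter:
  assets:
    - images/person.png
```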

Add assets image in flutter

The main asset is assumed to correspond to a resolution of 1.0. For example, consider the following asset layout for an image named person.png:

  • …/person.png
  • …/2.0x/person.png
  • …/3.0x/person.png

On devices with a device pixel ratio of 1.8, the asset …/2.0x/person.png would be chosen. For a device pixel ratio of 2.7, the asset …/3.0x/person.png would be chosen.

2. Create Presentation Class

We’ll create a purchase application, which displays various products offered for sale and maintains a purchase list. Let’s start by defining our presentation class, Product:
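A minimal sketch of such a class (the single name field is illustrative):

```dart
// Sketch: an immutable value object describing one product.
class Product {
  const Product({this.name});
  final String name;
}
```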

3. Create List Item

ListTiles are always a fixed height (the height depends on how isThreeLine, dense, and subtitle are configured); they do not grow in height based on their contents. If you are looking for a widget that allows for arbitrary layout in a row, consider Row.

The ShoppingListItem widget follows a common pattern for stateless widgets. It stores the values it receives in its constructor in final member variables, which it then uses during its build function.
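A sketch of what such a widget might look like (the image asset path and the toggle-only callback signature are simplifications for illustration):

```dart
import 'package:flutter/material.dart';

// Sketch: a list row showing the product image, name, and a checkbox.
class ShoppingListItem extends StatelessWidget {
  ShoppingListItem({this.product, this.inCart, this.onCartChanged})
      : super(key: ObjectKey(product));

  final Product product;
  final bool inCart;
  final ValueChanged<Product> onCartChanged;

  @override
  Widget build(BuildContext context) {
    return ListTile(
      onTap: () => onCartChanged(product),
      leading: Image.asset('images/person.png'), // assumed asset from the earlier step
      title: Text(product.name),
      trailing: Checkbox(
        value: inCart,
        onChanged: (bool value) => onCartChanged(product),
      ),
    );
  }
}
```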

4. Create ListView

Although the parent creates a new instance of ShoppingListItem when it rebuilds, that operation is cheap because the framework compares the newly built widgets with the previously built widgets and applies only the differences to the underlying render objects.

Let’s look at an example parent widget that stores mutable state:
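A sketch of such a parent widget (the toggle logic in _handleCartChanged is a simplified example):

```dart
// Sketch: the stateful parent that owns the set of products in the cart.
class ShoppingList extends StatefulWidget {
  ShoppingList({Key key, this.products}) : super(key: key);
  final List<Product> products;

  @override
  _ShoppingListState createState() => _ShoppingListState();
}

class _ShoppingListState extends State<ShoppingList> {
  final Set<Product> _shoppingCart = Set<Product>();

  void _handleCartChanged(Product product) {
    setState(() {
      // Toggle membership; setState triggers a rebuild with the new state.
      if (_shoppingCart.contains(product)) {
        _shoppingCart.remove(product);
      } else {
        _shoppingCart.add(product);
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    return ListView(
      children: widget.products.map((Product product) {
        return ShoppingListItem(
          product: product,
          inCart: _shoppingCart.contains(product),
          onCartChanged: _handleCartChanged,
        );
      }).toList(),
    );
  }
}
```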

The ShoppingList class extends StatefulWidget, which means this widget stores mutable state. When the ShoppingList widget is first inserted into the tree, the framework calls the createState function to create a fresh instance of _ShoppingListState to associate with that location in the tree.

To access properties of the current ShoppingList, the _ShoppingListState can use its widget property. If the parent rebuilds and creates a new ShoppingList, the _ShoppingListState will also rebuild with the new widget value. If you wish to be notified when the widget property changes, you can override the didUpdateWidget function, which is passed oldWidget to let you compare the old widget with the current widget.

Flutter ListView With Checkbox

Download this project from GitHub