Flutter Shared preferences

If you have a relatively small collection of key-value pairs that you'd like to save, you should use the shared_preferences plugin. A SharedPreferences object points to a file containing key-value pairs and provides simple methods to read and write them.

The Flutter shared_preferences plugin wraps NSUserDefaults (on iOS) and SharedPreferences (on Android), providing a persistent store for simple data. Data is persisted to disk automatically and asynchronously.

To use this plugin, add shared_preferences as a dependency in your pubspec.yaml file.

1. Add this to your package's pubspec.yaml file:
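
A minimal sketch of the dependency entry; the version shown here is only an example, so check the plugin page for the latest release:

    dependencies:
      flutter:
        sdk: flutter
      shared_preferences: "^0.4.2"  # example version; use the latest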

2. Install the package by running flutter packages get from the command line, or use the 'Packages get' action in your editor.


3. Import it in your Dart code:
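
The import looks like this:

    import 'package:shared_preferences/shared_preferences.dart';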

Write to Shared Preferences


To write to a shared preferences file, obtain a SharedPreferences instance by calling SharedPreferences.getInstance().

Pass the keys and values you want to write to methods such as setInt() and setString(). For example:
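
A minimal sketch; the key names 'counter' and 'userName' are just examples:

    Future<void> saveSettings() async {
      // Obtain the shared preferences instance.
      final prefs = await SharedPreferences.getInstance();
      await prefs.setInt('counter', 10);          // write an int
      await prefs.setString('userName', 'John');  // write a string
    }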

Read from Shared Preferences


To retrieve values from a shared preferences file, call methods such as getInt() and getString(), providing the key for the value you want, and optionally a default value to return if the key isn’t present. For example:
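
A matching sketch that reads the example keys back, falling back to defaults when a key isn't present:

    Future<void> loadSettings() async {
      final prefs = await SharedPreferences.getInstance();
      final int counter = prefs.getInt('counter') ?? 0;
      final String userName = prefs.getString('userName') ?? 'Guest';
      print('$userName has counter $counter');
    }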


Flutter ListView with Image and Checkbox

In this post, you will learn how to implement a basic single-line ListView with an image and a checkbox. First, create a new Flutter project in IntelliJ IDEA.

1. Adding Assets and Images in Flutter


Flutter apps can include both code and assets. An asset is a file that is bundled and deployed with your app and is accessible at runtime.

Specifying assets

Flutter uses the pubspec.yaml file, located at the root of your project, to identify assets required by an app.

The assets subsection of the flutter section specifies files that should be included with the app. Each asset is identified by an explicit path (relative to the pubspec.yaml file) where the asset file is located. The order in which the assets are declared does not matter.
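
For example, an assets entry might look like this (the assets/person.png path is only an illustration; resolution variants in 2.0x/ and 3.0x/ subfolders are picked up automatically):

    flutter:
      assets:
        - assets/person.png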


The main asset is assumed to correspond to a resolution of 1.0. For example, consider the following asset layout for an image named person.png:

  • …/person.png
  • …/2.0x/person.png
  • …/3.0x/person.png

On devices with a device pixel ratio of 1.8, the asset …/2.0x/person.png would be chosen. For a device pixel ratio of 2.7, the asset …/3.0x/person.png would be chosen.

2. Create Presentation Class


We'll create a purchase application that displays various products offered for sale and maintains a purchase list. Let's start by defining our presentation class, Product:
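
A minimal sketch of such a class, assuming only a name field:

    class Product {
      const Product({required this.name});

      final String name;
    }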

3. Create List Item


ListTiles are always a fixed height (the exact height depends on how isThreeLine, dense, and subtitle are configured); they do not grow in height based on their contents. If you are looking for a widget that allows for arbitrary layout in a row, consider Row.

The ShoppingListItem widget follows a common pattern for stateless widgets. It stores the values it receives in its constructor in member variables, which it then uses during its build function.
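
A sketch of what the item might look like, assuming the Product class above, an assets/person.png image declared in pubspec.yaml, a CartChangedCallback typedef of our own, and package:flutter/material.dart imported:

    typedef CartChangedCallback = void Function(Product product, bool inCart);

    class ShoppingListItem extends StatelessWidget {
      ShoppingListItem({
        required this.product,
        required this.inCart,
        required this.onCartChanged,
      }) : super(key: ObjectKey(product));

      final Product product;
      final bool inCart;
      final CartChangedCallback onCartChanged;

      @override
      Widget build(BuildContext context) {
        return ListTile(
          onTap: () => onCartChanged(product, inCart),
          // The image for this row, loaded from the app's bundled assets.
          leading: Image.asset('assets/person.png', width: 40.0),
          title: Text(product.name),
          // The checkbox mirrors whether this product is in the cart.
          trailing: Checkbox(
            value: inCart,
            onChanged: (bool? checked) => onCartChanged(product, inCart),
          ),
        );
      }
    }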

4. Create ListView


Although the parent creates a new instance of ShoppingListItem every time it rebuilds, that operation is cheap because the framework compares the newly built widgets with the previously built widgets and applies only the differences to the underlying render objects.

Let’s look at an example parent widget that stores mutable state:
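
A sketch of such a parent, again assuming the Product and ShoppingListItem classes above; the products list passed in and the Set used as a cart are illustrative choices:

    class ShoppingList extends StatefulWidget {
      const ShoppingList({super.key, required this.products});

      final List<Product> products;

      @override
      State<ShoppingList> createState() => _ShoppingListState();
    }

    class _ShoppingListState extends State<ShoppingList> {
      final Set<Product> _shoppingCart = <Product>{};

      void _handleCartChanged(Product product, bool inCart) {
        setState(() {
          // Mutating the cart inside setState triggers a rebuild of the list.
          if (inCart) {
            _shoppingCart.remove(product);
          } else {
            _shoppingCart.add(product);
          }
        });
      }

      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(title: const Text('Shopping List')),
          body: ListView(
            padding: const EdgeInsets.symmetric(vertical: 8.0),
            children: widget.products.map((Product product) {
              return ShoppingListItem(
                product: product,
                inCart: _shoppingCart.contains(product),
                onCartChanged: _handleCartChanged,
              );
            }).toList(),
          ),
        );
      }
    }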

The ShoppingList class extends StatefulWidget, which means this widget stores mutable state. When the ShoppingList widget is first inserted into the tree, the framework calls the createState function to create a fresh instance of _ShoppingListState to associate with that location in the tree.

To access properties of the current ShoppingList, the _ShoppingListState can use its widget property. If the parent rebuilds and creates a new ShoppingList, the _ShoppingListState will also rebuild with the new widget value. If you wish to be notified when the widget property changes, you can override the didUpdateWidget function, which is passed oldWidget to let you compare the old widget with the current widget.

Flutter ListView With Checkbox

Download this project from GitHub

 

Building Your First Flutter App

In this post, you'll learn how to create a Flutter project with IntelliJ IDEA and run a debuggable version of the app. You'll also learn some fundamentals of Flutter app design, including how to build a simple user interface and handle user input.

Before you start this, install Flutter.

1. Create a Flutter Project


  1. Choose File > New Project.
  2. Select Flutter application as the project type.
  3. Enter a project name.
  4. Click Finish.

2. Create Widget


Your entire app is going to be made out of widgets. Your app is one huge widget, which has sub-widgets, which have sub-widgets, all the way down. Widgets are just like components.

Create Stateless widget

A stateless widget is a widget that describes part of the user interface. The stateless widget is useful when the part of the user interface you are describing does not depend on anything other than the configuration information in the object itself.

Create Scaffold widget

Scaffold implements the basic Material Design visual layout structure. This class provides APIs for showing drawers, snack bars, and bottom sheets.
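
A minimal sketch of a stateless widget whose build method returns a Scaffold; the HelloPage name and message field are just illustrative:

    import 'package:flutter/material.dart';

    // A stateless widget: its configuration is fully described by its final fields.
    class HelloPage extends StatelessWidget {
      const HelloPage({super.key, required this.message});

      final String message;

      @override
      Widget build(BuildContext context) {
        // Scaffold provides the basic Material Design layout structure.
        return Scaffold(
          appBar: AppBar(title: const Text('Hello World')),
          body: Center(child: Text(message)),
        );
      }
    }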

AppBar

An app bar consists of a toolbar and potentially other widgets, such as a TabBar and a FlexibleSpaceBar. App bars are typically used in the Scaffold.appBar property, which places the app bar as a fixed-height widget at the top of the screen.

The AppBar widget lets us pass in widgets for the leading slot, the title, and the actions that follow the title.

Using Material Design

A Material Design app starts with the MaterialApp widget, which builds a number of useful widgets at the root of your app, including a Navigator, which manages a stack of widgets identified by strings, also known as “routes”. The Navigator lets you transition smoothly between screens of your application. Using the MaterialApp widget is entirely optional but a good practice.
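
Putting the pieces together, a sketch of a complete main.dart along these lines; the icons and empty onPressed handlers are placeholders:

    import 'package:flutter/material.dart';

    void main() {
      runApp(
        MaterialApp(
          title: 'Flutter Hello World',
          home: Scaffold(
            appBar: AppBar(
              // AppBar accepts widgets for its leading slot, title, and actions.
              leading: IconButton(
                icon: const Icon(Icons.menu),
                tooltip: 'Navigation menu',
                onPressed: () {},
              ),
              title: const Text('Hello World'),
              actions: <Widget>[
                IconButton(
                  icon: const Icon(Icons.search),
                  tooltip: 'Search',
                  onPressed: () {},
                ),
              ],
            ),
            body: const Center(
              child: Text('Hello, world!'),
            ),
            floatingActionButton: FloatingActionButton(
              tooltip: 'Add',
              onPressed: () {},
              child: const Icon(Icons.add),
            ),
          ),
        ),
      );
    }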


Now that we're using the AppBar and Scaffold widgets from material.dart, our app is starting to look a bit more like Material Design. For example, the app bar has a shadow and the title text inherits the correct styling automatically. We've also added a floating action button for good measure.

 

 

Firebase JobDispatcher

If you have some heavy work to do and it doesn't need to be done right now, then I'm going to suggest you use Firebase JobDispatcher. It is guaranteed to get your job done, and it operates at the system level.

Because it operates at the system level, it can use several factors to schedule your background work alongside jobs from other apps, so it can minimize things like radio use. It doesn't perform work based on time but based on conditions. For example, you can use setRequiredNetworkType for jobs that you want to execute when the user is on a metered vs. an unmetered connection, or you can call setLifetime for jobs that you want to persist across a potential reboot.

Installation


Add the following to your app build.gradle's dependencies (check for the latest version):
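
A sketch of the dependency line; the version shown is an example only:

    dependencies {
        // Firebase JobDispatcher (check for the latest release)
        implementation 'com.firebase:firebase-jobdispatcher:0.8.5'
    }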

Create Job

You can define conditions when you are creating the job through the Job object.
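
A sketch of building a job, assuming a MyJobService class (defined in the next step), an Android Context in scope, and illustrative tag, trigger window, and constraint values:

    // Uses classes from the com.firebase.jobdispatcher package.
    FirebaseJobDispatcher dispatcher =
            new FirebaseJobDispatcher(new GooglePlayDriver(context));

    Job job = dispatcher.newJobBuilder()
            .setService(MyJobService.class)                  // service that does the work
            .setTag("my-unique-tag")                         // uniquely identifies the job
            .setRecurring(false)                             // one-off job
            .setLifetime(Lifetime.UNTIL_NEXT_BOOT)           // don't persist past a reboot
            .setTrigger(Trigger.executionWindow(0, 60))      // run within the next 60 seconds
            .setReplaceCurrent(false)                        // keep any existing job with this tag
            .setRetryStrategy(RetryStrategy.DEFAULT_EXPONENTIAL)
            .setConstraints(Constraint.ON_UNMETERED_NETWORK) // only on an unmetered network
            .build();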


Create a new JobService

Your job service is actually going to be a service that extends the JobService class, and this is where you'll define the work that you'll be doing. You will need to implement two methods: onStartJob and onStopJob.

onStartJob is called by the system when it is time for your job to execute. Your JobService runs on the main thread, so use onStartJob either to perform simple work or to kick off the real work on another thread. If you do kick off something else, you'll need to return true; if you're done with everything, go ahead and return false. This lets the system know whether your job still has ongoing work.

onStopJob is called by the system if your job is canceled before being finished because the conditions are no longer being met, for example because the device has been unplugged. So use this for safety checks and cleanup, then return true if you'd like the system to reschedule the job, or false if it doesn't matter and the job should be dropped.

jobFinished takes two parameters: the current job, so the dispatcher knows which one you are talking about, and a boolean indicating whether you'd like to reschedule the job, perhaps because your work failed for some reason. This will kick off the dispatcher's exponential backoff logic for you, or else the retry logic you specified in the Job.
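
A sketch of such a service; the MyJobService name and doSomeHeavyWork method are illustrative:

    import com.firebase.jobdispatcher.JobParameters;
    import com.firebase.jobdispatcher.JobService;

    public class MyJobService extends JobService {

        @Override
        public boolean onStartJob(JobParameters job) {
            // onStartJob runs on the main thread: hand the real work to a background thread.
            new Thread(() -> {
                doSomeHeavyWork();
                // Work is done; 'false' means no reschedule is needed.
                jobFinished(job, false);
            }).start();
            // Return true because work is still going on in the background.
            return true;
        }

        @Override
        public boolean onStopJob(JobParameters job) {
            // Conditions are no longer met; return true to ask for a reschedule.
            return true;
        }

        private void doSomeHeavyWork() {
            // ... the heavy work you want to run in the background ...
        }
    }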

Now, as with any service, you'll need to add this one to your AndroidManifest.xml. What's different, though, is that you declare it with the dispatcher's intent filter and keep it non-exported, so that the job dispatcher is the only one that can access your JobService.
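
A sketch of the manifest entry, again assuming the MyJobService class above:

    <service
        android:name=".MyJobService"
        android:exported="false">
        <intent-filter>
            <!-- Lets the job dispatcher invoke this service. -->
            <action android:name="com.firebase.jobdispatcher.ACTION_EXECUTE"/>
        </intent-filter>
    </service>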

Finally, you can schedule the job using the FirebaseJobDispatcher: call schedule with that super perfect Job object you created, and you are good to go.
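
Continuing the earlier sketch:

    dispatcher.schedule(job);  // dispatcher and job from the snippets above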

Conclusion

There are a lot of pieces, to be sure, and you'll need to think carefully about when and what should trigger your job and what happens if it fails for some reason. But overall, FirebaseJobDispatcher was designed to be easy to work with. So give it a try and go build better apps.

Firebase Performance Monitoring for Android

Your users are on a wide variety of devices, on a wide variety of networks, in a wide variety of locations all over the world. If you want to optimize the performance of your app, you need metrics that tell you exactly what's happening during the critical moments of your app's use, and you need that information to come directly from users.

Now you can get it using Firebase Performance Monitoring. By integrating the SDK into your app, and without writing any code, the performance dashboard in the Firebase console will collect information about your app's performance as seen by your users. You'll get data about your app's startup time and details about its HTTP transactions. Using the provided API, you can instrument your app to measure those critical moments that you want to understand and improve. Then, in the dashboard, you can break down the data by country, version, device, OS, radio, carrier, and MIME type.

Install Firebase SDK in APP


This guide shows you how to use Firebase Performance Monitoring with your app.

1 Prerequisites

You need a few things set up in your environment:

  • A device running Android 4.0 or later, with Google Play services 11.0.4 or later
  • The Google Play services SDK from the Google Repository, available in the Android SDK Manager
  • Android Studio 2.2 or later

2 Add Firebase to your Android project


Add Firebase to your app from Android Studio using the Firebase Assistant. To open the Firebase Assistant in Android Studio:

  1. Click Tools > Firebase to open the Assistant window.
  2. Click to expand one of the listed features (for example, Analytics), then click the provided tutorial link (for example, Log an Analytics event).
  3. Click the Connect to Firebase button to connect to Firebase and add the necessary code to your app.

Firebase Performance Monitor

3 Add Performance Monitoring to your app


  • Add the following dependency to your project-level build.gradle (check for the latest version):
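
A sketch, assuming the firebase-plugins Gradle plugin that bundles the Performance Monitoring plugin; the version is an example only:

    buildscript {
        repositories {
            jcenter()
        }
        dependencies {
            // Contains the firebase-perf Gradle plugin (example version)
            classpath 'com.google.firebase:firebase-plugins:1.1.0'
        }
    }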

Add the following to your app-level build.gradle:

  1. Below apply plugin: 'com.android.application', apply the Performance Monitoring plugin, as shown in the sketch after this list.
  2. Add the Performance Monitoring SDK to the dependencies section, also shown below.
  3. If your app uses other Firebase SDKs, you should also change the version to 11.0.4 for those SDKs.
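
A sketch of the relevant parts of the app-level build.gradle; keep the version in step with your other Firebase SDKs:

    apply plugin: 'com.android.application'
    // Below the application plugin, apply the Performance Monitoring plugin.
    apply plugin: 'com.google.firebase.firebase-perf'

    dependencies {
        // Performance Monitoring SDK
        implementation 'com.google.firebase:firebase-perf:11.0.4'
    }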

 

Recompile your app. Automatic traces and HTTP/S network requests are now monitored.

If you just install the SDK, you still get a few out-of-the-box traces, the most important one being app start: the time from when the app code starts loading until the app is responsive to the user. Firebase also has traces that monitor time spent in the foreground and in the background.

App code performance.

To monitor the kinds of delays your users see as they interact with the app, or how many dropped frames they see in animations, Firebase built a feature called traces.

The other category of issues is network activity between your app and the backends it uses; for that, Firebase monitors network requests.

Traces


The easiest way to think of a trace is as a performance report of what happens between two points in your app.

All you need to do is create a trace object, give it a name, start it, and stop it. The name you give it becomes your context.
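
A sketch, assuming an illustrative trace name, a hypothetical loadCatalogFromNetwork method as the work being measured, and an optional 'item' counter:

    import com.google.firebase.perf.FirebasePerformance;
    import com.google.firebase.perf.metrics.Trace;

    public class CatalogLoader {

        public void loadCatalog() {
            // Start a custom trace named "load_catalog".
            Trace myTrace = FirebasePerformance.getInstance().newTrace("load_catalog");
            myTrace.start();

            loadCatalogFromNetwork();          // the work you want to measure
            myTrace.incrementCounter("item");  // optional: tally a custom event on this trace

            myTrace.stop();
        }

        private void loadCatalogFromNetwork() {
            // ... fetch data ...
        }
    }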

Add the @AddTrace annotation to trace specific methods

You can add the @AddTrace annotation to methods in your app and provide a string to identify the resulting trace. This causes a trace to start at the beginning of this method, and to stop when the method completes. Traces created in this way do not have counters available.
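
A sketch of the annotation on an activity's onCreate, with an illustrative trace name:

    import android.os.Bundle;
    import android.support.v7.app.AppCompatActivity;
    import com.google.firebase.perf.metrics.AddTrace;

    public class MainActivity extends AppCompatActivity {

        // Starts a trace named "onCreateTrace" when onCreate begins
        // and stops it when the method completes.
        @Override
        @AddTrace(name = "onCreateTrace", enabled = true)
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
        }
    }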

Trace Metrics

In terms of metrics, by default you get the trace duration, that is, how much time elapsed between the start and the stop. You can also attach custom metrics using the counter API, as in the incrementCounter call in the sketch above.

You can give any of these events a name and increment it whenever it occurs; Firebase will tally up those events and report them as custom metrics attached to your trace.

Network Monitor


An HTTP/S network request is a report that captures the time between when your app issues a request to a service endpoint and when the response from that endpoint is complete.

  • Response time: Time between when the request is made and when the response is fully received
  • Payload size: Byte size of the network payload downloaded and uploaded by the app
  • Success rate: Percentage of successful responses compared to total responses (to measure network or server failures)

 


Train your Object Detection model locally with TensorFlow

In this post, we’re going to train machine learning models capable of localizing and identifying multiple objects in an image. You’ll need to install TensorFlow and you’ll need to understand how to use the command line.

Tensorflow Object Detection API


The TensorFlow Object Detection API is a framework built on top of TensorFlow that makes it easy to construct, train, and deploy object detection models.

This post walks through the steps required to train an object detection model locally.

1. Clone Repository


Clone the tensorflow/models repository; alternatively, you can download it directly as a ZIP file.
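
To clone it with git:

    git clone https://github.com/tensorflow/models.git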

2. Installation


Tensorflow Object Detection API depends on the following libraries.

  • Protobuf 2.6
  • Pillow 1.0
  • Lxml

1. Protobuf compilation: the Tensorflow Object Detection API uses Protobufs to configure model and training parameters. Before the framework can be used, the Protobuf libraries must be compiled. This should be done by running the following command from the tensorflow/models directory:
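
The compile step, assuming the object_detection package sits directly under tensorflow/models as in this checkout:

    # From tensorflow/models/
    protoc object_detection/protos/*.proto --python_out=.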

Add Libraries to PYTHONPATH

When running locally, the tensorflow/models/ and slim directories should be appended to PYTHONPATH. This can be done by running the following from tensorflow/models/:
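
For example:

    # From tensorflow/models/
    export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim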

Note: This command needs to run from every new terminal you start. If you wish to avoid running this manually, you can add it as a new line to the end of your ~/.bashrc file.

Testing the Installation

You can test that you have correctly installed the Tensorflow Object Detection API by running the following command:
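
The test script ships with the repository:

    python object_detection/builders/model_builder_test.py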

If everything is installed correctly, the tests in this script should pass and the command should finish with an OK.


3. Preparing Inputs


The Tensorflow Object Detection API reads data using the TFRecord file format. Two sample scripts (create_pascal_tf_record.py and create_pet_tf_record.py) are provided to convert a dataset to TFRecords.

Directory Structure for Training input data

  • To prepare the input file for the sample scripts you need to consider two things. Firstly, you need an RGB image which is encoded as jpg or png and secondly, you need a list of bounding boxes (xmin, ymin, xmax, ymax) for the image and the class of the object in the bounding box.
  • Here is a subset of the pet image dataset that I collected in the images folder:

 

Afterward, I labeled them manually with LabelImg. LabelImg is a graphical image annotation tool written in Python. It's super easy to use, and the annotations are saved as XML files. Save the image annotation XML files in the annotations/xmls folder.

Image Annotation

Create trainval.txt in the annotations folder, which contains the names of the images without their extensions. Use the following command to generate trainval.txt:
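
One way to generate it, assuming the images are .jpg files in an images folder next to annotations:

    ls images | grep ".jpg" | sed s/.jpg// > annotations/trainval.txt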

Label Maps

Each dataset is required to have a label map associated with it. This label map defines a mapping from string class names to integer class ids. Label maps should always start from id 1. Create a label.pbtxt file with the following label map:
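
A sketch of a single-class label map; the class name 'pet' is only an example, so use your own class name:

    item {
      id: 1
      name: 'pet'
    }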

Generating the Pet TFRecord files.

Run the following commands.
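
Roughly along these lines, adjusting the label map and data paths to your own layout (the flags follow the sample script's interface):

    # From tensorflow/models/
    python object_detection/create_pet_tf_record.py \
        --label_map_path=annotations/label.pbtxt \
        --data_dir=`pwd` \
        --output_dir=`pwd`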

You should end up with two TFRecord files named pet_train.record and pet_val.record in the tensorflow/models directory.

4. Training the model


After creating the required input files for the API, you can now train your model. For training, you need the following command:
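
The training job is started with the train.py script in the object_detection package, for example:

    # From tensorflow/models/
    python object_detection/train.py \
        --logtostderr \
        --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
        --train_dir=${PATH_TO_TRAIN_DIR}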

An object detection training pipeline is defined by a config file, and sample config files are provided in the repo. For my training, I used ssd_mobilenet_v1_pets.config as a basis. I needed to adjust num_classes to one and also set the paths (PATH_TO_BE_CONFIGURED) for the model checkpoint, the train and test data files, and the label map. For other configuration, like the learning rate and batch size, I used the default settings.

5. Running the Evaluation Job


Evaluation is run as a separate job. The eval job will periodically poll the training directory for new checkpoints and evaluate them on a test dataset. The job can be run using the following command:
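
For example, using the eval.py script:

    # From tensorflow/models/
    python object_detection/eval.py \
        --logtostderr \
        --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
        --checkpoint_dir=${PATH_TO_TRAIN_DIR} \
        --eval_dir=${PATH_TO_EVAL_DIR}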

where ${PATH_TO_YOUR_PIPELINE_CONFIG} points to the pipeline config, ${PATH_TO_TRAIN_DIR} points to the directory in which training checkpoints were saved (same as the training job), and ${PATH_TO_EVAL_DIR} points to the directory in which evaluation events will be saved. As with the training job, the eval job runs until terminated by default.

6. Running TensorBoard


Progress for training and eval jobs can be inspected using TensorBoard. If using the recommended directory structure, TensorBoard can be run using the following command:
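
For example:

    tensorboard --logdir=${PATH_TO_MODEL_DIRECTORY}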

where ${PATH_TO_MODEL_DIRECTORY} points to the directory that contains the train and eval directories. Please note it may take TensorBoard a couple of minutes to populate with data.

7. Exporting the Tensorflow Graph


After your model has been trained, you should export it to a Tensorflow graph proto. First, you need to identify a candidate checkpoint to export. The checkpoint will typically consist of three files in the pet folder:

  1.  model.ckpt-${CHECKPOINT_NUMBER}.data-00000-of-00001
  2. model.ckpt-${CHECKPOINT_NUMBER}.index
  3. model.ckpt-${CHECKPOINT_NUMBER}.meta

Run the following command to export the Tensorflow graph, changing the checkpoint number to the one you identified:
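
A sketch using export_inference_graph.py; the exact flags can vary slightly between versions of the repository, and the output directory name here is only an example:

    # From tensorflow/models/
    python object_detection/export_inference_graph.py \
        --input_type image_tensor \
        --pipeline_config_path ${PATH_TO_YOUR_PIPELINE_CONFIG} \
        --trained_checkpoint_prefix model.ckpt-${CHECKPOINT_NUMBER} \
        --output_directory exported_graphs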
