Flutter Shared Preferences

If you have a relatively small collection of key-value pairs that you’d like to save, you should use the shared_preferences plugin. A SharedPreferences object points to a file containing key-value pairs and provides simple methods to read and write them.

The Flutter shared_preferences plugin wraps NSUserDefaults (on iOS) and SharedPreferences (on Android), providing a persistent store for simple data. Data is persisted to disk automatically and asynchronously.

To use this plugin, add shared_preferences as a dependency in your pubspec.yaml file.

1.Add this to your package’s pubspec.yaml file:
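For example, the dependencies section of pubspec.yaml might look like the following (the version number here is illustrative; check the package page for the latest release):

```yaml
dependencies:
  flutter:
    sdk: flutter
  shared_preferences: "^0.4.2"
```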

2. Install the package from the command line with flutter packages get, or use your editor’s support for installing packages.
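From a terminal, that command is:

```shell
flutter packages get
```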

 

Shared preferences plugin

3. Import it
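In your Dart code, the import looks like this:

```dart
import 'package:shared_preferences/shared_preferences.dart';
```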

Write to Shared Preferences

To write to a shared preferences file, obtain a SharedPreferences instance by calling SharedPreferences.getInstance().

Pass the keys and values you want to write to methods such as setInt() and setString(). For example:
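A minimal write sketch (the key name 'counter' is illustrative):

```dart
// Increment a counter and persist it to shared preferences.
_incrementCounter() async {
  final prefs = await SharedPreferences.getInstance();
  int counter = (prefs.getInt('counter') ?? 0) + 1;
  await prefs.setInt('counter', counter);
}
```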

Read from Shared Preferences

To retrieve values from a shared preferences file, call methods such as getInt() and getString(), providing the key for the value you want, and optionally a default value to return if the key isn’t present. For example:
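A matching read sketch, using the same illustrative 'counter' key:

```dart
// Read the counter back, falling back to 0 if it was never written.
_readCounter() async {
  final prefs = await SharedPreferences.getInstance();
  int counter = prefs.getInt('counter') ?? 0;
  print('Pressed $counter times.');
}
```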

Related Post

Flutter ListView with Image and Checkbox

 

Flutter ListView with Image and Checkbox

In this tutorial, you will learn how to implement a basic single-line ListView with an Image and a Checkbox. First, create a new Flutter project in IntelliJ IDEA.

1.Adding Assets and Images in Flutter

Flutter apps can include both code and assets. An asset is a file that is bundled and deployed with your app and is accessible at runtime.

Specifying assets

Flutter uses the pubspec.yaml file, located at the root of your project, to identify assets required by an app.

The assets subsection of the flutter section specifies files that should be included with the app. Each asset is identified by an explicit path (relative to the pubspec.yaml file) where the asset file is located. The order in which the assets are declared does not matter.
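For example, assuming an image stored at assets/person.png (the path is illustrative), the pubspec.yaml entry would look like:

```yaml
flutter:
  assets:
    - assets/person.png
```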

Add assets image in flutter

The main asset is assumed to correspond to a resolution of 1.0. For example, consider the following asset layout for an image named person.png:

  • …/person.png
  • …/2.0x/person.png
  • …/3.0x/person.png

On devices with a device pixel ratio of 1.8, the asset …/2.0x/person.png would be chosen. For a device pixel ratio of 2.7, the asset …/3.0x/person.png would be chosen.

2.Create Presentation Class

We’ll create a purchase application, which displays various products offered for sale and maintains a purchase list. Let’s start by defining our presentation class, Product:
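A minimal sketch of the presentation class (the single name field is an assumption; your Product may carry more data, such as an image path):

```dart
class Product {
  const Product({this.name});
  final String name;
}
```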

3.Create List Item

ListTiles are always a fixed height (the height depends on how isThreeLine, dense, and subtitle are configured); they do not grow in height based on their contents. If you are looking for a widget that allows for arbitrary layout in a row, consider Row.

The ShoppingListItem widget follows a common pattern for stateless widgets. It stores the values it receives in its constructor in final member variables, which it then uses during its build function.
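One possible shape for ShoppingListItem, following the Flutter shopping-list example (the Product fields and the callback signature are assumptions):

```dart
typedef void CartChangedCallback(Product product, bool inCart);

class ShoppingListItem extends StatelessWidget {
  ShoppingListItem({Product product, this.inCart, this.onCartChanged})
      : product = product,
        super(key: new ObjectKey(product));

  final Product product;
  final bool inCart;
  final CartChangedCallback onCartChanged;

  @override
  Widget build(BuildContext context) {
    return new ListTile(
      onTap: () => onCartChanged(product, !inCart),
      leading: new CircleAvatar(child: new Text(product.name[0])),
      title: new Text(product.name),
      trailing: new Checkbox(
        value: inCart,
        // Notify the parent when the checkbox is toggled.
        onChanged: (bool value) => onCartChanged(product, value),
      ),
    );
  }
}
```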

4.Create Listview

The parent creates a new instance of ShoppingListItem each time it rebuilds, but that operation is cheap because the framework compares the newly built widgets with the previously built widgets and applies only the differences to the underlying render objects.

Let’s look at an example parent widget that stores mutable state:
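A sketch of such a parent widget, assuming the Product and ShoppingListItem classes from earlier:

```dart
class ShoppingList extends StatefulWidget {
  ShoppingList({Key key, this.products}) : super(key: key);

  final List<Product> products;

  @override
  _ShoppingListState createState() => new _ShoppingListState();
}

class _ShoppingListState extends State<ShoppingList> {
  Set<Product> _shoppingCart = new Set<Product>();

  void _handleCartChanged(Product product, bool inCart) {
    setState(() {
      // The cart contents are mutable state owned by this State
      // object, so they are changed inside setState().
      if (inCart)
        _shoppingCart.add(product);
      else
        _shoppingCart.remove(product);
    });
  }

  @override
  Widget build(BuildContext context) {
    return new ListView(
      padding: new EdgeInsets.symmetric(vertical: 8.0),
      children: widget.products.map((Product product) {
        return new ShoppingListItem(
          product: product,
          inCart: _shoppingCart.contains(product),
          onCartChanged: _handleCartChanged,
        );
      }).toList(),
    );
  }
}
```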

The ShoppingList class extends StatefulWidget, which means this widget stores mutable state. When the ShoppingList widget is first inserted into the tree, the framework calls the createState function to create a fresh instance of _ShoppingListState to associate with that location in the tree.

To access properties of the current ShoppingList, the _ShoppingListState can use its widget property. If the parent rebuilds and creates a new ShoppingList, the _ShoppingListState will also rebuild with the new widget value. If you wish to be notified when the widget property changes, you can override the didUpdateWidget function, which is passed oldWidget to let you compare the old widget with the current widget.

Flutter ListView With Checkbox

Download this project from GitHub

 

Building Your First Flutter App

This post teaches you how to build your first Flutter app. You’ll learn how to create a Flutter project with IntelliJ IDEA and run a debuggable version of the app. You’ll also learn some fundamentals of Flutter app design, including how to build a simple user interface and handle user input.

Before you start this, install Flutter.

1.Create a Flutter Project

Flutter Hello World

2.Create Widget

Your entire app is going to be made out of widgets. Your app is one huge widget, which has sub-widgets, which have sub-widgets, all the way down. Widgets are just like components. Flutter is a functional reactive framework, and the key idea is that you use your model data to describe templates for how your view should look in all sorts of different situations. The framework itself controls how to change between those different views. So it’s a slightly different paradigm for how you think about programming UIs.

Immutable Widgets

Rather than the more typical model-view-controller setup, where you have a controller running back and forth and syncing these two things, you’ve got your model data describing how your view should look. In practice, this gives you really nice things like hot reload. Under the covers, it also means that the widgets that are created are immutable and ephemeral. While you don’t have to worry about how to change a view with an immutable widget, this helps the developer avoid a lot of common bugs: if you have persistent, long-running UI objects, they can get into an inconsistent state, which is hard to debug. This way, programming dynamic UIs becomes much easier, and your code ends up being very modular and reusable.

1.Create Stateless widget

A stateless widget is a widget that describes part of the user interface. A stateless widget is useful when the part of the user interface you are describing does not depend on anything other than the configuration information in the object itself.
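A minimal stateless widget sketch for a hello-world app (the MyApp name is an assumption):

```dart
import 'package:flutter/material.dart';

void main() => runApp(new MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    // A stateless widget only depends on its own configuration.
    return new Center(
      child: new Text('Hello, world!', textDirection: TextDirection.ltr),
    );
  }
}
```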

2.Create Scaffold widget

Scaffold implements the basic Material Design visual layout structure. This class provides APIs for showing drawers, snack bars, and bottom sheets.

3.Appbar

An app bar consists of a toolbar, a TabBar, and a FlexibleSpaceBar. App bars are typically used in the Scaffold.appBar property, which places the app bar as a fixed-height widget at the top of the screen.

The AppBar widget lets us pass in widgets for the leading slot and the actions, as well as the title widget.

4.Using Material Design

A Material Design app starts with the MaterialApp widget, which builds a number of useful widgets at the root of your app, including a Navigator, which manages a stack of widgets identified by strings, also known as “routes”. The Navigator lets you transition smoothly between screens of your application. Using the MaterialApp widget is entirely optional but a good practice.
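Putting MaterialApp, Scaffold, and AppBar together, a hello-world sketch might look like this (titles and body text are illustrative):

```dart
import 'package:flutter/material.dart';

void main() {
  runApp(new MaterialApp(
    title: 'Flutter Tutorial',
    home: new Scaffold(
      // AppBar gives us the Material toolbar with a shadow and styled title.
      appBar: new AppBar(title: new Text('Hello World')),
      body: new Center(child: new Text('Hello, world!')),
      floatingActionButton: new FloatingActionButton(
        child: new Icon(Icons.add),
        onPressed: () {}, // no-op for now
      ),
    ),
  ));
}
```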

Flutter Hello World

Now, with the AppBar and Scaffold widgets from material.dart, our app is starting to look a bit more like Material Design. For example, the app bar has a shadow and the title text inherits the correct styling automatically. We’ve also added a floating action button for good measure.

 

 

Develop Cross Platform App With Flutter and Dart

Google’s Flutter is a fantastic way to easily develop beautiful mobile apps. Developing for iOS and Android traditionally means that your workload has just doubled, because now you have to write an iOS app and an Android app. You want to spend time adding cool new features; you don’t want to spend time writing your app twice. Flutter easily solves this problem without doubling your workload.

Flutter allows you to write your code once and have a natural-feeling app on both Android and iOS. Flutter’s development experience is fast and easy, letting you spend your time developing more cool features for your app.

Flutter also has an incredible development process with hot reload, which lets you update your code on the fly without having to restart your entire app. With the Flutter Firebase plugin, you can painlessly integrate with a remote database that uses real-time sync, you can get usage analytics, and you can easily scale up as needed.

Fantastic Flutter Features.

  • Natural look and feel for iOS and Android
  • Excellent development experience with hot reload
  • Quick to write!
  • Firebase integration

 

Install Flutter

Getting your system ready to develop Flutter is actually really easy. Just clone the Git repository and add Flutter to your path. Once that’s set up, there’s a really cool tool called flutter doctor that you run, and it will check your system to see if there are any remaining dependencies you need to install. For example, if you’re doing iOS development, you want to install the developer tools, or for Android development you want to make sure that you have Java installed. Of course, you don’t have to have both of these installed, but you probably want at least one, because you are developing a mobile app in some capacity.

Install the latest Android SDK and Android SDK Build-Tools, which are required by Flutter when developing for Android.

1.Clone the repo

The Dart SDK is bundled with Flutter, so it is not necessary to install Dart separately. Clone the repository and then add the flutter tool to your path:
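For example (the beta branch name follows the Flutter getting-started instructions of the time; the install location is up to you):

```shell
git clone -b beta https://github.com/flutter/flutter.git
export PATH=`pwd`/flutter/bin:$PATH
```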

2.Run flutter doctor

The following command checks your environment and displays a report in the terminal window. Check the output carefully for other software you may need to install or further tasks to perform (shown in bold text).
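```shell
flutter doctor
```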

The first time you run the flutter command, it downloads its own dependencies and compiles itself. Subsequent runs should be much faster.

IDE Setup

IntelliJ has really nice plugins for Flutter and Dart. They give you auto-completion and nice debugging support, and you can run your devices from IntelliJ.

1.Installing IntelliJ

You can use the IntelliJ plug-ins with one of the following JetBrains IDEs:

Android Studio (and various other JetBrains editors) is currently not supported.

2.Installing the plugins

To use IntelliJ with Flutter, you need two plugins:

  • The Flutter plugin powers Flutter developer workflows (running, debugging, hot reload, etc.).
  • The Dart plugin offers code analysis (code validation as you type, code completions, etc.).

When you install the Flutter plugin, if the Dart plugin is not already present, IntelliJ installs it for you.

Install Flutter Plugin

After restarting, the Dart and Flutter plugins should both display in the left navigation panel when you create a new project.

3.Configuring the Flutter plugin

Flutter SDK Path

 

Now you can make your changes in IntelliJ and just run either device, or both, from IntelliJ.

Firebase JobDispatcher

If you have some heavy work to do and it doesn’t need to be done right now, then I’m going to suggest you use Firebase JobDispatcher. It is guaranteed to get your job done. It operates at the system level, so it can use several factors to intelligently schedule your background work to run with the jobs from other apps as well. That means we can minimize things like radio use, which is a clear battery win. As of API 24, JobScheduler even considers memory pressure, which is a clear overall win for devices and their users. It doesn’t perform work solely based on time, but rather based on conditions. For example, you can use setRequiredNetworkType for jobs that you want to execute when the user is on a metered vs. an unmetered connection, or you can call setLifetime for jobs that you want to persist across a potential reboot.

Getting started


Installation

Add the following to your app build.gradle’s dependencies:
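The dependency looks like this (the version number is illustrative; check the firebase-jobdispatcher releases for the latest):

```gradle
dependencies {
    implementation 'com.firebase:firebase-jobdispatcher:0.8.5'
}
```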


Create Job

You can define conditions when you are creating the job through the Job object. To build that Job object you need two things every time, and the criteria are all bonus on top of that: you need a job tag, to help you distinguish which job this is, and a job service.
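A sketch of building such a Job (the tag and the MyJobService class are assumptions; dispatcher is a FirebaseJobDispatcher instance):

```java
Job myJob = dispatcher.newJobBuilder()
    // Required: the JobService that will do the work.
    .setService(MyJobService.class)
    // Required: a tag that uniquely identifies this job.
    .setTag("my-unique-tag")
    // Bonus criteria: only run on an unmetered network...
    .setConstraints(Constraint.ON_UNMETERED_NETWORK)
    // ...and persist the job across a reboot.
    .setLifetime(Lifetime.FOREVER)
    .build();
```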


Create a new JobService

Your job service is actually going to be a service that extends the JobService class, and this is where you’ll define the work that you’ll be doing. You will need to implement a few required methods.

onStartJob is called by the system when it is time for your job to execute. This is where the one tricky part about JobService exists: your JobService runs on the main thread. So use onStartJob either to perform simple work directly, or to kick the work off to something else. If you do kick off something else, you’ll need to return true; if you’re done with everything, go ahead and return false. This lets the system know whether your job has any ongoing work still.

onStopJob is called by the system if your job is canceled before being finished, perhaps because the conditions are no longer being met, like the device has been unplugged. So use this for safety checks and cleanup, and then return true if you’d like the system to reschedule the job, or false if it doesn’t matter and the job can be dropped.

jobFinished is not a method you override, and the system won’t call it; you need to be the one to call this method once your service or thread has finished working on the job, that is, if your onStartJob returned true. This is how the system knows that your work is really done and it can release your wake-lock. If you forget, your app is going to look pretty guilty in the battery stats lineup. jobFinished takes two parameters: the current job, so that it knows which one we are talking about, and a boolean indicating whether you’d like to reschedule the job, perhaps because your work failed for some reason. This will kick off the JobScheduler’s exponential backoff logic for you, or else the logic you specified in the Job.
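Putting those callbacks together, a minimal JobService might look like this (doWork is a hypothetical stand-in for your actual background work):

```java
import com.firebase.jobdispatcher.JobParameters;
import com.firebase.jobdispatcher.JobService;

public class MyJobService extends JobService {
  @Override
  public boolean onStartJob(final JobParameters job) {
    // JobService callbacks run on the main thread, so kick the
    // heavy work off to another thread.
    new Thread(new Runnable() {
      @Override
      public void run() {
        doWork(); // hypothetical: your actual background work
        // Tell the system the work is done; false = don't reschedule.
        jobFinished(job, false);
      }
    }).start();
    return true; // there is still ongoing work
  }

  @Override
  public boolean onStopJob(JobParameters job) {
    // Conditions no longer hold; ask the system to reschedule.
    return true;
  }
}
```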

Now, as with any service, you’ll need to add this one to your AndroidManifest.xml. What’s different, though, is that you need to add a permission that will allow JobScheduler to call your jobs and be the only one that can access your jobService.
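The manifest entry looks like this (the service name assumes the MyJobService class above):

```xml
<service
    android:name=".MyJobService"
    android:exported="false">
  <intent-filter>
    <action android:name="com.firebase.jobdispatcher.ACTION_EXECUTE"/>
  </intent-filter>
</service>
```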

Finally, you can schedule a job using a FirebaseJobDispatcher instance. Call schedule with the Job object you created, and you are good to go.
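A scheduling sketch (context is your Android Context; the tag and service class are the assumptions from earlier):

```java
// The GooglePlayDriver routes scheduling through Google Play services.
FirebaseJobDispatcher dispatcher =
    new FirebaseJobDispatcher(new GooglePlayDriver(context));

Job myJob = dispatcher.newJobBuilder()
    .setService(MyJobService.class)
    .setTag("my-unique-tag")
    .build();

// mustSchedule throws if scheduling is unavailable on this device.
dispatcher.mustSchedule(myJob);
```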

Conclusion

There are a lot of pieces, to be sure, and you’ll need to think carefully about when and what should trigger your job, and what happens if it fails for some reason. But overall, FirebaseJobDispatcher was designed to be easy to work with. So give it a try and go build better apps.

Firebase Performance Monitoring for Android

Waiting for things to load is part of everyone’s mobile app experience, but it’s never a good experience for your user, and how would you even know what that experience is? Your users are on a wide variety of devices, on a wide variety of networks, in a wide variety of locations all over the world. If you want to optimize the performance of your app, you need metrics that tell you exactly what’s happening during the critical moments of your app’s use, and you need that information to come directly from users. Now you can get it using Firebase Performance Monitoring. By integrating the SDK into your app, and without writing any code, the performance dashboard in the Firebase console will collect information about your app’s performance as seen by your users. You’ll get data about your app’s startup time and details about its HTTP transactions. Using the provided API, you can instrument your app to measure those critical moments that you want to understand and improve. Then, in the dashboard, you can break down the data by country, version, device, OS, radio, carrier, and MIME type.

Install Firebase SDK in APP

This guide shows you how to use Firebase Performance Monitoring with your app.

1 Prerequisites

You need a few things set up in your environment:

  • A device running Android 4.0+, and Google Play services 11.0.4+
  • The Google Play services SDK from the Google Repository, available in the Android SDK Manager
  • The latest version of Android Studio 2.2+

2 Add Firebase to your Android project

Add Firebase to your app from Android Studio using the Firebase Assistant. To open the Firebase Assistant in Android Studio:

  1. Click Tools > Firebase to open the Assistant window.
  2. Click to expand one of the listed features (for example, Analytics), then click the provided tutorial link (for example, Log an Analytics event).
  3. Click the Connect to Firebase button to connect to Firebase and add the necessary code to your app.

Firebase Performance Monitor

3 Add Performance Monitoring to your app

  • Add the following dependencies to your project-level build.gradle:

Add the following dependencies to your app-level build.gradle:

  1. Below apply plugin: 'com.android.application', add the following line:
  2. Add the following to the dependencies section:
  3. If your app uses other Firebase SDKs, you should also change the version to 11.0.4 for those SDKs.
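Together, those Gradle changes look like this (plugin and SDK versions match the 11.0.4 release discussed above; check the Firebase release notes for current versions):

```gradle
// Project-level build.gradle
buildscript {
  dependencies {
    classpath 'com.google.firebase:firebase-plugins:1.1.0'
  }
}

// App-level build.gradle, below apply plugin: 'com.android.application'
apply plugin: 'com.google.firebase.firebase-perf'

dependencies {
  compile 'com.google.firebase:firebase-perf:11.0.4'
}
```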

 

Recompile your app. Automatic traces and HTTP/S network requests are now monitored.

If you just install the SDK, you still get a few out-of-the-box traces, the most important one being app start. This is the time between when the app code starts loading and when the app is responsive to the user. Firebase also has a trace that monitors the foreground session, so any time the app is available to the user, and we do the same for the background as well.

Performance issues that can impact you as a developer fall into one of two buckets. The first one has to do with your app code performance.

So this is things like what kind of delays your users see as they interact with the app, or how many dropped frames they see in animations. To monitor that, Firebase built a feature called Traces.

The other category of issues is about network activity between your app and the backends that it uses. For that, Firebase monitors network requests.

Traces

The easiest way to define a trace is as a performance report between two points in your app. Imagine you have a record button and a stop button, and in between those two points you just pick up metrics. These two points can be anything in your app, anything that you care about and think is worth monitoring. It could be something like the time between the user tapping on one of the shoes and seeing the full product details page. It could be as fine-grained as a database fetch to get a piece of string and put it on the screen, or it could be as long as the full checkout flow.

All you need to do is create a trace object, give it a name, start it, and stop it. The name you give it becomes your context.
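A minimal trace sketch (the trace name "checkout_flow" is illustrative):

```java
Trace myTrace = FirebasePerformance.getInstance().newTrace("checkout_flow");
myTrace.start();
// ... the work you want to measure ...
myTrace.stop();
```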

Add the @AddTrace annotation to trace specific methods

You can add the @AddTrace annotation to methods in your app and provide a string to identify the resulting trace. This causes a trace to start at the beginning of this method, and to stop when the method completes. Traces created in this way do not have counters available.
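For example, to trace an Activity's onCreate (the trace name is illustrative; enabled is optional and defaults to true):

```java
@Override
@AddTrace(name = "onCreateTrace", enabled = true)
protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  // ... view setup you want included in the trace ...
}
```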

Trace Metrics

In terms of metrics, by default you get the trace duration, that is, how much time passed between the start and the stop. So essentially, you can use it as a timer. But it’s actually more than that, because you can attach custom metrics using the counter API, which you see an example of here. Think of any countable events that are relevant to your performance: how many times you called the local disk, how many times you called the GPS, how many times you dropped a frame, how many times you made a network call, and so on.

 

For any one of these events, you can just give it a name and increment it, and Firebase will tally up those events and report them as custom metrics attached to your trace.
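A counter sketch using the API of this SDK generation (later SDK versions renamed incrementCounter to incrementMetric; the trace and counter names are illustrative):

```java
Trace myTrace = FirebasePerformance.getInstance().newTrace("image_processing");
myTrace.start();
// Tally a countable event each time it happens:
myTrace.incrementCounter("frame_drop");
myTrace.stop();
```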

Network Monitor

An HTTP/S network request is a report that captures the time between when your app issues a request to a service endpoint and when the response from that endpoint is complete. For any endpoint that your app makes a request to, the SDK will capture several metrics:

  • Response time: Time between when the request is made and when the response is fully received
  • Payload size: Byte size of the network payload downloaded and uploaded by the app
  • Success rate: Percentage of successful responses compared to total responses (to measure network or server failures)

 

Disable the Firebase Performance Monitoring SDK

You can disable the Performance Monitoring SDK when building your app with the option to re-enable it at runtime, or build your app with Performance Monitoring enabled and then have the option to disable it at runtime using Firebase Remote Config. You can also completely deactivate Performance Monitoring, with no option to enable it at runtime.

Disable during your app build process

Disabling Performance Monitoring during your app build process can be useful to avoid reporting performance data from a pre-release version of your app during development and testing.

Add the following property to your app’s gradle.properties file to disable automatic traces and HTTP/S network request monitoring (but not custom traces) at build time:
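The property is:

```
firebasePerformanceInstrumentationEnabled=false
```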

Changing this property to true re-enables automatic traces and HTTP/S network request monitoring.

Allow your app to enable it at runtime, by adding a <meta-data> element to the <application> element of your app’s AndroidManifest.xml file, as follows:
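```xml
<application>
  <meta-data
      android:name="firebase_performance_collection_enabled"
      android:value="false" />
</application>
```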

To completely deactivate Performance Monitoring with no option to enable it at runtime, add a <meta-data> element to the <application> element of your app’s AndroidManifest.xml file, as follows:
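```xml
<application>
  <meta-data
      android:name="firebase_performance_collection_deactivated"
      android:value="true" />
</application>
```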

 

Related Post

Phone Number Authentication with Firebase Auth

Crashlytics

Memory Leak

 

Train your Object Detection model locally with TensorFlow

In this post, we’re going to train machine learning models capable of localizing and identifying multiple objects in an image. You’ll need to install TensorFlow and you’ll need to understand how to use the command line.

Tensorflow Object Detection API

The TensorFlow Object Detection API is an open source framework built on top of TensorFlow that makes it easy to construct, train and deploy object detection models.

This post walks through the steps required to train an object detection model locally.

1.Cloning an Object Detection API repository

or you can download the ZIP file directly.
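The repository can be cloned with:

```shell
git clone https://github.com/tensorflow/models.git
```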

2.Installation

Tensorflow Object Detection API depends on the following libraries.

  • Protobuf 2.6
  • Pillow 1.0
  • Lxml

  • Jupyter notebook
  • Matplotlib

The Tensorflow Object Detection API uses Protobufs to configure model and training parameters. Before the framework can be used, the Protobuf libraries must be compiled. This should be done by running the following command from the tensorflow/models directory:
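The Protobuf compilation step is run from the tensorflow/models directory:

```shell
# From tensorflow/models/
protoc object_detection/protos/*.proto --python_out=.
```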

Add Libraries to PYTHONPATH

When running locally, the tensorflow/models/ and slim directories should be appended to PYTHONPATH. This can be done by running the following from tensorflow/models/:
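```shell
# From tensorflow/models/
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
```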

Note: This command needs to run from every new terminal you start. If you wish to avoid running this manually, you can add it as a new line to the end of your ~/.bashrc file.

Testing the Installation

You can test that you have correctly installed the Tensorflow Object Detection API by running the following command:
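```shell
python object_detection/builders/model_builder_test.py
```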

The above command generates the following output.

Install Object Detection API

3.Preparing Inputs

The Tensorflow Object Detection API reads data using the TFRecord file format. Two sample scripts (create_pascal_tf_record.py and create_pet_tf_record.py) are provided to convert your dataset to TFRecords.

Directory Structure for Training input data

  • To prepare the input file for the sample scripts you need two things. Firstly, you need an RGB image encoded as jpg or png, and secondly, you need a list of bounding boxes (xmin, ymin, xmax, ymax) for the image along with the class of the object in each bounding box.
  • I scraped 200 pet images from Google Images. Here is a subset of the pet image dataset that I collected in the images folder:

 

Afterward, I labeled them manually with LabelImg. LabelImg is a graphical image annotation tool written in Python. It’s super easy to use, and the annotations are saved as XML files. Save the image annotation XML files in the /annotations/xmls folder.

Image Annotation

Create trainval.txt in the annotations folder, which contains the names of the images without extensions. Use the following command to generate trainval.txt.
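A self-contained sketch of one way to do it (the sample filenames are placeholders; run only the last two commands against your real images folder):

```shell
# Sample layout (replace with your real dataset):
mkdir -p images annotations
touch images/Abyssinian_1.jpg images/Abyssinian_2.jpg

# Strip the .jpg extension from each filename into trainval.txt:
ls images | grep ".jpg" | sed s/.jpg// > annotations/trainval.txt
cat annotations/trainval.txt
```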

Label Maps

Each dataset is required to have a label map associated with it. This label map defines a mapping from string class names to integer class IDs. Label maps should always start from id 1. Create a label.pbtxt file with the following label map:
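With a single class (num_classes is set to one later in the config), the label map might look like this (the class name 'pet' is an assumption):

```
item {
  id: 1
  name: 'pet'
}
```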

Generating the Pet TFRecord files.

Run the following commands.
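A sketch of the conversion command (the label map and data paths are assumptions based on the layout above):

```shell
# From tensorflow/models/
python object_detection/create_pet_tf_record.py \
    --label_map_path=annotations/label.pbtxt \
    --data_dir=`pwd` \
    --output_dir=`pwd`
```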

You should end up with two TFRecord files named pet_train.record and pet_val.record in the tensorflow/models directory.

4.Training the model

After creating the required input files for the API, you can now train your model. For training, you need the following command:
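```shell
# From tensorflow/models/
python object_detection/train.py \
    --logtostderr \
    --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
    --train_dir=${PATH_TO_TRAIN_DIR}
```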

The object detection training pipeline also provides sample config files in the repo. For my training, I used ssd_mobilenet_v1_pets.config as a basis. I needed to adjust num_classes to one, and also set the paths (PATH_TO_BE_CONFIGURED) for the model checkpoint, the train and test data files, and the label map. For other configurations like the learning rate, batch size, and many more, I used their default settings.

Running the Evaluation Job

Evaluation is run as a separate job. The eval job will periodically poll the train directory for new checkpoints and evaluate them on a test dataset. The job can be run using the following command:
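```shell
python object_detection/eval.py \
    --logtostderr \
    --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
    --checkpoint_dir=${PATH_TO_TRAIN_DIR} \
    --eval_dir=${PATH_TO_EVAL_DIR}
```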

where ${PATH_TO_YOUR_PIPELINE_CONFIG} points to the pipeline config, ${PATH_TO_TRAIN_DIR} points to the directory in which training checkpoints were saved (same as the training job), and ${PATH_TO_EVAL_DIR} points to the directory in which evaluation events will be saved. As with the training job, the eval job runs until terminated by default.

Running Tensorboard

Progress for training and eval jobs can be inspected using Tensorboard. If using the recommended directory structure, Tensorboard can be run using the following command:
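```shell
tensorboard --logdir=${PATH_TO_MODEL_DIRECTORY}
```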

where ${PATH_TO_MODEL_DIRECTORY} points to the directory that contains the train and eval directories. Please note it may take Tensorboard a couple of minutes to populate with data.

5.Exporting the Tensorflow Graph

After your model has been trained, you should export it to a Tensorflow graph proto. First, you need to identify a candidate checkpoint to export. The checkpoint will typically consist of three files in the pet folder:

  1. model.ckpt-${CHECKPOINT_NUMBER}.data-00000-of-00001
  2. model.ckpt-${CHECKPOINT_NUMBER}.index
  3. model.ckpt-${CHECKPOINT_NUMBER}.meta

Run the following command to export the Tensorflow graph. Change the checkpoint number to that of your candidate checkpoint.
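A sketch of the export command (the checkpoint prefix assumes the pet folder above; the output directory name is illustrative):

```shell
# From tensorflow/models/
python object_detection/export_inference_graph.py \
    --input_type image_tensor \
    --pipeline_config_path ${PATH_TO_YOUR_PIPELINE_CONFIG} \
    --trained_checkpoint_prefix pet/model.ckpt-${CHECKPOINT_NUMBER} \
    --output_directory output_inference_graph
```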

Related Post

Introduction TensorFlow Machine Learning Library

TensorFlow Lite

Train Image classifier with TensorFlow

Android TensorFlow Machine Learning