Eager Execution: A Pythonic Way of Using TensorFlow

When you enable eager execution in TensorFlow, operations are executed immediately as they are called from Python.

Eager execution is an imperative, object-oriented, Pythonic way of using TensorFlow. Traditionally, TensorFlow has been a graph execution engine for machine learning.

Why Graph Execution?


A really good reason is that your computation is represented as a platform-independent graph. Once you have that graph, it is very easy to apply automatic differentiation to it.

If you have a platform-independent, abstract representation of your computation, you can deploy it to pretty much anything you want. You can run it on a TPU, run it on a GPU, or put it on a phone or a Raspberry Pi, enabling all sorts of deployment scenarios. It is really valuable to have this kind of platform-independent view.

Compilers work with dataflow graphs internally, and they know how to do all sorts of optimizations that rely on having a global view of the computation, such as constant folding, common subexpression elimination, and data layout.

A lot of these optimizations are specific to deep learning. For example, the compiler can choose how to lay out your channels, height, and width so that your convolutions run faster.

Another very important reason is that once you have a platform-independent representation of your computation, you can deploy it and distribute it across hundreds of machines, or on a TPU.

Why Eager Execution?


If graphs are so good, what made us think it is now a good idea to move beyond them and let you use eager execution? There are a few reasons.

With eager execution you can just build up a trace as you go and then walk back the trace to compute gradients.

You can iterate a lot more quickly, and you can play with your model as you build it.

You can inspect it, poke and prod at it, and this lets you be more productive as you make all these changes.

You can run your model under debuggers and profilers and attach all sorts of analysis tools to really understand what it is doing and how it is doing it.

If you are not forced to represent your computation in a way that is separate from the host programming language, you can just use all the machinery of that language for control flow, data flow, and complicated data structures, which for some models is key to making the model work at all.

Enable Eager Execution


You import TensorFlow and call tf.enable_eager_execution(). Once you do that, any time you run a TensorFlow operation, such as a matrix multiplication, it runs immediately instead of building a graph that is executed later. TensorFlow runs the matrix multiplication right away and gives you the result, which you can print, slice, dice, or do whatever else you want with.
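A minimal sketch of what this looks like, assuming the TensorFlow 1.x API where eager execution is switched on explicitly:

import tensorflow as tf

tf.enable_eager_execution()

x = [[2.0]]
m = tf.matmul(x, x)   # runs immediately, no session or graph needed
print(m)              # tf.Tensor([[4.]], shape=(1, 1), dtype=float32)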

Control Flow

Because everything happens immediately, you can have highly dynamic control flow that depends on the actual values of the computation you are executing: simple if conditions, or while loops whose behavior depends on values computed along the way. This runs just fine on whatever device you have. See the sketch below.
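As an illustration (my own sketch, not code from the original post), here is a loop whose number of iterations depends on the values being computed:

import tensorflow as tf
tf.enable_eager_execution()

a = tf.constant(12)
steps = 0
while not tf.equal(a, 1):        # the loop condition depends on a tensor's value
    if tf.equal(a % 2, 0):
        a = a // 2
    else:
        a = 3 * a + 1
    steps += 1
print(steps)                     # 9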

TensorFlow also gives you a few symbols that make it easier to write code that works both when building graphs and when executing eagerly.
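One example of code that runs in both modes (my own illustration, not necessarily the exact symbols the author had in mind) is a tf.keras layer, which returns a concrete value under eager execution and a symbolic tensor when building a graph:

import tensorflow as tf

layer = tf.keras.layers.Dense(10)
output = layer(tf.zeros([4, 8]))   # concrete in eager mode, symbolic in graph mode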

Gradients

Different operations can occur during each call. TensorFlow records all forward operations to a tape, which is then played backward when computing gradients. After it has computed the gradients, it discards the tape.
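A minimal sketch of this tape mechanism; depending on your TensorFlow version the class lives at tf.GradientTape or tf.contrib.eager.GradientTape:

import tensorflow as tf
tf.enable_eager_execution()

x = tf.constant(3.0)
with tf.GradientTape() as tape:   # forward operations are recorded on the tape
    tape.watch(x)                 # constants must be watched explicitly
    y = x * x
dy_dx = tape.gradient(y, x)       # the tape is played backward, then discarded
print(dy_dx)                      # tf.Tensor(6.0, ...)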

The gradients_function call takes a Python function, square(), as an argument and returns a Python callable that computes the partial derivatives of square() with respect to its inputs. So, to get the derivative of square() at 10.0, you invoke grad(10.0), which is 20.0.
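A small sketch of that, assuming gradients_function is taken from tf.contrib.eager, its TensorFlow 1.x location:

import tensorflow as tf
tf.enable_eager_execution()
tfe = tf.contrib.eager

def square(x):
    return tf.multiply(x, x)

grad = tfe.gradients_function(square)

print(square(10.))   # 100.0
print(grad(10.))     # a list containing the derivative at 10.0, i.e. 20.0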

Loops

Writing loops in eager mode is also very easy and straightforward. You can just use a Python for loop to iterate over your datasets; datasets work fine in eager mode, with the same high performance you get from the graph execution engine. Then you can compute your predictions, compute your gradients, apply your gradients, and do everything else you are used to doing.
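A rough sketch of such a training loop; the toy data, layer, and optimizer below are purely illustrative, and in older 1.x releases you may need tf.contrib.eager.Iterator(dataset) instead of iterating over the dataset directly:

import tensorflow as tf
tf.enable_eager_execution()

features = tf.random_normal([100, 3])
labels = tf.random_normal([100, 1])
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(32)

model = tf.keras.layers.Dense(1)
optimizer = tf.train.GradientDescentOptimizer(0.01)

for x, y in dataset:                       # a plain Python for loop over batches
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))
    grads = tape.gradient(loss, model.variables)
    optimizer.apply_gradients(zip(grads, model.variables))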

Debugging

When eager execution is enabled, you can take any model code and add a line to drop into the Python debugger wherever you want. Once you are in the Python debugger, you have the full power of debugging available: you can print the value of anything, change the value of any tensor, and run any operation you want on any tensor. This will hopefully empower you to really understand what is going on in your models and fix any problems you have. You can also take eager execution code and profile it using whatever profiling tool you like.
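For example, a trivial sketch of dropping into pdb in the middle of an eager computation:

import pdb
import tensorflow as tf
tf.enable_eager_execution()

def debug_me(x):
    y = tf.matmul(x, x)
    pdb.set_trace()        # inspect y.numpy(), run more ops, change values, ...
    return y + 1.0

debug_me([[2.0]])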

Variables are Objects


A big change when programming with eager execution, compared to graphs, is how variables work. Variables in TensorFlow are usually a complicated thing to think about, but when eager execution is enabled it is much simpler: a TensorFlow variable is just a Python object. You create one, you have it; you can write to it, change its value, and read its value. When the last reference to it goes away, you get your memory back, even if it is GPU memory. So if you want to share variables, you just reuse those objects; you don't worry about variable scopes or any other complicated structure. And because TensorFlow takes this object-oriented approach to variables, it can rethink some of its APIs in a way that is a little friendlier.
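A small sketch, assuming the TensorFlow 1.x contrib location for eager variables (tf.contrib.eager.Variable; in newer releases plain tf.Variable behaves the same way):

import tensorflow as tf
tf.enable_eager_execution()
tfe = tf.contrib.eager

v = tfe.Variable(1.0)      # just a Python object
v.assign_add(2.0)          # write / change its value
print(v.numpy())           # 3.0 -- read its value
del v                      # last reference gone: memory (even GPU memory) is freed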

Object-oriented Saving


TensorFlow also gives you a way to do object-oriented saving of TensorFlow models. If you have looked at TensorFlow checkpoints, you know that they depend on variable names, and variable names depend not just on the name you give a variable but on all the other variables present in your graph. This can make it hard to save and load subsets of your model and really control what is in your checkpoint. TensorFlow introduces a completely object-oriented, Python-object-based saving API: any variable that is reachable from your model gets saved with your model, and you can save or load any subset of your model.
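A minimal sketch using the object-based tf.train.Checkpoint API (in early TensorFlow 1.x releases this lived at tf.contrib.eager.Checkpoint); the model and optimizer here are just illustrative:

import tensorflow as tf
tf.enable_eager_execution()

model = tf.keras.layers.Dense(1)
model(tf.zeros([1, 3]))    # build the layer's variables
optimizer = tf.train.GradientDescentOptimizer(0.01)

# Anything reachable from the checkpoint object (model, optimizer, ...) is saved.
checkpoint = tf.train.Checkpoint(model=model, optimizer=optimizer)
save_path = checkpoint.save('/tmp/eager-ckpt')

# Later, or in another program, restore whatever subset you care about.
checkpoint.restore(save_path)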

Conclusion

With eager execution, TensorFlow brings you a lot of new features that make it easier to build and execute models. These features are compatible with both eager execution and graph building.

 

 
