TF Function and AutoGraph (Part II)

Prince Canuma
Mar 27, 2019


Taking one step at a time on the thousand-mile journey is how you improve and evolve. In this article we are going to take a look at some of the newest TensorFlow 2.0 alpha features presented at the TF Dev Summit ’19.

TensorFlow (TF) was known for being a very hard AI/ML framework to get started with, TF Core to be more specific. It really felt like you had to do a lot just to get started, and you had to pull some amazing stunts whenever you debugged the code.

“Programming is the art of algorithm design and the craft of debugging errant code.”— Ellen Ullman

TF was still a newborn baby that Google wanted to be perfect, which is what made the team less efficient compared to TF’s evil twin PyTorch, which literally torched TF’s reputation, first among the AI research community and then throughout the industry. TF was born with too many features that weren’t needed, because Python already offers a great base to build on, and some of them, like session.run, simply didn’t make sense.

Like any team, TF’s team is composed of humans, I would dare to say very intelligent humans, yet no one can escape the natural course of evolution. It took some time, maybe a few years, a bit of hiring here and firing there, but version 2.0 is finally here, new and improved.

“Building technical systems involves a lot of hard work and specialized knowledge: languages and protocols, coding and debugging, testing and refactoring.” — Jesse James Garrett

They really stepped it up, first with TensorFlow Lite, which I have an article about, and today I’m going to give you an interesting view of some of the other changes they made.

TF function

A new way of writing TF code

This is a tool intended to help us create graphs automatically out of our programs; it also gives us peak performance and makes our AI/ML models easily deployable to any device. The cherry on top of the cake is that the TF dev team claims you can get up to a 10x speedup.

The above explanation on its own doesn’t say much, so let’s break it down and try to understand the essence of it.

  • First, what are graphs?

TensorFlow uses a dataflow graph that represents all the operations our model is going to perform and all the dependencies between individual operations. As the TF docs put it:

In a dataflow graph, the nodes represent units of computation, and the edges represent the data consumed or produced by a computation.
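To make that concrete, here is a minimal sketch of a tiny computation and the nodes and edges it implies (plain TF ops, invented for illustration):

```python
import tensorflow as tf

# Two constants feeding a matmul, which feeds an add:
# each operation (MatMul, Add) is a node in the dataflow graph,
# and the tensors flowing between them are the edges.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 0.0], [0.0, 1.0]])

c = tf.matmul(a, b)  # node: MatMul, consumes a and b
d = c + 1.0          # node: Add, consumes the edge produced by MatMul
```

In TF 2.0 these ops run eagerly by default; wrapping them in a tf.function (covered below) is what turns them into an actual graph.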

  • What is dataflow?

Dataflow is a programming paradigm that models a program as a directed graph of data flowing between operations. This paradigm is common in parallel and distributed computing (for both training and execution) and is mainly used in numerical processing frameworks such as TF.

Furthermore, using dataflow makes several things easier:

  • Parallelism. By using edges to represent dependencies between operations, it is easy for the system to identify operations that can execute in parallel.
  • Distributed execution. By using edges to represent the values that flow between operations, it is possible for TensorFlow to partition your program across multiple devices (CPUs, GPUs, and TPUs) attached to different machines. TensorFlow inserts the necessary communication and coordination between devices.
  • Compilation. TensorFlow’s XLA compiler can use the information in your dataflow graph to generate faster code, for example, by fusing together adjacent operations.
  • Portability. The dataflow graph is a language-independent representation of the code in your model. You can build a dataflow graph in Python, store it in a SavedModel, and restore it in a C++ program for low-latency inference (see the sketch just after this list).
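As a minimal illustration of the portability point, here is a sketch of saving a graph from Python as a SavedModel and restoring it (the Doubler module and the /tmp path are made up for the example; the restore is shown in Python for brevity, but the same SavedModel could be loaded from C++):

```python
import tensorflow as tf

class Doubler(tf.Module):
    # The input_signature pins down the graph that gets saved.
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def __call__(self, x):
        return x * 2.0

model = Doubler()
tf.saved_model.save(model, "/tmp/doubler")      # language-independent graph on disk

restored = tf.saved_model.load("/tmp/doubler")  # restore the graph, no Python class needed
print(restored(tf.constant([1.0, 2.0])))        # tf.Tensor([2. 4.], ...)
```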

Now, how do I use tf.function to help me build this amazing thing called graphs that literally makes AI easily available to the masses?

Note: I assume you have some knowledge of Python programming.

Here is an example of a TF function (a minimal sketch; the body is just an ordinary matmul-plus-ReLU computation, but any plain Python function works):
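```python
import tensorflow as tf

@tf.function  # this one decorator is all it takes to get a graph
def simple_nn_layer(x, y):
    return tf.nn.relu(tf.matmul(x, y))

x = tf.random.uniform((3, 3))
y = tf.random.uniform((3, 3))

# Called like a normal Python function, but executed as a TF graph.
print(simple_nn_layer(x, y))
```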

Now, this looks gorgeous. In fact, what’s happening in the background is that @tf.function is creating the equivalent graph code for us automatically. This was a must-have feature for a long time, and it’s finally here; previously we had to build the graph ourselves and, on top of that, needed a lot of boilerplate code. It was a nightmare worse than dealing with Freddy Krueger.

Now you can have the flexibility and modularity of Python functions within TF; you just have to decorate them with @tf.function.

TF function is a nifty tool: it understands the order in which we want our code to execute (an advantage of building on Python), and it can then optimize variable creation and utilization.
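Here is a minimal sketch of what that means in practice (the increment counter below is invented for illustration): stateful operations inside a tf.function run in the order you wrote them, and variables are created once and reused across calls.

```python
import tensorflow as tf

v = tf.Variable(1.0)  # created once, outside the traced function

@tf.function
def increment(amount):
    v.assign_add(amount)   # stateful op; tf.function preserves its order
    return v.read_value()

print(increment(tf.constant(2.0)))  # 3.0
print(increment(tf.constant(2.0)))  # 5.0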

With this, here is some even better news for those of you who knew and worked with TF 1.x before reading this article:

  • No more Session.run to run your code; you can just run it like normal Python code.
  • No more tf.global_variables_initializer.
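For those who never experienced the old workflow, here is a rough before-and-after sketch; the TF 1.x half is left in comments and written from memory of the 1.x API, so treat it as illustrative rather than exact:

```python
import tensorflow as tf

# TF 1.x style (roughly):
#
#   x = tf.placeholder(tf.float32)
#   y = x * 2.0
#   with tf.Session() as sess:
#       sess.run(tf.global_variables_initializer())
#       print(sess.run(y, feed_dict={x: 3.0}))

# TF 2.0 style: decorate and call like normal Python.
@tf.function
def double(x):
    return x * 2.0

print(double(tf.constant(3.0)))  # tf.Tensor(6.0, shape=(), dtype=float32)
```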

AutoGraph

The hidden hero that acts in the dark

AutoGraph is a library that is fully and deeply integrated with @tf.function, and it rewrites conditionals and loops (if, while, for, break, continue, etc.) that depend on tensors so they run dynamically in the graph.

Simplifying things, we can say a tensor is an N-dimensional array: a scalar, a vector, a matrix, or a higher-dimensional generalization of these. Tensors are the beating heart of every deep learning library out there.

In TF, all tensor ops are represented as tf.<operation_name> calls. What AutoGraph does is take every tf.function that has conditionals and/or loops (control flow) that use a tensor variable or operation, or depend on one, and convert/rewrite it into graph-ready code (low-level TF code). The control flow is only changed when it is predicated on tensors.

Here is a great example of it, adapted from the TF docs (summing the even numbers in a tensor, with tensor-dependent for and if statements):
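```python
import tensorflow as tf

@tf.function
def sum_even(items):
    s = 0
    for c in items:        # loop over a tensor: AutoGraph rewrites it
        if c % 2 > 0:      # condition on a tensor: rewritten as well
            continue
        s += c
    return s

print(sum_even(tf.constant([10, 12, 15, 20])))  # tf.Tensor(42, ...)

# To peek at the graph code AutoGraph generates (see the warning below):
print(tf.autograph.to_code(sum_even.python_function))
```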

I’m not responsible for any damage to your brain if you choose to inspect the generated code.

If you never experienced TF 1.x you might not fully understand what I’m talking about, but I’ll tell you this: life as an AI software company, engineer, or researcher has been made easier and better by such improvements to the best AI-for-production framework out there.

TF 2.0 is in alpha now, but please go to their website and give it a try; it has progressed a lot and it’s looking gorgeous.

Summary:

Things to remember:

  • TF function lets you leverage normal Python functions and all of their features; on top of that, it creates the dataflow graph automatically for you, which is what’s needed to execute operations in parallel and distribute operation loads across devices (CPUs, GPUs, or TPUs).
  • TF function also improves the speed at which your functions execute (the TF team claims up to 10x), and with the dataflow graph you can run your model on any device.
  • AutoGraph is a library deeply integrated with TF function that converts all your control flow code (if, while, for, etc.) that uses or depends on any tensor, rewriting it into graph-optimized code so it can run dynamically in the graph.

All of these are improvements that the TF community longed for. Even I wanted this change, because before you had to write a lot of boilerplate code that didn’t make much sense to me; I learned Python to get started in machine learning, and after TF 1.x I felt like I had to forget what I had learned in Python and learn TF instead.

This article is part of a series where I will be sharing the highlights of the TF Dev Summit ’19, plus a surprise spin-off article where I will share a new AI race.

Thank you for reading. If you have any thoughts, comments, or critiques, please comment down below.

Follow me on Twitter at Prince Canuma so you can always be up to date with the AI field.

If you like it and relate to it, please give me a round of applause 👏👏👏 (+50) and share it with your friends.
