
Both PyTorch and TensorFlow 2 styles are inherited from Chainer, another "Powerful, Flexible, and Intuitive Framework for Neural Networks" (see https://chainer.org/).

The good news is that TensorFlow 2.x natively supports "eager execution." There is no longer any need to first statically define a computational graph and then execute it (unless you really want to!). All models can be dynamically defined and immediately executed. Further good news is that all of the tf.keras APIs are compatible with eager execution. Transparently, TensorFlow 2.x creates a bridge between the core TensorFlow, PyTorch, and Keras communities, taking the best of each of them.
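
As a quick illustration (a sketch of mine, not one of the book's listings), the following snippet shows eager execution at work: operations return concrete values the moment they are called, with no graph-building or session step:

import tensorflow as tf

# Eager execution: operations run immediately and return concrete tensors,
# with no explicit graph definition or session.
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
y = tf.matmul(x, x)
print(y)   # the result is available right away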

AutoGraph

Even more good news is that TensorFlow 2.0 natively supports imperative Python code, including control flow such as if and while, print(), and other Python-native features, and can natively convert it into pure TensorFlow graph code. Why is this useful? Python coding is very intuitive, and there are generations of programmers used to imperative programming who would struggle to convert their code into a graph format, which is typically much faster and allows for automatic optimization. This is where AutoGraph comes into play: AutoGraph takes eager-style Python code and automatically converts it to graph-generating code. So, again, TensorFlow 2.x transparently creates a bridge between imperative, dynamic, eager Python-style programming and efficient graph computation, taking the best of both worlds.
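
As a hedged sketch (my own example, not taken from the book), the following function contains an ordinary Python while loop; when it is wrapped with the tf.function decorator introduced just below, AutoGraph rewrites the loop as graph-level control flow:

import tensorflow as tf

# Plain imperative Python: a while loop accumulating a sum.
# Under @tf.function, AutoGraph converts it into a graph-level loop.
@tf.function
def sum_below(limit):
    total = tf.constant(0)
    i = tf.constant(0)
    while i < limit:
        total += i
        i += 1
    return total

print(sum_below(tf.constant(5)))   # tf.Tensor(10, shape=(), dtype=int32)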

Using AutoGraph is extremely easy: the only thing that you need to do is to annotate your Python code with the special decorator tf.function, as in the following code example:

import tensorflow as tf

def linear_layer(x):
    return 3 * x + 2

@tf.function
def simple_nn(x):
    return tf.nn.relu(linear_layer(x))

def simple_function(x):
    return 3 * x

If we inspect simple_nn, we see that it is a special handler for interacting with TensorFlow internals, while simple_function is a normal Python handler:

>>> simple_nn
<tensorflow.python.eager.def_function.Function object at 0x10964f9b0>
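
To see the difference in practice, here is a minimal usage sketch (the input values are mine, not from the book): the decorated simple_nn is called like any Python function, but TensorFlow traces it into a graph on the first call, while simple_function runs as ordinary Python:

x = tf.constant([[-1.0, 2.0]])
print(simple_nn(x))        # tf.Tensor([[0. 8.]], shape=(1, 2), dtype=float32)
print(simple_function(2))  # 6: plain Python, no TensorFlow involved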

