
TensorFlow 1.x and 2.x

Computational graphs

A computational graph is a network of nodes and edges. In it, all the data to be used – that is, tensor objects (constants, variables, placeholders) – and all the computations to be performed – that is, operation objects – are defined. Each node can have zero or more inputs but only one output. Nodes in the network represent objects (tensors and operations), and edges represent the tensors that flow between operations. The computational graph defines the blueprint of the neural network, but the tensors in it have no "value" associated with them yet.

A placeholder is simply a variable to which we will assign data at a later time. It allows us to create our computational graph without needing the data.
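As a minimal sketch of this idea (assuming TensorFlow 2.x, where the 1.x graph-mode API is available through the `compat.v1` module), a placeholder lets us build the graph first and feed the data only at execution time:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # restore TF 1.x graph-mode behavior

# Declare only the dtype and shape of the future data; no values yet.
x = tf.placeholder(tf.float32, shape=(3,), name="x")
y = x * 2.0  # the graph is defined without any actual data

with tf.Session() as sess:
    # The data is bound to the placeholder only when the graph runs.
    result = sess.run(y, feed_dict={x: [1.0, 2.0, 3.0]})

print(result)  # -> [2. 4. 6.]
```

Note that evaluating `y` without supplying a `feed_dict` entry for `x` would raise an error, since the placeholder has no value of its own.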

To build a computational graph, we define all the constants, variables, and operations that we need to perform. In the following sections, we describe the structure using a simple example of defining and executing a graph to add two vectors.
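The definition step of that example might look as follows. This is a sketch using the TF 1.x-style API via `compat.v1`; the vector values are illustrative:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # use TF 1.x graph mode

# Define the graph: two constant tensor objects and one operation object.
v_1 = tf.constant([1, 2, 3, 4])
v_2 = tf.constant([5, 6, 7, 8])
v_add = tf.add(v_1, v_2)  # nothing is computed yet; this only adds a node
```

At this point `v_add` is an abstract node in the default graph: it knows its shape and dtype, but it holds no values until the graph is executed in a session.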

Execution of the graph

The execution of the graph is performed using the session object, which encapsulates the environment in which tensor and operation objects are evaluated. This is the place where actual calculations and transfers of information from one layer to another take place. The values of different tensor objects are initialized, accessed, and saved only in a session object. Until this point, the tensor objects were just abstract definitions. Here, they come to life.
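Continuing the vector-addition sketch (again assuming the `compat.v1` API and illustrative values), the session is where the addition actually happens:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # use TF 1.x graph mode

# Definition: build the graph.
v_1 = tf.constant([1, 2, 3, 4])
v_2 = tf.constant([5, 6, 7, 8])
v_add = tf.add(v_1, v_2)

# Execution: the session evaluates the node and produces concrete values.
with tf.Session() as sess:
    result = sess.run(v_add)

print(result)  # -> [ 6  8 10 12]
```

Using `with` ensures the session is closed and its resources released once the computation is done.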

Why do we use graphs at all?

There are multiple reasons why we use graphs. First of all, they are a natural metaphor for describing (deep) networks. Secondly, graphs can be automatically optimized by removing common sub-expressions, by fusing kernels, and by cutting redundant expressions. Thirdly, graphs can be distributed easily during training, and be deployed to different environments such as CPUs, GPUs, or TPUs, and also the likes of cloud, IoT, mobile, or traditional servers. Finally, computational graphs are a common concept in functional programming, where complex computations are seen as compositions of simple primitives. TensorFlow borrowed many concepts from computational graphs, and internally it performs several optimizations on our behalf.

