To retrieve the value, we need to create a session and explicitly run the graph for the desired tensor, as in the following code:
print(sess.run(t_1)) # Will print the value of t_1 defined in step 1
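For instance, assuming t_1 was defined in step 1 as a simple constant (we recreate it here only to make the sketch self-contained), the full pattern looks like this:

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

t_1 = tf.constant([1, 2, 3, 4])  # step 1: define the tensor
with tf.Session() as sess:       # step 2: create a session and run the graph
    print(sess.run(t_1))         # prints: [1 2 3 4]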
An example of TensorFlow 1.x in TensorFlow 2.x
We can see that the TensorFlow 1.x APIs offer flexible ways to create and manipulate computational graphs representing (deep) neural networks and many other types of machine learning programs. TensorFlow 2.x, on the other hand, offers higher-level APIs that abstract away more of the low-level details. To conclude, let's go back to the example TensorFlow 1.x program that we encountered in the previous chapter. Here, we also add a line that writes out the computational graph so that it can be visualized:
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# A placeholder is fed with actual data at execution time
in_a = tf.placeholder(dtype=tf.float32, shape=(2))

def model(x):
    with tf.variable_scope("matmul"):
        W = tf.get_variable("W", initializer=tf.ones(shape=(2,2)))
        b = tf.get_variable("b", initializer=tf.zeros(shape=(2)))
        return x * W + b

out_a = model(in_a)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    outs = sess.run([out_a],
                    feed_dict={in_a: [1, 0]})
    # Save the graph so that TensorBoard can display it
    writer = tf.summary.FileWriter("./logs/example", sess.graph)
Note that the expression x * W + b is just the linear perceptron defined in the previous chapter. Now let's start TensorBoard, a visualization application, to display the computational graph:
tensorboard --logdir=./logs/example/
And let's open a browser pointing to http://localhost:6006/#graphs&run=.
You should see something similar to the following graph:

Figure 1: An example of a computational graph

This section provided an overview of TensorFlow 1.x programming paradigms. Now, let's turn our attention to what is new in TensorFlow 2.x.

Understanding TensorFlow 2.x

As discussed, TensorFlow 2.x recommends using a high-level API such as tf.keras, but keeps the low-level APIs typical of TensorFlow 1.x for when there is a need for more control over internal details. tf.keras and TensorFlow 2.x come with some great benefits. Let's review them.

Eager execution

TensorFlow 1.x defines static computational graphs. This type of declarative programming can be confusing for many people, since Python is typically more dynamic. Following the Python spirit, PyTorch, another popular deep learning package, defines things in a more imperative and dynamic way: you still have a graph, but you can define, change, and execute nodes on the fly, with no special session interfaces or placeholders. This is what is called eager execution: model definitions are dynamic, and execution is immediate. Graphs and sessions should be considered implementation details.
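As a quick, minimal sketch, the same x * W + b computation could be written in TensorFlow 2.x eager style as follows; there are no placeholders and no session, and the result is available as soon as the function returns:

import tensorflow as tf

# Variables are created directly; no variable scopes and no initializer to run
W = tf.Variable(tf.ones(shape=(2, 2)), name="W")
b = tf.Variable(tf.zeros(shape=(2,)), name="b")

def model(x):
    return x * W + b  # the same element-wise linear perceptron as above

out_a = model(tf.constant([1.0, 0.0]))  # executed eagerly
print(out_a)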