• tf.data can be used to load and preprocess data in a very efficient way (see the short sketch after this list).
• tf.keras and Estimators are high-level libraries, and the power of TensorFlow 1.x is still accessible via the lower-level tf.* libraries. tf.keras supports eager computation while still retaining the performance of lower-level computational graphs via tf.function. tf.hub is a nice collection of pretrained models that can be used immediately.
• Distribution Strategies allow training to be run on CPUs, GPUs, and TPUs.
• SavedModel can be served on multiple platforms.
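These pieces come together in only a few lines of TensorFlow 2.x code. The following is a minimal sketch, not the book's own example: the random toy dataset, the layer sizes, and the toy_saved_model directory name are illustrative assumptions, but the same pattern (tf.data pipeline, tf.keras model under a Distribution Strategy, SavedModel export) applies to real data and models.

```python
import tensorflow as tf

# Hypothetical in-memory dataset standing in for real training data.
features = tf.random.normal((1000, 10))
labels = tf.random.uniform((1000,), maxval=2, dtype=tf.int32)

# tf.data: build an efficient input pipeline (shuffle, batch, prefetch).
dataset = (tf.data.Dataset.from_tensor_slices((features, labels))
           .shuffle(1000)
           .batch(32)
           .prefetch(tf.data.experimental.AUTOTUNE))

# Distribution Strategy: the same code can run on CPUs, GPUs, or TPUs.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    # tf.keras high-level API; fit() runs tf.function-compiled graphs
    # under the hood, so eager-style code keeps graph-level performance.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
        tf.keras.layers.Dense(2, activation='softmax')
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

model.fit(dataset, epochs=2)

# SavedModel: export in a format that can be served on multiple platforms.
model.save('toy_saved_model', save_format='tf')
```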
In this chapter we have discussed the main differences between TensorFlow 1.x and 2.x and reviewed the powerful new features available in 2.x. The key topics discussed in this chapter were the computational graph in TensorFlow 1.x and the advantages of TensorFlow 2.x, such as support for eager execution, distribution, and TPU training. The next chapter will introduce Regression, a quite powerful tool for mathematical modelling, classification, and prediction.