



Keras or tf.keras?

Another legitimate question is whether you should use Keras with TensorFlow as a backend or, instead, use the APIs in tf.keras, available directly in TensorFlow. Note that there is not a 1:1 correspondence between Keras and tf.keras: many endpoints in tf.keras are not implemented in Keras, and tf.keras does not support multiple backends as Keras does. So, Keras or tf.keras? My suggestion is the second option rather than the first one. tf.keras has multiple advantages over Keras, consisting of the TensorFlow enhancements discussed in this chapter: eager execution; native support for distributed training, including training on TPUs; and support for the TensorFlow SavedModel exchange format.
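As a quick illustration of the first of these advantages, here is a minimal sketch (not taken from the book's examples): in TensorFlow 2.0, eager execution is enabled by default, so operations run immediately and return concrete values instead of building a graph to execute later.

import tensorflow as tf

# Eager execution is on by default in TensorFlow 2.0: operations
# execute immediately and return concrete values, with no need to
# build a graph and run it inside a session first.
x = tf.constant([[1.0, 2.0]])
w = tf.Variable([[3.0], [4.0]])
y = tf.matmul(x, w)   # runs right away
print(y.numpy())      # [[11.]]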

However, the first option is still the most relevant one if you plan to write highly portable code that can run on multiple backends, including Google TensorFlow, Microsoft CNTK, Amazon MXNet, and Theano. Note that Keras is an independent open source project, and its development is not dependent on TensorFlow; therefore, Keras will continue to be developed for the foreseeable future. Note that Keras 2.3.0 (released on September 17, 2019) is the first release of multi-backend Keras that supports TensorFlow 2.0; it maintains compatibility with TensorFlow 1.14 and 1.13, as well as Theano and CNTK.
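To make the difference concrete, here is a minimal sketch (an illustration, not the book's code; it assumes the standalone keras package, version 2.3.x or earlier, and the chosen backend are installed): multi-backend Keras selects its backend through the KERAS_BACKEND environment variable or the ~/.keras/keras.json file, whereas tf.keras is simply the Keras API bundled with TensorFlow.

import os

# Multi-backend Keras: the backend must be selected before the first
# import (assumes the standalone "keras" package and the chosen
# backend are installed).
os.environ["KERAS_BACKEND"] = "theano"   # or "tensorflow", "cntk"
import keras                             # prints e.g. "Using Theano backend."

# tf.keras: the Keras API shipped inside TensorFlow; no backend to choose.
import tensorflow as tf
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])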

Let's conclude the chapter with a new comparison: the primary machine learning software tools used by the top-5 teams on Kaggle in each competition. This was a survey run by Francois Chollet on Twitter at the beginning of April 2019 (thanks, Francois, for agreeing to have it included in this book!):

Figure 5: Primary ML software tools used by top-5 teams on Kaggle in 2019

In this section, we have seen the main differences between Keras and tf.keras.

Summary

TensorFlow 2.0 is a rich development ecosystem composed of two main parts: Training and Serving. Training consists of a set of libraries for dealing with datasets (tf.data), a set of libraries for building models, including high-level libraries (tf.keras and Estimators), low-level libraries (tf.*), and a collection of pretrained models (tf.Hub), which will be discussed in Chapter 5, Advanced Convolutional Neural Networks. Training can happen on CPUs, GPUs, and TPUs via distribution strategies, and the result can be saved using the appropriate libraries. Serving can happen on multiple platforms, including on-prem, cloud, Android, iOS, Raspberry Pi, any browser supporting JavaScript, and Node.js. Many language bindings are supported, including Python, C, C#, Java, Swift, R, and others. The following diagram summarizes the architecture of TensorFlow 2.0 as discussed in this chapter:

Figure 6: Summary of TensorFlow 2.0 architecture
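As a closing sketch (illustrative only; the toy dataset, layer sizes, and export path are made up, not from the book), the Training pieces above can be composed end to end: a tf.data pipeline feeds a tf.keras model trained under a distribution strategy, and the result is exported in the SavedModel format for Serving.

import tensorflow as tf

# Training: a tf.data input pipeline feeding a tf.keras model trained
# under a distribution strategy (CPUs/GPUs here; a TPU strategy would
# target TPUs instead).
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="sgd", loss="mse")

# Toy dataset: 100 random examples with 4 features each.
features = tf.random.normal([100, 4])
labels = tf.random.normal([100, 1])
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(10)
model.fit(dataset, epochs=2)

# Serving: export the trained model in the SavedModel exchange format.
tf.saved_model.save(model, "/tmp/my_saved_model")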
