Advanced Deep Learning with Keras
Get in touch

Feedback from our readers is always welcome.

General feedback: Email feedback@packtpub.com, and mention the book's title in the subject of your message. If you have questions about any aspect of this book, please email us at questions@packtpub.com.

Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report this to us. Please visit http://www.packtpub.com/submit-errata, select your book, click on the Errata Submission Form link, and enter the details.

Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at copyright@packtpub.com with a link to the material.

If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit http://authors.packtpub.com.

Reviews

Please leave a review. Once you have read and used this book, why not leave a review on the site that you purchased it from? Potential readers can then see and use your unbiased opinion to make purchase decisions, we at Packt can understand what you think about our products, and our authors can see your feedback on their book. Thank you!

For more information about Packt, please visit packtpub.com.
from keras.layers import Activation, Conv2D, LeakyReLU
# InstanceNormalization is provided by the keras-contrib package
from keras_contrib.layers import InstanceNormalization


def encoder_layer(inputs,
                  filters=16,
                  kernel_size=3,
                  strides=2,
                  activation='relu',
                  instance_norm=True):
    """Builds a generic encoder layer made of Conv2D-IN-LeakyReLU
    IN is optional, LeakyReLU may be replaced by ReLU
    """
    conv = Conv2D(filters=filters,
                  kernel_size=kernel_size,
                  strides=strides,
                  padding='same')

    x = inputs
    if instance_norm:
        x = InstanceNormalization()(x)
    if activation == 'relu':
        x = Activation('relu')(x)
    else:
        x = LeakyReLU(alpha=0.2)(x)
    x = conv(x)
    return x
Whenever possible, a docstring is included. At the very least, a text comment is used in order to minimize space usage.
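To show how a layer builder of this kind can be chained, here is a minimal, self-contained sketch that stacks two encoder layers into a downsampling model using tf.keras. It mirrors the structure of the listing above, except that the standard LayerNormalization is used as a stand-in for keras-contrib's InstanceNormalization, since keras-contrib may not be installed:

```python
from tensorflow.keras.layers import (Activation, Conv2D, Input,
                                     LayerNormalization, LeakyReLU)
from tensorflow.keras.models import Model


def encoder_layer(inputs, filters=16, kernel_size=3, strides=2,
                  activation='relu', instance_norm=True):
    # Same structure as the listing above; LayerNormalization stands in
    # for keras-contrib's InstanceNormalization
    conv = Conv2D(filters=filters, kernel_size=kernel_size,
                  strides=strides, padding='same')
    x = inputs
    if instance_norm:
        x = LayerNormalization()(x)
    if activation == 'relu':
        x = Activation('relu')(x)
    else:
        x = LeakyReLU(alpha=0.2)(x)
    x = conv(x)
    return x


# each stride-2 layer halves the spatial dimensions
inputs = Input(shape=(32, 32, 3))
x = encoder_layer(inputs, filters=32)                 # 32x32x3 -> 16x16x32
x = encoder_layer(x, filters=64, activation='leaky')  # 16x16x32 -> 8x8x64
model = Model(inputs, x)
print(model.output_shape)
```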
Any command-line code execution is written as follows:
$ python3 dcgan-mnist-4.2.1.py
The example code file naming convention is algorithm-dataset-chapter.section.number.py. The command-line example above runs DCGAN on the MNIST dataset, from Chapter 4, second section, first listing. In some cases, the explicit command line to execute is not written, but it is assumed to be:
$ python3 name-of-the-file-in-listing
The file name of the code example is included in the Listing caption.
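The naming convention can be made concrete with a small parser (not from the book, just an illustration) that splits a listing file name into its fields using only the standard library:

```python
def parse_listing_name(filename):
    """Split e.g. 'dcgan-mnist-4.2.1.py' into the fields of the
    algorithm-dataset-chapter.section.number.py convention."""
    stem = filename[:-3] if filename.endswith('.py') else filename
    algorithm, dataset, listing = stem.split('-')
    chapter, section, number = listing.split('.')
    return {'algorithm': algorithm,
            'dataset': dataset,
            'chapter': int(chapter),
            'section': int(section),
            'number': int(number)}


# the DCGAN example above: Chapter 4, second section, first listing
print(parse_listing_name('dcgan-mnist-4.2.1.py'))
```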