Advanced Deep Learning with Keras

Chapter 2

Second, although both branches have the same kernel size of 3, the right branch
uses a dilation rate of 2. Figure 2.1.2 shows the effect of different dilation rates on a
kernel of size 3. The idea is that by increasing the effective coverage of the kernel
through the dilation rate, the right branch will learn feature maps different from
those of the left branch.
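The effect of the dilation rate on kernel coverage can be computed directly. The sketch below (an illustrative helper, not code from the book) applies the standard formula for the effective span of a dilated kernel:

```python
# Effective coverage of a dilated convolution kernel: a kernel of size k
# with dilation rate d spans k + (k - 1) * (d - 1) input positions.
def effective_kernel_size(k, d):
    return k + (k - 1) * (d - 1)

print(effective_kernel_size(3, 1))  # 3 -- the left branch (standard convolution)
print(effective_kernel_size(3, 2))  # 5 -- the right branch in Figure 2.1.2
```

So with the same number of weights, the right branch covers a 5-wide neighborhood instead of 3.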

We'll use the option padding='same' to ensure that we will not have negative tensor
dimensions when the dilated CNN is used. By using padding='same', we'll keep the
output feature maps the same size as the input. This is accomplished by padding
the input with zeros so that the output has the same size:

Figure 2.1.2: By increasing the dilation rate from 1, the effective kernel coverage also increases
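The size-preserving behavior of padding='same' follows a simple rule, sketched below (an illustrative helper, not Keras source code): the output spatial size depends only on the stride, never on the kernel size or dilation rate.

```python
import math

# Output spatial size of a 'same'-padded convolution:
# ceil(input_size / stride), independent of kernel size and dilation.
def same_output_size(input_size, stride=1):
    return math.ceil(input_size / stride)

# A 28x28 MNIST image stays 28x28 through a stride-1 Conv2D with
# padding='same', whatever the dilation rate of the branch.
print(same_output_size(28))     # 28
print(same_output_size(28, 2))  # 14 -- only the stride shrinks the map
```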

The following listing shows the implementation of the Y-Network. The two branches
are created by two for loops. Both branches expect the same input shape. The
two for loops will create two 3-layer stacks of Conv2D-Dropout-MaxPooling2D.
While we used the concatenate layer to combine the outputs of the left and right
branches, we could also utilize the other merge functions of Keras, such as add,
dot, and multiply. The choice of merge function is not purely arbitrary but must
be based on a sound model design decision.
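The shape consequences of that choice can be seen with a quick sketch. Below, NumPy arrays stand in for the Keras merge layers (the branch shapes are illustrative assumptions, not values from the book): add and multiply require identical shapes and keep the channel count, while concatenate stacks feature maps along the channel axis.

```python
import numpy as np

# Stand-ins for the outputs of the left and right branches
# (assumed shapes for illustration: batch, height, width, channels).
left = np.zeros((1, 7, 7, 64))
right = np.zeros((1, 7, 7, 64))

merged_add = left + right                            # like the Keras add layer
merged_cat = np.concatenate([left, right], axis=-1)  # like the concatenate layer

print(merged_add.shape)  # (1, 7, 7, 64)  -- channel count unchanged
print(merged_cat.shape)  # (1, 7, 7, 128) -- channels from both branches kept
```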

In the Y-Network, concatenate will not discard any portion of the feature maps.

Instead, we'll let the Dense layer figure out what to do with the concatenated

feature maps. Listing 2.1.2, cnn-y-network-2.1.2.py, shows the Y-Network
implementation using the Functional API:

import numpy as np
from keras.layers import Dense, Dropout, Input
from keras.layers import Conv2D, MaxPooling2D, Flatten
from keras.models import Model
from keras.layers.merge import concatenate
from keras.datasets import mnist
from keras.utils import to_categorical
