Advanced Convolutional Neural Networks

    # Pad sequences with max_len.
    X_train = preprocessing.sequence.pad_sequences(X_train, maxlen=max_len)
    X_test = preprocessing.sequence.pad_sequences(X_test, maxlen=max_len)
    return (X_train, y_train), (X_test, y_test)
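As a quick toy illustration of what pad_sequences does (not part of the text's data pipeline): sequences shorter than maxlen are left-padded with zeros, while longer ones are truncated from the front by default:

from tensorflow.keras import preprocessing

# Toy sequences; maxlen=4 is chosen only for illustration.
demo = preprocessing.sequence.pad_sequences([[1, 2, 3], [4, 5, 6, 7, 8]], maxlen=4)
print(demo)
# [[0 1 2 3]
#  [5 6 7 8]]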

Then we build a suitable CNN model. We use Embeddings (see Chapter 9, Autoencoders) to map the sparse vocabulary typically observed in documents into a dense feature space of dimension dim_embedding. Then we use a Conv1D layer, followed by a GlobalMaxPooling1D layer that keeps the maximum value over the sequence dimension for each feature map, and two Dense layers – the last one has a single neuron with a sigmoid activation producing the binary choice (positive or negative review):

def build_model():
    model = models.Sequential()
    # Input: Embedding layer.
    # The model takes as input an integer matrix of size (batch, input_length)
    # and outputs a tensor of size (batch, input_length, dim_embedding).
    # The largest integer in the input should be no larger than n_words
    # (the vocabulary size).
    model.add(layers.Embedding(n_words, dim_embedding, input_length=max_len))
    model.add(layers.Dropout(0.3))
    model.add(layers.Conv1D(256, 3, padding='valid', activation='relu'))
    # Take the maximum value over the sequence dimension for each of the
    # 256 feature maps produced by the convolution.
    model.add(layers.GlobalMaxPooling1D())
    model.add(layers.Dense(128, activation='relu'))
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(1, activation='sigmoid'))
    return model
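To make the pooling comment concrete, here is a tiny standalone example (the input values and shape are illustrative only) showing that GlobalMaxPooling1D reduces a (batch, steps, features) tensor to (batch, features) by taking the maximum over the steps axis:

import numpy as np
from tensorflow.keras import layers

# One sample, 3 time steps, 2 features (values chosen only for illustration).
x = np.array([[[1.0, 4.0],
               [3.0, 2.0],
               [0.0, 5.0]]], dtype=np.float32)
print(layers.GlobalMaxPooling1D()(x))
# tf.Tensor([[3. 5.]], shape=(1, 2), dtype=float32)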

(X_train, y_train), (X_test, y_test) = load_data()
model = build_model()
model.summary()
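A minimal sketch of how the model could then be compiled and trained follows; the optimizer, loss, batch size, and number of epochs are illustrative assumptions rather than values taken from the text:

# Hypothetical training settings, chosen for illustration only.
BATCH_SIZE = 500
EPOCHS = 20

model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(X_train, y_train,
          batch_size=BATCH_SIZE,
          epochs=EPOCHS,
          validation_data=(X_test, y_test))
score = model.evaluate(X_test, y_test, batch_size=BATCH_SIZE)
print('Test loss:', score[0])
print('Test accuracy:', score[1])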

