Advanced Deep Learning with Keras

Introducing Advanced Deep Learning with Keras

Figure 1.5.4: Diagram of LSTM. The parameters are not shown for clarity

There are many other ways to configure RNNs. One is to make the RNN bidirectional. By default, RNNs are unidirectional: the current output is influenced only by the past states and the current input. In bidirectional RNNs, information is also allowed to flow backward, so future states can influence the present and past states, and past outputs are updated as new information arrives. RNNs can be made bidirectional by calling a wrapper function. For example, the bidirectional implementation of LSTM is Bidirectional(LSTM()).
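As a minimal sketch of this wrapper in Keras (the layer sizes and input shape here are illustrative, not taken from the book):

```python
from tensorflow.keras.layers import LSTM, Bidirectional, Input
from tensorflow.keras.models import Model

# Illustrative input: sequences of 10 timesteps with 8 features each
inputs = Input(shape=(10, 8))
# Bidirectional runs one LSTM forward and one backward over the sequence
# and, by default, concatenates their outputs: 4 + 4 = 8 units total
outputs = Bidirectional(LSTM(4))(inputs)
model = Model(inputs, outputs)
```

Note that with the default concatenation merge mode, the wrapped layer's output dimension doubles relative to a single LSTM with the same number of units.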

For all types of RNNs, increasing the number of units increases the capacity. Another way to increase the capacity is to stack RNN layers. Note, though, that as a general rule of thumb, the capacity of the model should be increased only if needed. Excess capacity may contribute to overfitting, as well as to longer training time and slower performance during prediction.
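Stacking can be sketched as follows (again with illustrative shapes and unit counts; the key detail is that the lower layer must emit its full output sequence so the upper layer receives a 3D tensor):

```python
from tensorflow.keras.layers import LSTM, Input
from tensorflow.keras.models import Model

# Illustrative input: sequences of 10 timesteps with 8 features each
inputs = Input(shape=(10, 8))
# return_sequences=True makes the first LSTM output one vector per
# timestep, which the second LSTM then processes as its input sequence
x = LSTM(16, return_sequences=True)(inputs)
outputs = LSTM(16)(x)  # top layer returns only the final state
model = Model(inputs, outputs)
```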

