@jaeoh2
Created December 8, 2016 00:26
keras RNN model (TimeDistributed wrapper)
# build model: many-to-many (one sigmoid output per timestep via TimeDistributed)
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dropout, Dense, TimeDistributed

top_words = 5000           # vocabulary size (consistent with the 160000 embedding params below)
max_review_length = 500    # padded sequence length
embedding_vector_length = 32

model = Sequential()
model.add(Embedding(input_dim=top_words,
                    output_dim=embedding_vector_length,
                    input_length=max_review_length))
model.add(LSTM(100, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=True))
model.add(Dropout(0.2))
model.add(TimeDistributed(Dense(1, activation='sigmoid')))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
print(model.summary())
'''
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
embedding_15 (Embedding)         (None, 500, 32)       160000      embedding_input_15[0][0]
____________________________________________________________________________________________________
lstm_28 (LSTM)                   (None, 500, 100)      53200       embedding_15[0][0]
____________________________________________________________________________________________________
dropout_28 (Dropout)             (None, 500, 100)      0           lstm_28[0][0]
____________________________________________________________________________________________________
lstm_29 (LSTM)                   (None, 500, 100)      80400       dropout_28[0][0]
____________________________________________________________________________________________________
dropout_29 (Dropout)             (None, 500, 100)      0           lstm_29[0][0]
____________________________________________________________________________________________________
timedistributed_7 (TimeDistribut (None, 500, 1)        101         dropout_29[0][0]
====================================================================================================
Total params: 293701
____________________________________________________________________________________________________
None
The RNN output is not flattened: the timedistributed_7 layer keeps the shape (None, 500, 1).
Why keep each timestep's values separate? Because:
- you want values to interact only within their own timestep;
- you don't want arbitrary interactions between different timesteps and channels.
*R. Luthfianto. Prediksi Struktur Sekunder Protein menggunakan Konvolusi dan Bidirectional Gated Recurrent Unit (Protein Secondary Structure Prediction using Convolution and Bidirectional Gated Recurrent Unit). Undergraduate thesis, Universitas Gadjah Mada, Yogyakarta, 2016.
'''
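What TimeDistributed(Dense(1)) does can be sketched without Keras: the same dense weights are applied independently at every timestep, so the (batch, timesteps, features) input becomes (batch, timesteps, 1) with no mixing across timesteps. A minimal NumPy sketch, where the weights W and b are made-up illustrative values:

```python
import numpy as np

# Toy input: batch of 2 sequences, 5 timesteps, 3 features per timestep.
x = np.arange(2 * 5 * 3, dtype=float).reshape(2, 5, 3)

# One shared Dense(1) weight set, reused at every timestep (made-up values).
W = np.array([[0.1], [0.2], [0.3]])  # (features, units)
b = np.array([0.5])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# TimeDistributed(Dense(1, activation='sigmoid')): matmul broadcasts over
# the timestep axis, applying the same W and b to each timestep slice.
out = sigmoid(x @ W + b)

print(out.shape)  # (2, 5, 1): one prediction per timestep, nothing flattened
```

Each output element depends only on its own timestep's features, which is exactly why the per-timestep structure in the summary above is preserved rather than flattened.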