@lucasastorian
Last active March 1, 2020 17:05
Compiling and Training the Autoencoder
# Assumes the standalone Keras API used elsewhere in this notebook
from keras import optimizers
from keras.callbacks import EarlyStopping, LearningRateScheduler, ModelCheckpoint

def lr_scheduler1(epoch, lr):
    # Keras passes in the previous epoch's lr, so multiplying by
    # 0.995 ** epoch again compounds the decay over training
    return lr * (0.995 ** epoch)

# Custom generator that applies swap noise to each training batch
training_generator = model.DAEGenerator(features=X_train_all_standard,
                                        batch_size=256,
                                        swap_noise_rate=0.15)

autoencoder.compile(loss='mean_squared_error',
                    optimizer=optimizers.Adam(lr=0.0005))

# Stop when val_loss plateaus, decay the lr, and checkpoint the best weights
es = EarlyStopping(monitor='val_loss', mode='min', verbose=1,
                   patience=20, min_delta=0.001)
lrs = LearningRateScheduler(lr_scheduler1, verbose=1)
cp = ModelCheckpoint(filepath='autoencoder.h5',
                     monitor='val_loss',
                     save_best_only=True,
                     verbose=0)

history = autoencoder.fit_generator(generator=training_generator,
                                    validation_data=(X_val_standard, X_val_standard),
                                    use_multiprocessing=False, max_queue_size=500,
                                    callbacks=[es, lrs, cp], epochs=500)
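`DAEGenerator` is a custom class defined elsewhere in this project, so its internals are not shown here. As a rough sketch of what `swap_noise_rate=0.15` likely does, the standard "swap noise" used for denoising autoencoders on tabular data replaces each cell, with some probability, by the value from the same column of a randomly chosen other row. A minimal NumPy version (the function name `apply_swap_noise` is hypothetical, not from the original code):

```python
import numpy as np

def apply_swap_noise(X, swap_rate=0.15, rng=None):
    """Return a noised copy of X where each cell is, with probability
    swap_rate, replaced by the value in the same column of a random row."""
    rng = np.random.default_rng(rng)
    n_rows, _ = X.shape
    X_noisy = X.copy()
    # Boolean mask of cells to corrupt
    mask = rng.random(X.shape) < swap_rate
    # For every masked cell, pick a random source row in the same column
    source_rows = rng.integers(0, n_rows, size=X.shape)
    X_noisy[mask] = X[source_rows[mask], np.where(mask)[1]]
    return X_noisy
```

Because every corrupted value is drawn from the same column's empirical distribution, the noised rows stay realistic, which is what makes swap noise work better than Gaussian noise on standardized tabular features.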