Nice tutorial Illia Polosukhin. Thank you. Do you have any suggestions for exponential decay?
Nathan Bluvol

Thanks!

Good point; I've updated the post with the correct usage:

def get_optimizer():
    def exp_decay(global_step):
        return tf.train.exponential_decay(
            learning_rate=0.00001, global_step=global_step,
            decay_steps=100, decay_rate=0.97, staircase=True)
    # use the customized decay function as the learning_rate
    return tf.train.AdagradOptimizer(
        learning_rate=exp_decay(tf.contrib.framework.get_global_step()))

or just:

def get_optimizer():
    return tf.train.AdagradOptimizer(
        learning_rate=tf.train.exponential_decay(
            learning_rate=0.00001,
            global_step=tf.contrib.framework.get_global_step(),
            decay_steps=100, decay_rate=0.97, staircase=True))
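To make the schedule concrete, here is a plain-Python sketch of the curve `tf.train.exponential_decay` computes (the function name `exp_decay_value` and the default arguments mirroring the snippet above are mine, not part of the tutorial). With `staircase=True` the exponent is floored, so the learning rate holds constant within each 100-step window and drops by a factor of 0.97 at each window boundary:

```python
def exp_decay_value(step, base_lr=0.00001, decay_steps=100,
                    decay_rate=0.97, staircase=True):
    # staircase=True floors the exponent, so the rate decays in
    # discrete jumps every `decay_steps` steps instead of smoothly.
    exponent = step // decay_steps if staircase else step / decay_steps
    return base_lr * decay_rate ** exponent

# Rate is flat from step 0 through step 99, then drops at step 100.
for step in (0, 99, 100, 1000):
    print(step, exp_decay_value(step))
```

Running this shows the rate unchanged at steps 0 and 99, multiplied by 0.97 once at step 100, and by 0.97 ten times at step 1000.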

Hope this helps!
