This repository has been archived by the owner on Apr 14, 2021. It is now read-only.

global_step always equals to zero #24

Open
MYWmiss opened this issue Jun 13, 2019 · 2 comments

Comments


MYWmiss commented Jun 13, 2019

No description provided.


MYWmiss commented Jun 13, 2019

In TensorBoard, the loss curve never advances along the X axis: every point is logged at step 0, so the loss just jumps up and down on the Y axis.

Contributor

yylun commented Jun 24, 2019

@MYWmiss yes, we are missing the global_step argument when defining train_op.

To make global_step increase, taking BiDAF as an example, change https://github.com/sogou/SMRCToolkit/blob/master/sogou_mrc/model/bidaf.py#L214-L216 to

def compile(self, optimizer, initial_lr):
    self.optimizer = optimizer(initial_lr)
    # Create (or fetch) the step counter that TensorBoard uses for the X axis.
    global_step = tf.train.get_or_create_global_step()
    # Passing global_step makes minimize() increment it on every training step.
    self.train_op = self.optimizer.minimize(self.loss, global_step=global_step)

Thanks for pointing it out, we'll fix it soon 😃
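To see why the missing argument leaves every summary at step 0, here is a minimal plain-Python sketch (not TensorFlow itself, just an illustration of the contract): `tf.train.Optimizer.minimize` only increments the step counter when it is passed one explicitly, so omitting `global_step` leaves it at zero forever.

```python
class GlobalStep:
    """Stands in for the variable returned by tf.train.get_or_create_global_step()."""
    def __init__(self):
        self.value = 0

class Optimizer:
    """Mimics the relevant behavior of tf.train.Optimizer.minimize():
    the step counter advances only when it is passed in explicitly."""
    def minimize(self, loss, global_step=None):
        # ... gradients would be applied here ...
        if global_step is not None:
            global_step.value += 1  # the increment the buggy train_op never triggers

step = GlobalStep()
opt = Optimizer()

# Buggy pattern (bidaf.py before the fix): global_step never advances,
# so every TensorBoard point lands at x = 0.
for _ in range(5):
    opt.minimize(loss=0.0)
print(step.value)  # 0

# Fixed pattern: pass global_step so each training step increments it.
for _ in range(5):
    opt.minimize(loss=0.0, global_step=step)
print(step.value)  # 5
```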
