
torch - Why is the global_step (training step) not in sync with the wandb plot steps? - Stack Overflow


I'm using a PyTorch Lightning `LightningModule` with a `Trainer`. I create the trainer with:

 trainer = pl.Trainer(max_epochs=3)

Each training epoch has 511 steps (3 × 511 = 1533 in total) and each validation epoch has 127 steps.

I read `self.global_step` in the `training_step` function and log it with wandb:

 wandb.log({"train_step": self.global_step})
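The drift can be reproduced without wandb at all. The sketch below simulates wandb's default behavior (assumption: by default, `wandb.log()` advances the run's internal `_step` by one on *every* call, including calls made during validation), using the step counts from the question; `FakeRun` is a hypothetical stand-in, not a wandb class.

```python
class FakeRun:
    """Stand-in for a wandb run: records (wandb_step, payload) pairs,
    bumping an internal step counter on every log() call."""
    def __init__(self):
        self.history = []
        self._step = 0

    def log(self, payload):
        self.history.append((self._step, dict(payload)))
        self._step += 1


run = FakeRun()
global_step = 0  # mimics LightningModule.self.global_step

for _ in range(511):                      # training_step: log train_step
    run.log({"train_step": global_step})
    global_step += 1
for _ in range(127):                      # validation_step: logs something too
    run.log({"val_loss": 0.0})

# After one epoch, wandb's x-axis has advanced 511 + 127 = 638 times,
# while global_step only reached 511 -- so the plotted train_step curve
# falls behind the wandb step axis by exactly the validation-step count.
print(run.history[-1][0])  # 637
print(global_step)         # 511
```

This is why the wandb x-axis appears to "contain" the validation steps: the axis is wandb's own call counter, not Lightning's `global_step`.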

[wandb plot screenshots, including a zoomed-in view of the train_step curve, omitted]

1. As you can see, the step axis for `train_step` in wandb seems to count both the training steps and the validation steps. Why?

2. How can I view just the training values (plotted against the training steps) in wandb?
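One way to get the training values plotted against the training steps is wandb's `define_metric` API, which tells wandb to use your own counter as the x-axis for matching metrics instead of its internal per-call step. This is a configuration sketch, not the asker's code: the project name and the `train/*` metric namespace are illustrative assumptions.

```python
import wandb

# Hypothetical project name; in a LightningModule you would typically
# let WandbLogger call wandb.init for you.
run = wandb.init(project="my-project")

# Declare a custom step metric, then bind all train/* metrics to it.
wandb.define_metric("trainer/global_step")
wandb.define_metric("train/*", step_metric="trainer/global_step")

# Inside training_step, log the metric together with the custom step:
# wandb.log({"train/loss": loss.item(),
#            "trainer/global_step": self.global_step})
```

With this in place, charts for `train/*` metrics use `trainer/global_step` as their x-axis, so validation-time `wandb.log` calls no longer stretch the training curves. Alternatively, logging through Lightning's `self.log(...)` with a `WandbLogger` lets Lightning manage step alignment for you.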
