
logging - Log validation and training loss at the same number of steps in PyTorch Lightning - Stack Overflow


I'm rather new to Lightning, and I'm trying to do a very simple thing: every 1000 steps (not every epoch), evaluate the training and validation loss and log both.

I don't seem to be able to do that with the Trainer class, since val_check_interval is based on epochs while log_every_n_steps is based on steps. Am I missing something? I would have assumed that log_every_n_steps does exactly that, but it doesn't seem to.
