
Lightning TensorBoard HParams tab not showing custom metric


I'm training a neural network built with PyTorch Lightning, and I'm trying to get the HParams tab working in TensorBoard.

Following the (somewhat outdated) official guide, I initialize the logger with default_hp_metric=False:

logger = TensorBoardLogger("lightning_logs", name=modelName, default_hp_metric=False)
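For context, the logger is attached to the Trainer roughly like this (modelName comes from earlier in my script, and max_epochs is just a placeholder value):

    import pytorch_lightning as pl
    from pytorch_lightning.loggers import TensorBoardLogger

    # "my_model" stands in for the real modelName string defined elsewhere
    modelName = "my_model"
    logger = TensorBoardLogger("lightning_logs", name=modelName, default_hp_metric=False)
    trainer = pl.Trainer(logger=logger, max_epochs=10)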

I log the hyperparameters once, defining a custom metric:

logger.log_hyperparams(settings, metrics = {'val_maxroa': 0})
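Here settings is just a plain dict of my hyperparameters; the keys and values below are made up for illustration:

    # Hypothetical contents of `settings` -- the real dict holds my actual hyperparameters
    settings = {
        "learning_rate": 1e-3,
        "batch_size": 64,
        "hidden_size": 128,
    }
    logger.log_hyperparams(settings, metrics={"val_maxroa": 0})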

and I log val_maxroa at the end of each validation step in the LightningModule:

    def validation_step(self, batch, batch_idx):
        inputs, target = batch
        # ... compute maxRoa from the model outputs ...
        self.log("val_maxroa", maxRoa)
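For reference, a stripped-down sketch of the module showing where that logging call sits (everything except val_maxroa is a placeholder; my real model is larger):

    import torch
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(16, 1)  # placeholder network

        def validation_step(self, batch, batch_idx):
            inputs, target = batch
            preds = self.layer(inputs)
            maxRoa = preds.max()            # stand-in for my real metric computation
            self.log("val_maxroa", maxRoa)  # logged on every validation step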

In TensorBoard I can see the metric plotted in the "Time series" and "Scalars" tabs, but it is not visible in the HParams tab.

I also checked the official documentation, but it isn't very detailed on this point.

Do you have any hints?
