
pytorch - How to disable the progress bar in PyTorch Lightning

Reposted. Author: 行者123. Updated: 2023-12-02 00:38:40

I am having a number of problems with the tqdm progress bar in PyTorch Lightning:

  • When I run training in a terminal, the progress bar overwrites itself. At the end of a training epoch, the validation progress bar is printed below the training bar, but when training resumes, the bar for the next epoch is printed over the bar of the previous epoch, so the losses from earlier epochs can no longer be seen.
INFO:root:  Name    Type Params
0 l1 Linear 7 K
Epoch 2: 56%|████████████▊ | 2093/3750 [00:05<00:03, 525.47batch/s, batch_nb=1874, loss=0.714, training_loss=0.4, v_nb=51]
  • The progress bar wobbles from left to right because the number of digits after the decimal point of some losses keeps changing.
  • When running in PyCharm, the validation progress bar is not printed. Instead, it prints:
INFO:root:  Name    Type Params
0 l1 Linear 7 K
Epoch 1: 50%|█████ | 1875/3750 [00:05<00:05, 322.34batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 50%|█████ | 1879/3750 [00:05<00:05, 319.41batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 52%|█████▏ | 1942/3750 [00:05<00:04, 374.05batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 53%|█████▎ | 2005/3750 [00:05<00:04, 425.01batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 55%|█████▌ | 2068/3750 [00:05<00:03, 470.56batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 57%|█████▋ | 2131/3750 [00:05<00:03, 507.69batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 59%|█████▊ | 2194/3750 [00:06<00:02, 538.19batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 60%|██████ | 2257/3750 [00:06<00:02, 561.20batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 62%|██████▏ | 2320/3750 [00:06<00:02, 579.22batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 64%|██████▎ | 2383/3750 [00:06<00:02, 591.58batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 65%|██████▌ | 2445/3750 [00:06<00:02, 599.77batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 67%|██████▋ | 2507/3750 [00:06<00:02, 605.00batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 69%|██████▊ | 2569/3750 [00:06<00:01, 607.04batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]
Epoch 1: 70%|███████ | 2633/3750 [00:06<00:01, 613.98batch/s, batch_nb=1874, loss=1.534, training_loss=1.72, v_nb=49]

I would like to know whether these problems can be fixed, or alternatively how I can disable the progress bar and instead just print some log details to the screen.

Best Answer

FYI: show_progress_bar=False has been deprecated since version 0.7.2, but you can use progress_bar_refresh_rate=0 instead.


Update:

progress_bar_refresh_rate was deprecated in v1.5 and will be removed in v1.7. To disable the progress bar, set enable_progress_bar to False. From the Trainer docstring:

progress_bar_refresh_rate: How often to refresh progress bar (in steps). Value ``0`` disables progress bar.
Ignored when a custom progress bar is passed to :paramref:`~Trainer.callbacks`. Default: None, means
a suitable value will be chosen based on the environment (terminal, Google COLAB, etc.).

.. deprecated:: v1.5
``progress_bar_refresh_rate`` has been deprecated in v1.5 and will be removed in v1.7.
Please pass :class:`~pytorch_lightning.callbacks.progress.TQDMProgressBar` with ``refresh_rate``
directly to the Trainer's ``callbacks`` argument instead. To disable the progress bar,
pass ``enable_progress_bar = False`` to the Trainer.

enable_progress_bar: Whether to enable the progress bar by default.

Regarding "pytorch - How to disable the progress bar in PyTorch Lightning", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/59455268/
