From 66508ff4b7d49264e37d3e8926fa6e39bcb1217c Mon Sep 17 00:00:00 2001
From: Nishant Dahal <50732732+NishantDahal@users.noreply.github.com>
Date: Mon, 30 Sep 2024 22:14:21 +0545
Subject: [PATCH] docs: add note for `TQDMProgressBar` (#20198)

* Add documentation note for TQDMProgressBar

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
---
 docs/source-pytorch/common/progress_bar.rst | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/source-pytorch/common/progress_bar.rst b/docs/source-pytorch/common/progress_bar.rst
index e0c29fccdc..106c2289e5 100644
--- a/docs/source-pytorch/common/progress_bar.rst
+++ b/docs/source-pytorch/common/progress_bar.rst
@@ -36,6 +36,10 @@ You can update ``refresh_rate`` (rate (number of batches) at which the progress

     trainer = Trainer(callbacks=[TQDMProgressBar(refresh_rate=10)])

+.. note::
+
+    The ``smoothing`` option has no effect when using the default implementation of :class:`~lightning.pytorch.callbacks.TQDMProgressBar`, as the progress bar is updated using the ``bar.refresh()`` method instead of ``bar.update()``. This can cause the progress bar to become desynchronized with the actual progress. To avoid this issue, you can use the ``bar.update()`` method instead, but this may require customizing the :class:`~lightning.pytorch.callbacks.TQDMProgressBar` class.
+
 By default the training progress bar is reset (overwritten) at each new epoch.
 If you wish for a new progress bar to be displayed at the end of every epoch, set
 :paramref:`TQDMProgressBar.leave` to ``True``.
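To make the distinction in the added note concrete, here is a self-contained sketch (no tqdm or Lightning required, so it is not the library's actual code). The hypothetical ``ToyBar`` class mimics the relevant split in tqdm's API: ``update()`` advances the counter *and* feeds a smoothed rate estimate (tqdm keeps an exponential moving average controlled by ``smoothing``), while setting ``.n`` and calling ``refresh()``, as the default callback effectively does, only redraws at an absolute position and never touches that estimate.

```python
# Toy stand-in for a tqdm bar; the EMA here is a simplified analogue of
# tqdm's ``smoothing``-controlled rate estimate, not its real formula.
class ToyBar:
    def __init__(self, total, smoothing=0.3):
        self.total = total
        self.n = 0                # current position, as in tqdm
        self.smoothing = smoothing
        self.ema_rate = None      # smoothed steps-per-call estimate

    def update(self, delta=1):
        # Advancing via update() feeds the smoothed estimate.
        self.n += delta
        if self.ema_rate is None:
            self.ema_rate = float(delta)
        else:
            s = self.smoothing
            self.ema_rate = s * delta + (1 - s) * self.ema_rate

    def refresh(self):
        # Redraw only: the position may have been set externally via
        # ``bar.n = value``, but the smoothed estimate is never updated.
        pass


# Drive one bar refresh()-style (set n, then redraw) and one update()-style.
refreshed, updated = ToyBar(10), ToyBar(10)
for step in range(1, 11):
    refreshed.n = step                 # what the default callback effectively does
    refreshed.refresh()
    updated.update(step - updated.n)   # advance by the delta instead

print(refreshed.ema_rate)  # None: smoothing never engaged
print(updated.ema_rate)
```

Both bars end at the same position, but only the ``update()``-driven one ever accumulates a smoothed rate, which is why ``smoothing`` is inert under the default refresh-based implementation.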