Unverified commit a88a8146, authored by Nick Martin, committed by GitHub

Copy wandb param dict before training to avoid overwrites (#7317)

* Copy wandb param dict before training to avoid overwrites. Copy the hyperparameter dict retrieved from the wandb configuration before passing it to `train()`. Training overwrites parameters in the dictionary (e.g. scaling the obj/box/cls gains), which causes the values reported in wandb to not match the input values. This is confusing, as it makes runs hard to reproduce, and it also throws off wandb's Bayesian sweep algorithm.

* Cleanup

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Parent 245d6459
@@ -16,8 +16,8 @@ from utils.torch_utils import select_device
 def sweep():
     wandb.init()
-    # Get hyp dict from sweep agent
-    hyp_dict = vars(wandb.config).get("_items")
+    # Get hyp dict from sweep agent. Copy because train() modifies parameters, which confuses wandb.
+    hyp_dict = vars(wandb.config).get("_items").copy()
     # Workaround: get necessary opt args
     opt = parse_opt(known=True)
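For context, here is a minimal sketch (not part of the commit) of the aliasing behavior the `.copy()` guards against: Python passes dicts by reference, so in-place scaling inside `train()` would otherwise mutate the sweep agent's own dict. The `scale_gains` helper and its factors below are hypothetical stand-ins for YOLOv5's in-place gain scaling.

def scale_gains(hyp):
    # Hypothetical stand-in for train(): YOLOv5 rescales loss gains
    # in place (the real factors depend on the model configuration).
    hyp["box"] *= 0.6
    hyp["obj"] *= 1.4

hyp_dict = {"box": 0.05, "obj": 1.0}  # values proposed by the sweep agent

scale_gains(hyp_dict)          # without .copy(): the agent's dict is mutated
print(hyp_dict["box"])         # 0.03 -- no longer the value the sweep proposed

hyp_dict = {"box": 0.05, "obj": 1.0}
scale_gains(hyp_dict.copy())   # with .copy(): only the copy is mutated
print(hyp_dict["box"])         # 0.05 -- the sweep's value survives for logging

A shallow copy suffices here because the hyperparameter values are scalars, not nested containers.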