Dynamic batch size support for TensorRT (#8526)
* Dynamic batch size support for TensorRT
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Update export.py
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Fix optimization profile when batch size is 1
* Warn users if they use batch-size=1 with dynamic
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* More descriptive assertion error
* Fix syntax
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Fix pre-commit formatting
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Update export.py
Co-authored-by: Colin Wong <noreply@brains4drones.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
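
The commits above add a TensorRT optimization profile so the exported engine accepts a range of batch sizes, and warn when dynamic export is requested with batch-size 1 (the build-time batch size doubles as the profile's maximum). Below is a minimal, hypothetical sketch of that approach using the TensorRT Python API; `build_dynamic_engine` and its parameters are illustrative names for this sketch, not the actual code or options in export.py.

```python
# Hypothetical sketch of the optimization-profile approach described above.
import tensorrt as trt


def build_dynamic_engine(onnx_path: str, max_batch: int, workspace_gb: int = 4):
    """Build a serialized TensorRT engine whose batch dimension is dynamic (1..max_batch)."""
    if max_batch <= 1:
        # Mirrors the "warn users if they use batch-size=1 with dynamic" commit:
        # the build batch size is used as the profile maximum, so 1 defeats the purpose.
        print("WARNING: dynamic engine built with max_batch=1; pass the largest batch you plan to run")

    logger = trt.Logger(trt.Logger.INFO)
    builder = trt.Builder(logger)
    network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    if not parser.parse_from_file(onnx_path):
        raise RuntimeError(f"failed to parse ONNX file: {onnx_path}")

    config = builder.create_builder_config()
    config.max_workspace_size = workspace_gb << 30  # TensorRT 8.x API

    # One optimization profile: min / opt / max shapes for every network input.
    # The ONNX model must already have been exported with a dynamic batch axis.
    profile = builder.create_optimization_profile()
    for i in range(network.num_inputs):
        inp = network.get_input(i)
        c, h, w = tuple(inp.shape)[1:]  # batch dim is -1 (dynamic)
        profile.set_shape(inp.name,
                          (1, c, h, w),                       # min
                          (max(1, max_batch // 2), c, h, w),  # opt
                          (max_batch, c, h, w))               # max
    config.add_optimization_profile(profile)

    return builder.build_serialized_network(network, config)
```

At inference time the consumer still has to set the concrete input shape on the execution context (e.g. `set_binding_shape` in TensorRT 8) before each batch; that side is not shown here.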