Unverified commit 950a85d9 authored by Glenn Jocher, committed by GitHub

TensorRT PyTorch Hub inference fix (#7560)

Solution proposed in https://github.com/ultralytics/yolov5/issues/7128 for the TensorRT PyTorch Hub CUDA illegal memory errors.
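The failing path can be illustrated with a short PyTorch Hub sketch (a minimal example, not taken verbatim from the issue; it assumes a `yolov5s.engine` file has already been exported, e.g. with `python export.py --weights yolov5s.pt --include engine --device 0`):

    # Loading a .engine file through PyTorch Hub wraps it in DetectMultiBackend + AutoShape,
    # so AutoShape.forward() runs with self.pt == False.
    import torch

    model = torch.hub.load('ultralytics/yolov5', 'custom', path='yolov5s.engine')

    # Before this change, the device/dtype reference tensor fell back to torch.zeros(1)
    # on the CPU, and the later .to()/.type_as() calls could trigger CUDA illegal
    # memory access errors with the TensorRT backend (see issue #7128).
    results = model('https://ultralytics.com/images/zidane.jpg')
    results.print()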
Parent c16671fc
@@ -531,7 +531,7 @@ class AutoShape(nn.Module):
         # multiple: = [Image.open('image1.jpg'), Image.open('image2.jpg'), ...]  # list of images
         t = [time_sync()]
-        p = next(self.model.parameters()) if self.pt else torch.zeros(1)  # for device and type
+        p = next(self.model.parameters()) if self.pt else torch.zeros(1, device=self.model.device)  # for device, type
         autocast = self.amp and (p.device.type != 'cpu')  # Automatic Mixed Precision (AMP) inference
         if isinstance(imgs, torch.Tensor):  # torch
             with amp.autocast(autocast):
...
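For clarity, a standalone sketch of the corrected device/dtype selection. The names `backend` and `is_pt` are illustrative, not the actual AutoShape attributes; the patched line itself relies on the backend wrapper exposing a `.device` attribute, as `self.model.device` does here.

    import torch
    import torch.nn as nn

    def reference_tensor(backend: nn.Module, is_pt: bool) -> torch.Tensor:
        if is_pt:
            # Native PyTorch model: borrow device and dtype from its first parameter.
            return next(backend.parameters())
        # Non-PyTorch backends (TensorRT engines, etc.) expose no parameters; previously
        # this fell back to torch.zeros(1) on the CPU, so later .to(p.device) calls moved
        # inputs off the GPU the engine was bound to. Using the backend's own device
        # keeps inputs on the correct CUDA device.
        return torch.zeros(1, device=backend.device)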