How to set symmetric zero points for TRT deployment #903
-
I read in the docs that "Quantization zero points are set to be asymmetric for activations and symmetric for weights". I trained a quantized yolov5l model using a recipe and found this to be true. I also found that there is a `QuantizationModifier` class with a `tensorrt` param that alters the zero-point settings. I want to deploy my model with TensorRT, which requires symmetric zero points. Is incorporating the `QuantizationModifier` class into the current training procedure the suggested way? How should I go about it?
Replies: 1 comment
-
Yes, currently the best supported pathway is to set `tensorrt: True` for any `QuantizationModifier` in your sparsification recipe.
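
For reference, a minimal sketch of what that could look like, assuming the legacy SparseML `!QuantizationModifier` recipe YAML syntax; the epoch values and submodule choices below are hypothetical placeholders, not settings from an official yolov5 recipe:

```yaml
# Sketch of a quantization stage in a sparsification recipe.
# Epoch values here are placeholders; adapt them to your training schedule.
quantization_modifiers:
  - !QuantizationModifier
    start_epoch: 0.0                          # when quantization-aware training begins
    tensorrt: True                            # use symmetric, TensorRT-compatible zero points
    disable_quantization_observer_epoch: 3.0  # stop updating observed quantization ranges
    freeze_bn_stats_epoch: 2.0                # freeze BatchNorm statistics late in QAT
```

With that flag set, the quantization scheme should switch to the symmetric zero points TensorRT expects, so no other changes to your training procedure should be needed.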