【SCU】【Paddle TensorRT No.57】Add pd_op.temporal_shift converter #69848
base: develop
Conversation
Your PR has been submitted successfully. Thank you for your contribution to this open-source project!
  return false;
}
#if IS_TRT_VERSION_LT(8200)
VLOG(3) << "temporal_shift is not supported when TensorRT < 8.2";
Is temporal_shift actually tied to the TensorRT version?
It seemed to be mentioned for older versions, but since TensorRT > 8.4 is now assumed by default, the check has been removed.
post_pad = add_1D_constant_layer(network, [0, 1, 0, 0, 0])
dims = 5
zeros = add_1D_constant_layer(network, [0] * dims)
start = trt_sum(network, zeros, pre_pad)
Here start should be computed with trt_sub.
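A minimal sketch of the suggested change, reusing the network, zeros, and pre_pad names from the diff above and assuming trt_sub takes the same arguments as trt_sum:

# Reviewer's suggestion: the slice start is the negation of the pre-padding,
# so it should be computed with trt_sub rather than trt_sum.
start = trt_sub(network, zeros, pre_pad)  # start = 0 - pre_pad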
slice3_layer.set_input(2, slice_size3)

if slice_c == 0:
    concat_inputs = [slice2_layer.get_output(0), slice3_layer.get_output(0)]
If concat_inputs is not defined before this branch, referencing it later will raise an error; it would be better to define it up front.
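A minimal sketch of the suggested restructuring; slice1_layer is an assumed name for the first slice branch, which is not shown in this diff:

# Initialize concat_inputs before the conditional so later code never sees it undefined.
concat_inputs = []
if slice_c == 0:
    # When the first channel split is empty, only the last two slices are concatenated.
    concat_inputs = [slice2_layer.get_output(0), slice3_layer.get_output(0)]
else:
    # slice1_layer is an assumed name for the first slice branch (not shown in the diff).
    concat_inputs = [
        slice1_layer.get_output(0),
        slice2_layer.get_output(0),
        slice3_layer.get_output(0),
    ]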
reshape_layer.reshape_dims = trt.Dims(inputs[0].shape)

if data_format == "NHWC":
    transpose_layer = network.add_shuffle(reshape_layer.get_output(0))
Please rename this one; it reuses the name of the earlier transpose_layer, which effectively reassigns that variable.
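A minimal sketch of the rename, using transpose_layer2 as a hypothetical new name; the permutation shown is only an illustrative NCHW-to-NHWC shuffle, not taken from the PR:

if data_format == "NHWC":
    # Renamed so it no longer shadows the earlier transpose_layer.
    transpose_layer2 = network.add_shuffle(reshape_layer.get_output(0))
    transpose_layer2.first_transpose = trt.Permutation([0, 2, 3, 1])  # illustrative permutation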
concat_layer.axis = 2

# Reshape output to [N*T,C,H,W]
reshape_layer = network.add_shuffle(concat_layer.get_output(0))
This one also shares a name with the layer above; please rename it to match the name used in the old IR converter.
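A minimal sketch of the rename, using reshape_layer2 as a placeholder; the actual name should follow the old IR converter, which is not visible in this diff:

# Renamed so the final reshape does not shadow the earlier reshape_layer.
reshape_layer2 = network.add_shuffle(concat_layer.get_output(0))
# Reshape the concatenated result back to [N*T, C, H, W].
reshape_layer2.reshape_dims = trt.Dims(inputs[0].shape)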
self.min_shape = {"x": [4, 9, 7, 7]}
self.max_shape = {"x": [8, 9, 7, 7]}

def test_trt_result(self):
Please add an FP16 test for this as well.
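A minimal sketch of the requested FP16 case, assuming the test base exposes check_trt_result with a precision_mode argument as other PIR-TRT converter tests do:

def test_trt_result_fp16(self):
    # Run the same converter check under FP16 precision.
    self.check_trt_result(precision_mode="fp16")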
PR Category
User Experience
PR Types
New features
Description
Added the pd_op.temporal_shift Marker and Converter.