In #1069, @kshitij12345 smartly pointed out that it's disturbing that these batch rules aren't caught by `test_op_has_batch_rule`. From looking at it, the bitwise ops in particular aren't being tested because the only allowed dtype is `torch.float`.

Steps

1. Change `test_vmap` and `test_op_has_batch_rule` to have their `allowed_dtypes` (in the `@ops` decorator) be `OpDTypes.any_one` instead of `torch.float32`.
2. We expect this to lead to new failures. Please update the corresponding xfail list for each test.
   i. In the case of `test_op_has_batch_rule`, if the failure looks to occur on an in-place function, please first try only adding it to the `inplace_failures` list. If that does not work, you can xfail it.
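To make the failure mode concrete, here is a minimal, self-contained sketch (not functorch's actual test machinery; the op database and filter function below are hypothetical) of why restricting `allowed_dtypes` to `torch.float32` silently skips integer-only ops such as the bitwise ops, while an "any one supported dtype" policy in the spirit of `OpDTypes.any_one` still exercises them:

```python
# Hypothetical mock of an OpInfo-style database: each op lists the dtypes it
# supports. Bitwise ops support only integer/bool dtypes, never float.
OP_DB = {
    "add": {"float32", "int64"},
    "bitwise_and": {"int64", "bool"},  # no float support at all
    "bitwise_not": {"int64", "bool"},
}

def ops_to_test(allowed_dtypes):
    """Select (op, dtype) pairs the way an allowed_dtypes filter would:
    an op is tested only if it supports at least one allowed dtype, and we
    then pick a single supported dtype for it (like OpDTypes.any_one)."""
    selected = {}
    for op, supported in OP_DB.items():
        usable = supported & set(allowed_dtypes)
        if usable:
            selected[op] = sorted(usable)[0]
    return selected

# Restricting to float32 drops the bitwise ops from the test run entirely,
# so a missing batch rule for them can never be caught:
float_only = ops_to_test({"float32"})

# Allowing any supported dtype tests every op under some dtype it supports:
any_one = ops_to_test({"float32", "int64", "bool"})
```

With `float_only`, `bitwise_and` and `bitwise_not` never appear in the selected set, which mirrors why `test_op_has_batch_rule` missed their batch rules; under the broader policy each op is run with one dtype it actually supports.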