Validation tests for training and metrics evaluation are launched with `pytest`. However, because they require a GPU, they are not suitable for CI: GPUs are not available on GitHub-hosted runners.

In release-2.0.0, the validation tests are skipped unconditionally with the `pytest.mark.skip` decorator. Instead, we can use the `pytest.mark.skipif` decorator with a condition so that the tests are skipped only during CI.
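A minimal sketch of the `skipif` approach (the test name is illustrative; GitHub Actions sets the `CI` environment variable to `true` on its runners):

```python
import os

import pytest

# GitHub Actions (and most CI providers) export CI=true, so this condition
# is True on CI runners and False on a local GPU machine.
IN_CI = os.environ.get("CI") == "true"


@pytest.mark.skipif(IN_CI, reason="GPU not available on GitHub-hosted runners")
def test_training_validation():  # hypothetical test name
    ...
```

Run locally on a GPU machine, the test executes as usual; on CI it is reported as skipped with the given reason.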