[Bug]: NeuroConv dev tests - Last axis of data shapes is None #1111
Comments
@mavaylon1 could you look into this? We can revert that PR if needed.
It seems that this may be too "aggressive". If the schema specifies a dimension as fixed in size, then we should not set it to None. I.e., we should only make the dimensions expandable that the schema allows to be expandable. Is this information somehow communicated to the backend in the Builder so that we could adjust the logic added in #1093 to make only dimensions that are not fixed in the schema expandable?
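As a rough illustration of that logic (a hypothetical helper, not part of HDMF's API), the write path could derive maxshape from the spec so that only schema-flexible dimensions become expandable:

```python
def maxshape_from_spec(spec_shape, data_shape):
    """Hypothetical helper: build an HDF5 maxshape in which only the
    dimensions the schema leaves unbounded (None) are expandable; dimensions
    fixed in the schema keep the size of the data being written."""
    if spec_shape is None:
        # No shape constraint in the schema: leave every axis expandable.
        return tuple(None for _ in data_shape)
    return tuple(
        None if spec_len is None else data_len
        for spec_len, data_len in zip(spec_shape, data_shape)
    )

# e.g. a spec shape of (None, 3) with data of shape (100, 3) -> maxshape (None, 3)
```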
The backend does not have direct access to the schema associated with a builder and is intentionally siloed from the schema.
Let me see if I understand. As for the DCI, having a parameter that is a bool in […]. For the data that is fixed in size in the schema, I would need to give this more thought. @rly Thoughts?
@mavaylon1 Yes, the validator should validate using the actual shape, not maxshape.
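For illustration only (not HDMF's actual validator code), such a check would compare the spec against the dataset's concrete shape, whose entries are always finite integers, rather than against maxshape, which may legitimately contain None:

```python
def spec_shape_matches(spec_shape, actual_shape):
    # Illustrative validator-side check: None in the spec means "any length",
    # but every entry of the actual data shape must be a concrete integer.
    if len(spec_shape) != len(actual_shape):
        return False
    return all(s is None or s == a for s, a in zip(spec_shape, actual_shape))

# spec_shape_matches((None, 3), (100, 3)) -> True
# spec_shape_matches((None, 3), (100, 4)) -> False
```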
TODO: […]
@rly I am thinking about the problem for shapes defined in the schema. How are these allowed to be written? By setting maxshape right at the end on the dset, I think we are skipping shape checks that would have prevented the data from being written in the first place, which then causes read to throw a fit. This assumes such a check exists.
I think if, on write, we use the shape from the schema, this should leave read alone.
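For context, here is a minimal h5py sketch (independent of HDMF) of the distinction in play: maxshape only bounds how far a dataset may be resized, while .shape reports the data actually stored, and a None entry in maxshape survives the roundtrip:

```python
import numpy as np
import h5py

with h5py.File("example.h5", "w") as f:
    # Resizable HDF5 datasets must be chunked.
    dset = f.create_dataset(
        "timestamps",
        data=np.arange(100.0),
        maxshape=(None,),  # the only axis is allowed to grow
        chunks=True,
    )
    print(dset.shape)     # (100,)  -- actual data shape, always finite
    print(dset.maxshape)  # (None,) -- growth limit, may contain None

with h5py.File("example.h5", "r") as f:
    print(f["timestamps"].maxshape)  # still (None,) after the roundtrip
```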
I think shape is validated before write in the docval of `__init__` of the particular container class. If there is a custom container class, then the shape validation in docval is supposed to be consistent with the schema. If the container class is auto-generated, then the shape validation parameters in docval are generated from the schema. I'm not sure whether the shape is being validated elsewhere; it is on the todo list to run the validator before write, though. If […]. One edge case is that the shape in the schema can be a list of shape options, e.g., images can have shape […].
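A sketch of what that docval-based shape validation looks like, assuming docval's `shape` key and using a made-up container with two allowed shape options (the class and the shape options are illustrative, not taken from any NWB type):

```python
import numpy as np
from hdmf.utils import docval, getargs

class ExampleImages:  # hypothetical container, for illustration only
    @docval({'name': 'data', 'type': 'array_data',
             # a list of shape options: grayscale (x, y) or RGB (x, y, 3)
             'shape': ((None, None), (None, None, 3)),
             'doc': 'image data'})
    def __init__(self, **kwargs):
        self.data = getargs('data', kwargs)

ExampleImages(data=np.zeros((10, 10)))     # ok: matches (None, None)
ExampleImages(data=np.zeros((10, 10, 3)))  # ok: matches (None, None, 3)
# ExampleImages(data=np.zeros((10, 10, 4)))  # would raise: no shape option matches
```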
Yeah I believe the shape is validated in docval. What I was thinking about was the goal you mentioned of having the shape be validated prior to write.
Note: we need to consider this when working on implementing extendable datasets in HDMF again. @mavaylon1
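Related to that note, HDMF already lets a user opt into expandability per dataset by wrapping the data in H5DataIO with an explicit maxshape; a minimal sketch under that assumption:

```python
import numpy as np
from hdmf.backends.hdf5 import H5DataIO

# Opt-in expandability: the first axis may grow, the second stays fixed at 3.
wrapped = H5DataIO(
    data=np.zeros((100, 3)),
    maxshape=(None, 3),
    chunks=True,  # resizable HDF5 datasets must be chunked
)
# `wrapped` would then be passed as the data argument of a container
# before writing the file with HDF5IO.
```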
What happened?
I believe the merge of #1093 broke some things in NeuroConv.
I mainly suspect that PR because it seems to be the only one merged in the last 2 days, and our dev tests were passing fine before then: https://github.com/catalystneuro/neuroconv/actions/workflows/dev-testing.yml
It might be advantageous to set up some dev testing of NeuroConv here on HDMF to ensure PRs don't have ripple effects throughout the ecosystem (for example, NWB Inspector tests against both dev PyNWB and downstream DANDI to ensure both upward and downward compatibility).
The full log: https://github.com/catalystneuro/neuroconv/actions/runs/9006012395/job/24742623348?pr=845
Seems to be affecting all interfaces, caught during the roundtrip stage of testing (files seem to write just fine, but don't read back).
The final line of the traceback might be the most informative: some shape property has become `None` instead of a finite value (which seems to be expected).
Steps to Reproduce
test parameters (HDF5 datasets might have closed following pytest grabbing info)
Traceback
Operating System
macOS
Python Executable
Conda
Python Version
3.12
Package Versions
No response