
Enforce dim tags to be unique in tensor #632

Open
albertz opened this issue Sep 5, 2021 · 2 comments

albertz commented Sep 5, 2021

The order of axes should never matter.

But when a single dim tag occurs multiple times in a tensor (Data), the order does matter, e.g. for operations like SoftmaxOverSpatialLayer on an input of shape [B,T,T].

You can get such a tensor e.g. via DotLayer.

We should disallow this, so that the order of axes will not matter.
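To make the ambiguity concrete, here is a minimal sketch in plain Python (not the actual RETURNN Data/Dim API) of why axis lookup by dim tag breaks down when a tag repeats, and what a uniqueness check at tensor construction could look like:

```python
# Illustrative sketch only -- not the actual RETURNN Data/Dim API.
# Dim tags are modeled as plain strings here for simplicity.

def axes_for_tag(dim_tags, tag):
    """Return all axis indices whose dim tag matches `tag`."""
    return [i for i, t in enumerate(dim_tags) if t == tag]

def check_unique_dim_tags(dim_tags):
    """Raise if any dim tag occurs more than once (the proposed constraint)."""
    seen = set()
    for t in dim_tags:
        if t in seen:
            raise ValueError(f"duplicate dim tag {t!r} in {dim_tags}")
        seen.add(t)

# A [B,T,T] tensor, as produced e.g. by a dot product over time:
dim_tags = ["B", "T", "T"]
assert axes_for_tag(dim_tags, "T") == [1, 2]  # ambiguous: two candidates

try:
    check_unique_dim_tags(dim_tags)
except ValueError as e:
    print("rejected:", e)
```

With the check in place, selecting an axis by tag always has at most one candidate, so the order of axes cannot matter.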

The user would then need to explicitly change one dim tag to some new dim tag beforehand (e.g. via #633 or #589).

This would be helpful for #391.


albertz commented Sep 23, 2021

This obviously introduces new behavior. So it could be introduced via a new behavior version (#508).

However, the question is whether this behavior should be enforced on everyone. A new behavior version usually implies an easy transition from old code to new code, but in this case the transition might be non-trivial.

Instead, we could introduce this as a behavior flag. Some config option like behavior_unique_dim_tags = True or so.
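A rough sketch of how such a flag could gate the check, using the behavior_unique_dim_tags name suggested above (the gating logic is an assumption, not the actual RETURNN config mechanism):

```python
# Hypothetical sketch: gate the uniqueness check behind an opt-in flag.
# The flag name comes from the suggestion above; everything else is assumed.

def check_dim_tags(dim_tags, behavior_unique_dim_tags=False):
    """Enforce unique dim tags only when the user opts in via the flag."""
    if not behavior_unique_dim_tags:
        return  # old behavior: duplicated dim tags are allowed
    if len(set(dim_tags)) != len(dim_tags):
        raise ValueError(f"duplicate dim tags in {dim_tags}")

check_dim_tags(["B", "T", "T"])  # accepted under the default (old) behavior
check_dim_tags(["B", "T", "T2"], behavior_unique_dim_tags=True)  # unique, accepted
```

Old configs keep working unchanged; only configs that set the flag get the stricter check.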


albertz commented Jan 12, 2022

We now stumbled upon a case where there is no obvious way to avoid ambiguous tags in a tensor: the weight matrix of a linear transformation with in_dim == out_dim, which is a (dim, dim) matrix.

This is also shown here: rwth-i6/returnn_common#17 (comment)

One potential solution is to introduce some match_priority: #871
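The match_priority idea can be illustrated with a small sketch: when two axes share the same dim tag, a per-axis priority decides which one an operation selects. The names and semantics below are assumptions for illustration, not the final design from #871:

```python
# Hypothetical illustration of the match_priority idea: for a (dim, dim)
# weight matrix, a per-axis priority disambiguates axis selection by tag.
# Names and semantics here are assumptions, not the actual RETURNN design.

def select_axis(dim_tags, tag, match_priorities):
    """Pick the matching axis with the highest match_priority; require a unique winner."""
    candidates = [i for i, t in enumerate(dim_tags) if t == tag]
    if not candidates:
        raise ValueError(f"no axis with tag {tag!r}")
    best = max(candidates, key=lambda i: match_priorities[i])
    ties = [i for i in candidates if match_priorities[i] == match_priorities[best]]
    if len(ties) > 1:
        raise ValueError(f"ambiguous tag {tag!r}: axes {ties} have equal priority")
    return best

# Weight matrix with in_dim == out_dim: shape (dim, dim).
# Giving the output axis a higher priority makes selection unambiguous,
# e.g. so a matmul knows to reduce over the input axis.
dim_tags = ["dim", "dim"]
priorities = [0, 1]
assert select_axis(dim_tags, "dim", priorities) == 1
```

This keeps the duplicated tag but restores a deterministic, order-independent way to pick an axis.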
