Remove MLActivations definitely not usable with recurrent ops #703
Conversation
@huningxin FYI. This should simplify https://crrev.com/c/5495877. The list of operators is based on this code in Chromium's DML implementation. If there are others you think should be added to this PR, please let me know. I started with a minimal list to keep this PR as uncontroversial as possible :)
👍
Ack!
Unfortunately that implementation may be misleading. Those activations are unsupported because they are only available at a higher DirectML feature level, not because they aren't useful. I opened a Chromium issue regarding that.
I suppose the current criterion is to keep activations that could be used by the recurrent ops. I don't find a supported-activations list for the DirectML RNN ops in MSDN, for example for DML_GRU_OPERATOR_DESC. Did I miss anything?
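For context, here is a hypothetical sketch of the kind of backend allowlist being discussed. This is not Chromium's actual code, and the set's contents are an assumption rather than DirectML's documented list; it also assumes the then-current MLActivation interface, which exposed a readonly `name` attribute.

```js
// Hypothetical sketch of a backend allowlist for recurrent-op activations,
// of the kind the referenced Chromium DML code keeps. The set's contents
// are an assumption, not DirectML's documented list.
const RECURRENT_ACTIVATION_ALLOWLIST = new Set(['relu', 'sigmoid', 'tanh']);

// Assumes the then-current MLActivation interface, whose readonly `name`
// attribute identifies the underlying operator.
function validateRecurrentActivations(activations) {
  for (const activation of activations) {
    if (!RECURRENT_ACTIVATION_ALLOWLIST.has(activation.name)) {
      throw new TypeError(
        `${activation.name} cannot be used as an activation of a recurrent op`);
    }
  }
}
```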
Added back
Yup, that's why I was using the Chromium implementation as the source of truth. It would be nice to have these constraints documented! :P Please merge at your convenience.
Indeed - I'll follow up with our doc writer @stevewhims.
LGTM!
@huningxin: Until the docs are updated: ✅: … ❌: …
clamp() and softmax() may no longer be created as MLActivations. More details on the spec PR: webmachinelearning/webnn#703 Bug: 341518634 Change-Id: I7dda713c8be5454690f21c66665d90be274513f7 Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/5598089 Commit-Queue: Austin Sullivan <[email protected]> Reviewed-by: Alex Gough <[email protected]> Reviewed-by: ningxin hu <[email protected]> Cr-Commit-Position: refs/heads/main@{#1313205}
…e with recurrent ops, a=testonly Automatic update from web-platform-tests webnn: Remove some activations not usable with recurrent ops. clamp() and softmax() may no longer be created as MLActivations. More details on the spec PR: webmachinelearning/webnn#703 Bug: 341518634 Change-Id: I7dda713c8be5454690f21c66665d90be274513f7 Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/5598089 Commit-Queue: Austin Sullivan <[email protected]> Reviewed-by: Alex Gough <[email protected]> Reviewed-by: ningxin hu <[email protected]> Cr-Commit-Position: refs/heads/main@{#1313205} -- wpt-commits: a632273a65dd5e8ba5f76ae0d03e6cb4f3affeee wpt-pr: 46678
Follow-up to #664, without going so far as #689.

Prior to #664, an `MLActivation` could be used either for op fusion or with the recurrent operators (`lstm` and `gru`). Now only the latter use remains, and some operators which may be created as an `MLActivation` no longer make sense to pass as activations:

- `clamp` (no longer being removed, after feedback from reviewers)
- `gelu`
- `softmax`

Note that this PR is not a replacement for #689; that issue still stands on its own. Until we reach consensus on that issue, we should remove the `MLActivation`s which were clearly targeted at the op-fusion use case.
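To illustrate the change, here is a minimal JS sketch, not taken from the PR itself. The tensor shapes, descriptor fields, and the zero-argument `MLActivation`-creating overloads follow my reading of the spec around the time of this PR and should be treated as assumptions.

```js
// Minimal sketch of the change, assuming the spec's dual-overload pattern
// at the time: each eligible op had an MLOperand overload and a
// zero-argument overload that returned an MLActivation.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

const steps = 4, batchSize = 1, inputSize = 8, hiddenSize = 16;
const numDirections = 1;

// GRU operand shapes are assumptions based on the spec of the time.
const input = builder.input('input', {
  dataType: 'float32',
  dimensions: [steps, batchSize, inputSize],
});
const weight = builder.input('weight', {
  dataType: 'float32',
  dimensions: [numDirections, 3 * hiddenSize, inputSize],
});
const recurrentWeight = builder.input('recurrentWeight', {
  dataType: 'float32',
  dimensions: [numDirections, 3 * hiddenSize, hiddenSize],
});

// Still fine: sigmoid and tanh keep their MLActivation-creating overloads
// for use with the recurrent ops.
const outputs = builder.gru(input, weight, recurrentWeight, steps, hiddenSize, {
  activations: [builder.sigmoid(), builder.tanh()],
});

// After this PR, gelu and softmax no longer have MLActivation-creating
// overloads, so lines like these would now fail:
// const act1 = builder.gelu();     // removed by this PR
// const act2 = builder.softmax();  // removed by this PR
// clamp was initially slated for removal but kept after review feedback:
// const act3 = builder.clamp({ minValue: -1, maxValue: 1 });
```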