
Add optimizer key in EmbeddingFusedOptimizer and KeyValueEmbeddingFusedOptimizer #2461

Open
wants to merge 1 commit into base: main

Conversation

tristanhy

Summary:
Add an optimizer_key field to TorchRec's EmbeddingFusedOptimizer. During initialization of the embedding module BatchedFusedEmbeddingBag, pass the optimizer_key from the fused parameters when creating the EmbeddingFusedOptimizer.

In the sparse arch, update the optimizer key for each TBE when creating a combined optimizer across the TBE modules. The optimizer_key entries in the combined fused optimizer can then be extracted for learning-rate warm-up processing.

Differential Revision: D61668809
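
The flow described above is easier to see in code. Below is a minimal, self-contained sketch of the intent; the class names mirror TorchRec's, but the bodies, the make_fused_optimizer helper, the optimizers_by_key method, the "tbe_0"/"tbe_1" keys, and the warm-up arithmetic are all illustrative assumptions, not the library's actual API.

```python
# Illustrative sketch only: class names mirror TorchRec's, but these
# bodies are stand-ins, not the library's implementation.
from typing import Any, Dict, List, Optional


class EmbeddingFusedOptimizer:  # stand-in for the TorchRec class
    def __init__(
        self, params: List[Any], lr: float, optimizer_key: Optional[str] = None
    ) -> None:
        self.param_groups: List[Dict[str, Any]] = [{"params": params, "lr": lr}]
        # The field this PR adds: tags the optimizer with the TBE it
        # serves so it can be targeted later (e.g. for LR warm-up).
        self.optimizer_key = optimizer_key


def make_fused_optimizer(fused_params: Dict[str, Any]) -> EmbeddingFusedOptimizer:
    # Mimics BatchedFusedEmbeddingBag forwarding optimizer_key from the
    # fused parameters when it creates the EmbeddingFusedOptimizer.
    return EmbeddingFusedOptimizer(
        params=[],
        lr=fused_params.get("learning_rate", 0.1),
        optimizer_key=fused_params.get("optimizer_key"),
    )


class CombinedFusedOptimizer:  # stand-in for the combined optimizer over TBEs
    def __init__(self, optimizers: List[EmbeddingFusedOptimizer]) -> None:
        self.optimizers = optimizers

    def optimizers_by_key(self, key: str) -> List[EmbeddingFusedOptimizer]:
        # Hypothetical helper: extract the per-TBE optimizers matching an
        # optimizer_key, as the summary describes for LR warm-up processing.
        return [opt for opt in self.optimizers if opt.optimizer_key == key]


# Usage: warm up the learning rate only for the optimizer tagged "tbe_0".
combined = CombinedFusedOptimizer(
    [
        make_fused_optimizer({"learning_rate": 0.1, "optimizer_key": "tbe_0"}),
        make_fused_optimizer({"learning_rate": 0.1, "optimizer_key": "tbe_1"}),
    ]
)
step, warmup_steps = 10, 100
for opt in combined.optimizers_by_key("tbe_0"):
    for group in opt.param_groups:
        group["lr"] = 0.1 * min(1.0, step / warmup_steps)
```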

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Sep 30, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D61668809


tristanhy pushed a commit to tristanhy/torchrec that referenced this pull request Sep 30, 2024
…edOptimizer (pytorch#2461)

Summary:
Pull Request resolved: pytorch#2461

Add an optimizer_key field to TorchRec's EmbeddingFusedOptimizer. During initialization of the embedding module BatchedFusedEmbeddingBag, pass the optimizer_key from the fused parameters when creating the EmbeddingFusedOptimizer.

In the sparse arch, update the optimizer key for each TBE when creating a combined optimizer across the TBE modules. The optimizer_key entries in the combined fused optimizer can then be extracted for learning-rate warm-up processing.

Reviewed By: really121

Differential Revision: D61668809

Labels
CLA Signed · fb-exported
Projects
None yet
2 participants