What is the difference between lm_head and transformer.word_embeddings in the sat model? #527

Open
GondorFu opened this issue Sep 24, 2024 · 0 comments
GondorFu commented Sep 24, 2024

In the code, transformer.word_embeddings plays two roles: at the start it converts token ids into embeddings, and at the end the output features are compared against it to compute per-token similarity scores (the logits).
What about lm_head? I couldn't find where it is actually used.
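
For reference, a minimal sketch of the two output-projection styles the question is contrasting (illustrative PyTorch only, not SAT's actual code; all names here are assumptions): a tied setup reuses word_embeddings.weight to score output features, while an untied setup uses a separate lm_head projection.

```python
import torch
import torch.nn as nn

# Illustrative shapes only.
vocab_size, hidden_size = 1000, 64

word_embeddings = nn.Embedding(vocab_size, hidden_size)   # token id -> embedding at the input
lm_head = nn.Linear(hidden_size, vocab_size, bias=False)  # separate output projection

hidden = torch.randn(2, 8, hidden_size)  # [batch, seq, hidden] features from the transformer

# Tied style: reuse the input embedding matrix to score output features.
logits_tied = torch.matmul(hidden, word_embeddings.weight.t())   # [2, 8, vocab_size]

# Untied style: a dedicated lm_head with its own weights.
logits_untied = lm_head(hidden)                                  # [2, 8, vocab_size]

# Weight tying makes the two paths equivalent by sharing the same parameter:
lm_head.weight = word_embeddings.weight
assert torch.allclose(lm_head(hidden), logits_tied)
```

If the checkpoint ties lm_head.weight to word_embeddings.weight (or simply never calls lm_head), the behaviour described above, where word_embeddings is used both at the input and for the final similarity computation, is what you would observe.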
